[Unrecoverable binary content removed: POSIX tar (ustar) archive containing var/home/core/zuul-output/, var/home/core/zuul-output/logs/, and var/home/core/zuul-output/logs/kubelet.log.gz; the gzip-compressed kubelet log payload is not recoverable as text.]
V8mb{ Ie)\04( V8)AYz@7ܾu =`˨ENQ NE;Ti-P '% `.zj1?oO_Ge8a0Q!P "V !/ DJ.<xdr`T{őEqL溣h;NĽbhS):ki-tVU#7gNx7n)(EHM1SБnE]+"жI_LXMD .j xD It (ZJa`d`& yNLw ]% 2TIБMBE͌5.Lau9C9RD?b.m)D $ iC`&qqj%q1t쉋S+)Tks8AeNF],foq\I#o y)b,`kmc;IErlG[v:X4EQUȪ"X " ĖphK!e!sZHի'̷Er{ߺ>:7GbIzZ5  4q0;c$:'-uڽ R!Q|UP mɊJ(@.V,%9}0m~;cɼw~۫L>M_'-X?mA{]9(qD9I\Gw O1OaVvd,Ot<\5(Nw6f ^՜\Ւ٫ *9\BIWgmϩVlggt/fاKYa^1n0>ӕ,،'`^k+?u8ߑ]S VJNvyv*Fi(t ! )K$)P5w?_e*)/$.k0T/1ج[kL$(H9wې F)x{R.6ph tQؗ;\ߒa; T~kvJɥFT)y]xiydQ'-|**ܥ>#h04i#۵<3\yck"wA8>廋eEh u=>:vo!NhMwxiB q>6~Z<ξGp\V M. Z^8l!)a F ]MJt(5> U!jR 3), qT@4~H>:"_ڛci9C#l8)!ʬm+nFhGrs跖5di6Q)+[ꊬ7s?IZpC6eG!H q-\e%ލa&(nNpG3)զ`r€cdSKG_ (us+洡7rn4Ljg֏6/ձ@ϧR_|xDH0[!rG:<:CIm  %p큡 &N uP2#@'ڢSbz z_oju9@ ?+/?.5k;=l<46*[엨0kk٠7\ޜ}m ؞Gq됆35V؟>+yv<^{sP̷S-zƊ5(řychЉiZDdSoP2}{uzAdmXN/5UV-a;ubH*NoK(pF'X=T{f^9evTflx!-ajB Ӂh ݔwcXWbql:]ZvJk:t2CzHZqyW6P 'Sޞ6NzDT-k~-wՋCφmn-#_F́RL{ŗn3,3 #hXO7xO7'zHG?OgOG )_q?qÕjuGQ5A.G& Ǒ廌dʷ[*7?VѾ7?,CW2h?f)mPᵦ%jt3 meW`6tWy~R}T~*\^up{?oPKy93l=&jmo0-j0\Fvtq4}Uqٜz_<ڂfkg7,* >ȧE`(0<2F㽃YKF˫@.ddT&S-JU2 k@IM&2˵qi'T$!ĂS&^X<,Em 8!c6V;E!2aBe)*V cj5V^jMJY(Wl9/w9oN|iOkdx@ޢY9N#$xcQ^5y! "x"H)Dt&%(;CS7_uLeW%k7 <(4 IhEB!J.(\IjE X  m,9aw|V./Xt*I (X4e;O"K3dWZ'&[/?M`e(ޢ&GddpXȤ'2̠HgWuڿ_d6~Q@9jQF J uJ? 
/A ea!AέLy>QQ{]2]N$<H΅:2{tHFqd~dp{qێz_#'Mng 8׬;4*j㶫h˝⭇Χ/C`4ҙo:u`*NnQQRZ m2"c!,-"T'p™"11jk/#]o51:"hi` ܙ!ND$4B:OaK{4)rmr6gChf^l\"i[g``$ֈpCw$,\8kCeLмx_S&7@+PFɔI k`f=5b.C=_FkN%/ !l,$6Eo"'Ǻ/uĺqŅNIU-YY9^Z#("< m0I%k p^]ҵ\;p'vUqս?ɱi6Caw]xGEo5#zBsfWZycT+E (djb|IyL:#m*dPI\ 8UCB4t@X=ae(d8CuS 8'*JGM@@q-t+lpc-3B痢|ȡzVz$I2t$9ΝtO(qR$$,3l{άG%fԣ?X6:7ю$P=k7Aٰz;nO&E٤!7r $rxH0Ă4ǂHUv2UЮ,?`|!"0• j]Qbm 'A灅g8^3$@kM'6DB-ME Θ %ȔU1 v0èS⊦d69MԼ2`],~U0I{5_^ + [asoYY'I aڮKXG_\7&^G1';_r‡I|#/uԲSQ )uM (ƃ Nmș!ӇO3LcB BF` X ޺\f2;cp?g<_ {O4uF==קϳK?׆z0Ss:Ewm8}n_0 f6:ncW7*5~7=LOAmN W/5Rl*dG`iXHDf[쐳"৓;nkz1*\kǓ\kʄ 1a=36p&ĉL{-{H×ٵ:\0KMiA<QOF|=Lcc@" &#"):xc 9PQ31*1BϨ=H} XέOk)q0^D I&ʤK;vJOXbtQۂ;19ۣ|AC% }Nb6цh 僣co>"tGU69z "@Cgu-1S@% ;#A 1#)诔.%ϴ6ASI^c\TJ'<9MZm4X)N_20ey\jJ-y-ۮ3cn»O_'"7VX玡{;-y[]d>e}j ?FћW[&^1bjH#ۖbOiy4xM3/*v|iOϼ TvI<ɷLu~*ސ9gۡfz1xsLyoE͟5=:w3P;Ek_}T5gNreF 66{6$D16O9Wū~1"kigJI 9u=ە$&Phl Z8F -ګq>v$84hrZ$ ~F.&+B%> Q<1C*&H^bT(.R0 GWX_^ iZ[cرi6~Dy!'`v3{אinV Mwo;ϢCހiUI=pYzǾ-{E'+r Yr8(\?JYY VsFf'6D E9h6R0$[m}SB5=hp{Yhx&i{ogv1)J`2<TG"nrX39#.AƩLI-jXz-8%fo+o}qGv-n3im}^MGU[ef<z)Hn糟,"7EnRn׵l{U\uSX7;D]VMQm8JE.!5.F2`<`xEB;ĂUB1Td//zav)F^&ouZ+M<љOnQnl{c_5z! 
"x"HV TJ"̹ b^ ȏ8ul]V[7੒ 4c>5NĝWuKpsk}=SU ґIUKYZs8JCblE*$h1蔸U1H"%DprtUVC=kLYGFf=V^dG/a5-:Lā m9GzI%wEs$EMc^>tCX@oH’zSPֳ@B^H`kstDօ$l {Rd1 RqQpFL8MY\p^($E=YpM"iEҊR.Z [ Ί1*jų<:9}r] Gd'.sًJy{`K_=Xr|A*|K]eru*]]e*(+9#܋`ŨLė2ʜTޣb:_}i\ fB:XޮhXO>:_A(`IO` Ǐt64j3D"8Ǖ2>V\ Rӹ3L] uj:SpQPMs&{˸~ʎ:3vs8 5ۥX nLbz@o;& x?{U;qP&]s F٨sN9/\`j|c7L ràC=kZ_jU$+GP^<#5}1)Y|' tW ҩ삀\&X_  U܁\Ґ#3p}3KWk?\1 "q( -CQ$ToшP_Qvf.+˴zexKR=zCRGf\`W̭.^4V n +|y6;d>Ǽ/dp:yA?>IչJM|*NSalhsԃߏGohۙ:^ W!ۛ%Vud;.9^w~Z>k Y,A)kGA %wq=F(3G mAZ)'RzYﻔUmAQn~e-g=ZƟ]i }>PLoOUj/y"nrX39#.DXPq*FRiEk NL}FC~V~26p^MGUôlf<z)YfA)rsrsuB3" g:6rnnj5~tųh4[9]|qv\C.k]$sd,y:\񬋄vՉőbcw,5//zav)F^&o@՝ӢI˭lMNn(Bfhuz|ꡅM5zP {T7&1(jðqe1{Җ'FfIjjj:TOcn̘#VYlGۆ/jq{NPiXg"@'}}-"Ѯ6pfPTUN'=m7m:-w55gg]y{%{}`73T%\ao8E_(o$W&as[#!l rJ+Sc ;v-#?u ?%FeZ1wAl"&$~Θ ,x@(!R+Lvz:|?Fɡ՜mIr[0K)nύmB@fBvhN1 G8N-R()#A *n/Jul=X`Jkw2]mt6*v59DVo"&&8a.x!GBDX &g rTyq/zC"]tXtD&HBV p 9Nֽ4cgz|J'p, ZƠSV dm)Eފq(ҵG<`i:f7ЏLoş @v42VӢj4K(sѬT^r'yP-bm&r# {+aʾ! KB*LcJ8@YϢa {I#^! օ$rm {RdXVne"iE)i-a- ZEm.bgy}Ve[|vfu^{ns {iIq=>#EDǶb}EV}\ L2xE =bCN|"=D(%aBT@sB*]L?Rh6\O8|.S6HȌP<+#/z}~o ?7󳜳W Чjy h8es۹5iyL{w :1Z,FwfRe;K Z-S5ٮO`?4}# q#n܆%a8agvZ9/?x:h|5e^r,BY~G u~cd]2$.K,FẠ UVeq|-X'O[3#B:u'Qf+a!w2:]XT"tB#Jj50|-q/fs6 U{?mCYw IbaJ?[nܾ^i ~ge@u9O! ݅L 5Cll9 mOGӧIN"`::훛\0/2_d|%} q#ho=Ґ}|wiV,yX3]:Q3(Nb!ZRG|Nh JSDAD#X `@zbUA{]zm.}Gw ޭ mt }9ݢ[|x5~5! lu&ϝ3:K.+]\nI"/XQڱ(_*.>ը"b/\3 w=$GfMjWl  A%cPQD $}Q:`g#dO*KJHG|Ξcirū [k1u|fZ#^jNNډ@(EnPjKBƚM!kھuTހ.w&;)s^Sv:z,CThg*Q5x8jV6i;Yz$#GAާs0m+SJ1lSi %HGd'v:_4!,b,#ᑍgi=#CXTPXMv|1A AGVjiE9&ASTv!SpB8 RH07V{d 6=zLz{/G˘? dx.Bo*7IF0G Kg2{^Rw],!YSlezmQ_T4[ּ< nzZ& Hj&h%!XE ,@#%,%A\Qg^,(0NH, k7+ 8lE *%cBV;5 6=5aY[o  cAm>:o L2y :E`* !j*p \_,{DXUb#8gds6bW, AV]c7ULpȠ0`xP(D1%V Sf4' E6md8*"dYm^*\,[)JdP3 < Eu-CԊ-tA} Ŧm*b_wϩ|* |S=ԣvHcC%(<{"E0}l"A G%p"6N=nئϢIjX;q/7W"&]ߤ[${=!ˀ !Ƃ#.b Qi~ao?{KK@~z|q)'C^/: XW4-9o4 <*=H2%*Rj0V^KA[,WldMX,.oAS`*a@]z3t; sÍ }Q{Ԇ[$wgoWg;g.uC]Kxu}a|󀉴JAdILx 뺎<+_29O++m+6DJy@"D<#&kWIWD)@e1,˩9X=ARt~"B{.:x&!4exj/+59M Egf:ɒFVx)+q(ξ=G2F$ٵ!'c1 %@dHN:~$gnGW }L =z̰ hUnN\8%. 
,sޡGsH#@[ֆJM&xI]";Fst%.eC-ei9b"Y$| ZmTu_yvhkQj=T)UHEG°6gH0Δ.jLQPN"f El$K;L2 6=kK8&7} ;㋺'7>9z$y5]>G;6Lgcibԅ էY3 龇]޵yJyF4)Cskzי&h$0<qz[a^+zk "FfOCh2СsVcn(Rwt)}xV1d]PkX6Q3Jl5 EZm xN!:#yQkIg^Eb?UdN7'i-g(}[,> uѹ:yG\lTi6m渶ቛ2؍٪!.Z0ޖe#g&!9~ ݵo/` ;T] +[WEz|wiV,yX30tg0Qt [zC<)`GlXo?*hOwyXomn'|m78e>b8Ln=|hns>-(}ϯ?!PФԡsFP}e 5JFiyT\0|HK%g)xC)I$av̀T2E`.JRH5%YL xIeIII7`ٓr|>M~9pI<A I+Ԅ")Raڣ(.y" l78:%L$N;--X?&Gf[4_~ ,0Z?6%ݓ{#"hO>%CL6}G 늡ss[ m~bڭy鹐x~pٿ]>f˖ pKܼ@?it/ ]ԯm-G,헨NYهpl;9sNf= t>2P0(l ~ǘhucڣ҇z.Nu $u< yy.l\_oVOV`7 ?4M:5_yE' `-rbcxtb}Xgk}/ka]cXd= ljvW=xn \ 7E-ӁF.O?k</x(W 9Q щ&kW4 )@eؓW„X8˯Gkv-8B1 H)U0ɕPd 8yZ:&2=6GdQ*FQFZEu`LT8:zER`Ou37-z#z~,9Q1ѬJw9X%TdS"@IVx-俓WA3D=-tsI/ΰ<:q6ɛ<^Kj\S4"b.:8х e%&@M AyPH+tsK0皚D'4$ 0etŢChEYP׸7>B~B6??ւ=b[dl#kC$3Rkh(8 U\2 >J qTJg 8႓s }$"R"[*I<)O:t6?~0/ A#QAʐBTW- p^d!&,T62tz\9U#d4 B1e(cr :kWv&5slZ;~+巾av'hÞtMomўUXV:l0GgUv-6w2eS]ϿqTB48V!GPsr9iR/> V\'9PZ^4ٮlw/ \|[dW4~2>/ hm{,a }B.(c`}b^8*kpED뺰w ^ !hc} NKD£ʖ2h,|HE'9&.voCl؀71rRcԬQI*J,B@ 9̆9Y@kZ۳< m@ڀS"vP<2s5ʹ5Qu~ףVmgCZځ<ҭ*>*1RUu'R [tKIg>#sUSiZ5x.JTU`ޣrROU3Ƭ m*m~ݎתܴy臛 YB]Frui6|F6RrYZ|h J [~}#fX36fqs1UZJz0L2V<ˢ}죅0z"imo,jr< h6{9q\|Fkhm bf9G?A|tfi2c)5FJڍ:A%kt[F!ѝoJVBS CJ&:^gL|ڇ| OVhRAO!JSi|ImB*ó t6,J?p wsUg^JIb0W\}>qr/ןʹU}JgW@B#T; jwx%e" }S}i; upBj8]y1[ X! \B ZĘ .&iF(E!V")飊ͶbR%Pg?(%,%5(2Vh Zoΰ G._BTѸ b2D R&ECқr翄lN d'F6Dm6! ӳ[BEG 5,Fr.yF|*$es"ʛCdWUĿ^^fڮbݘ9dˇVjU*ZU%f]>ceWZA%{q;tWǷ±Kz.Yw/$0Z~_5\|?[VWb9` }ߦ~>BFNO}Gn ޜ,s东y_򂅹:iaֻ+6 /9*ٯY'+Old?(+7 ڒEެr_/Hv/7oK;Xw~٪Yqٜ(Jep:j%a6:pbm d]e()TTǹp2Y 8q M|1A AG\(향rdnF瘌'%p T] !BF,ƀX%k9`>qNؿ KgNqEʂCB"gN ֒P&F(ڡpl1!+Iu9g#`=v_9df#y$J\| 2:zLTEP"ĻkC}~mRP],9z(*@Hdq hLdU!J.PP+.'}R׍=T9 & 0`@("٘C`R$MIx";ptb8zۙN!v|PT pL%rPs@y9u~ Q+)@:a*{x^5}_g*\A (Bt֢GNOƠ"Mt OQ|?$qjg\tѡ&EuOKf3 𪷁D )Hƿ!BC_l v-bŲtP04knX&tf-GijzMe`7.G+MoVV7Lmae;owT7 kX1f?!Yy%n-a þ>E'ydt%=F2 8jPhЈE@bz]9zJk Jivuv JSDADjB?X g!;2ֻhOԻۃ?{WF`#2X,bXyDJjSLRrˋYEJR"zhTʊ3#Iӵm"k2rx%ntm:!w9]ܛv'aEPþ#rT 0mJn@vy+oT&+#|B*RdDA{,QYrI]6'jY2i FT̒ېJK2NЙ J61r#- *-gs+7>6Wџ. 
/NӫlO_}͇ase;x-_Oϻ13oKMm[r,X`FMn(DxIdBt`()Xp.3q itk<([t*z]CQ،TSjT[f^jg>.yUݽ"pv ]+~Epe∠ji-˕GU[V;/-}![g{|FkVm?zA4r1$}tl'ʎpTVx>)Tn T>)WLP|nSm`Wmj>Ԇ8܂}iVrGqЯ>hMƟ蒼".='Lǟ 8ȷvǣɿO^2Rk?eG'y}޿6r>ttwOmD,Ee1M {FR62o):#~"k g #Kjlx vpςtmF8qAQ|>ZYrQǬ:ĂbǩV',}9Hy )ȟP,5;|;x9lRRCD֝ at<`EaD$z7M9jYe|>@Ay2c0Qu<3RKL')C*"'DUb 3'Rr&s9x' .xj2o!t8߶a9ËjK֒M_mo1]h}dmx>F SϴS+т&hE]ڤ4j+?2g;yOˠb.&+~zdäwc6×vBfFOn{|?x䳤-f^JJwקj߿64?3_?0,#>||z i_>t[ر/}7_{ythp{x>gnO8 >{}au`D'o-& 4'}ERb)ye۲t9-pˤ}y>xR~q FggE^=wвٓf-EI+d҄N&9?dN cM~5gy~{}UEHI3-|my i{ci}w~f~;<fA|AޏI AoNW|3!;,oQ>:+-e-`J9r|=g.̞1{|=|YN_z3S-Zy凊iIojJ7-Tjm"l^iEVҶ5d\٠SЍY/I=:%XYef_wkɎY1/a>C X7eA;^ Sq^xbB^n2cٌ%idґ ǁq>wN]SiAYE԰NzbVX v^g(oƊ1o g?_2bx13A"G/̉Mk)8Xl`^rtEl UD IDVlg畗]bBp.+޲ƃBLacۓ˫aE@]f˪ׅ|Afy %%iu <^rm3g+ʻ~r.K:?' h ,״&޵Cs.C~zOuEKl]/$['c|r/[V])OhcOOnG㻥^y{/VszҠEa~N&&6Y؀W ZI8Kۼ:7:=;+@ HYPxi]H<@gmc2e>+,iL`ҍt+>ChkR:߅+sJCt "*l9Q)ƘR(s@o'5 \~1~ ) U2G0VRXub9T I p{Xc=3bg(|mf݆`d"ҷA'HҖF)l;CؓOhVFO,BB'Y(Z0'%*Ng##y$ wFj+q6^u39mϭc0rp$.BUxi*Td2^:Rit1 6-|;Df:("g&&ry1yŹ ¦Ll .Q hYgjv {hIrwHpBIC pFL;tA*^5g>Q yr&, H=jF򔰎IW:-5qɄ֖iG\ӥo$dHh1td˜z:-HR>hNQ9&ҳ}_2V,o?'m[BrKL Ja7Ѱc[2ƺC z?O>߷{ur4y:߆_#܇ٰLիz;L1&Fc 0rU 7+-Ɲ/ѿ\e'6f+Kb2ߞ>E3Ж|n~v7؎6y`2.v;0m7Yw}f!QiihpSO˫6WZf*onOOqT\^iƧ1r3`&4p0>;h$ miScC-J?(w@~_ p -ɘ YW$!U n_|Y"ԷFZiʿɔAZea ;R )@.2KəeNNmQ{s(AE Rkyř d$z#IA=9y7crƴ2Q.ke2 ra@|6F2 XXk؊@i=KN;jSl.[{ 59[dEɜ˳UV^lH;p'wɸmLY) "4 @;KFBo@8 O^yU5*‹C=yߜG!Y2yСbp.ifLb:9I(|yafclDupp+ jte*-]uœ. 
J-&E #{Fjlx c_%dǧ~piE%|ڗ (bxVk\T1,9Z713㸳USe w#$čV[$-_߉II=EeTDy'̓BwVI|NXݝѹpUތ>i|r[,*sg[8">'#q!X|{a ̿pQa)2]Bs<&өr$G"z{x!ȷHd# J!;Ec#p1wkP!ȦheZ'΋ÆDS=YCs>sh- tI7?TYy[^ݰaMѷ{mwM;_櫮oav}~ww_6PR#xnƉeU#~=xw~*ocܽX۟E $>rw.63jF-Φej.Lz-S(23Ln_6۹-`q8[~X[$9O?,}!*@d!I}-2(mZ(եE-jOVC&p^r>÷w:-;˓ xOҚ!czr62;ƶȢH0oyG/c3kTby U{˛k?{cP 9!MD""BI6@d*2Fxp)X *P!!E~r0JwA [z69>F%1(́>>mFؘB*rbDIBWW9Q1I: 4c|5z濦qFC9]}v5g,{߷ 5bB:ydp#!6Fw }L/o`[w]7o}:{nM}˫GE?4?ネHӪpPo' MzTkm< ?헼L^cEtZJS{*)*7Sя ξtUpFCHo1QpDq:ۑm!BN=>U6Չ_6\0&~h:M6F9lblb;fWA&3SbQc??_bҫ1{Ļv٢o}u5M&GMaqvyiF$B@j $C!%)P4g}귓~#c\AOjVӺtǢ̞kz~ap?"_ng;4s19FPX VY)㟲7i Ã|kF;{xsϥmL \."T>9q`ayrZIQ󵁽,oğN*LOIu^ZL-&ӘVb'lq\LY~{;WQn)vDVQ_Nέ=)8,I@} Ǹx Y9;%Ք'[:G>dρ&50ݻ( y!@7C7ƏRjоy@ΐ)z3 ^(:쀄wJ $֣@ a0dGNy:_W/u5 ;ًP8g<[|eJR_kyF<*d-hEK )Sy U*,1 E2f8Y۝{ybq5Xxx45yyq3HܼvӦ*#@beo$Rm[-bUΈeϿGdn@zD\]X#(;G?1&z[~옶ޫ۩tߠ^R[˰mUL.z~ ~cb\冴=Wuyzb,տzcZ n9[#a>qi]qk]ޮ g.t_5id4UOַׁxJuD v2ײYadݎ>cl@0,j>86N~\_rz+ṳ=YU$ZJA+QsG5mR:G[xN<bh>/~ܼQVm벷]lQT7YaK@۞miת`лޞD 5%y 9{v u|dAp>Y:Zġ˂Qz?ʂ|Y_8}c؃VonsF]b ]*v@tZysyuQW"%F]b6+*C ڗ́Y">PЩR1D;_|q#wnX;ffBXwDL1oLVYJȄICHn`/ڦr쒋C =U\7:fME㕤; x\6ʘ/y,oj`=K`8f. 
D$c"w)&x jӋuΣ>1C:C20BB *fmq: "#38?݁+1i.EK{\El11eRaʟM#UJa{r:4 !i؆y?-]Osɉ^*La/Q #I?f6Ufׯ-:=O7|tU^_ꖇTF/a/\, .K(D`2 .js:ZmT(cf k;kQi@^f|ӵ3hD]4j>@+F};(m)֟˿Y[Y?^i{(Y{F&849߾XQZz~Kм+0/^D 7:Ǒhbȼ> 4M/q=Ӣsh'3Fj^W7{1udhCy!o\ >btL9[S (})!E K,U)yW$SmI%M.(Lsux~5P|ZGRR4jEAF $KNIgr* MsRW -MA?Y(cqJsc=fMt˝UJPk` UlW\yI~}J&}1@n^}N}|m[꥟.MWҠҌt4;l`k 7 d5Qb,;PB gxMWFwu"'Gh\*>D9a)@.'+&) PtWfٿ2f)'Y YkBx-q-<՝GnR|ߌܟ 55_^ϫ-S,x54AiUW4m]dZ`SS9kdED-b'-gGD-ƎQ h$p84@LQg`JYDh%Ԕ,(bAeGa϶/G@(ZgZbɁ*Hsl9v:cL8MNBY:-0@`DڈiDY#أͺԛ헨}0hqQSr@VĘ9[2elTplTxNLQj1L}B)u8>Sj(%h@Q)3ւIc8*Jm)JQQja.do ,ÒІ3Q & &R mQ .Zt@, YCJ9KNMr<'#.*m5Lx RIi/ȷ=x(T]Un9ZNEW [gV1Vr?wc#]ڋ|Dn;h~qv!BYlBMB0 l@E9H`'YO#BQ6B) 9Kgf6FU 9;6d-8k6q6<6ZDm 6{ I&RUg$]eEL }Q!>Mo!6~_-ӧVhtcɑP +V$:t{-2,s2DM!]TՊ!?qi۰wo&*  B$SbhX$"MJ,FvddW3 ])C|P >Ŝ%r`Ecuf] (㈶;"<*X;=ڐf&Z[*8 )A9#J:px"E(TLUSZG7Kʋ8£<ؠHYkѸ⇖IzX#]mOK@'5m0mI^%/ Ҵ1_Aˆvd2@@fQyGWly|m3N!H.PP(wQ#;NΗxsz6D1&u@&*ecQ!8uB5r7r~qƯ/qAl_Ϯ{*Dn3}9fN]n#9vm^{0X`X`g ^TKRڽ?d˲neSrLLfDD0bNU ;*T Tzj2}B䁺W_BC|P?O7(7ˠȧo_g 0`ꓒo?DI.Nqf> _3_~c+c =@xɏɓ7_5|5p!GՀNqѡTaŅx ѹ"d١"dYu64 ,FsО9'N?Y䷣(Mɍ$@i#*fmHF1'A̅ƒ@o%r̒arPa|lhCIhvu;-'ۇ=|dB(V!HL6Yܳ/_ sR;,d<INp: s f~䏇$dRp[xrȿ&cj˳31fw.8\2NOq,y0<~*@.ZD䝆MjzTW .ӄT].Ӆ.jzt]VȵW}sE6rtsUf rͨ+KBWe]*uchRtAr C+즀W/sӻr6p:nýD߸uTELl43 ɨ-*)B3-CӕӕV` X+S"z d\cOЏ\K2}.zeZȪr_b!XJ &( &2 )1iryT Vei~؆d1]|vʻJ nDLt0gt\j8!hPXd%~SHLIk߇O4K Y2FIw24l>xsdPY"4Fta1& W"YjQDs^U˜ 5NR5%rf*)ݻ%ë#4-Bz4iscu<CtgƆg k^z~Ā;:=c;H܎Y0r=0r>~s'I4]\C;ɽSMgl[?f2g;A\ n/w7O9]kf; НRZv>Ã/yPjv񙏑8Ǹ-ꁁ%NE'dqx=,zխDzovsx9|wQⷰmEߚֻ]g! 
@u@֫}#rzD>:Cк'(LUCKܒ?O3 ڇUt[t: JޅDsQ)Df dy]m:}]_?ϿܮpltI\KM u^1E~I7="~<Ǭ@L'IiZQR-8AOr7ʠעm5"Tp 60`vyJ &:~!X0tA% l"XH~u`D<\>") ʂa&dDJ6,䁳Ȭ78 Zx ̂V"l$xmZ9 ۬PNFxf`Ԛ^ޗǰVxØS3X-edrO sA1&8mXWWQ*+ Co d.d,2R* r@[HKPmy tISLL3/!B]Q>((5)+gړvhčNTnz FSHݨK:,{4V$;C]VdMԚ8!z됎M1Np#е;}*=>Z=Z\$8oHNɠ)ehjr#s>(L> ÷.zz۫oM`6~*3M`}v A>'0W%MиWePUYf \e\F)'&jϯ5-bV 6iz)|9Nu{kVwZBxK,uL fx*AqBPRhό`RdG{`Vc~xKGh6w pepn?9&?6Nć|8zt^6Ozaݎ]SrJԑ]q=lkpOe3OUcoCށ YnV \ !ܾ\`jrܯIBbc]Oܐ'Ѯ-J?:혭?0UX?r7# "ha2ث᭚=|j:t~C&0?V KN䞁_䷋$A&A ̒ېg tiAMLGf=]mP1wIs%>a؏gq:۟vyx5ͮBpؿz~{nsf͖zM'my3 [-ś#&eEw=0 zylw[4W^1N17<^Cw+VgV5vX'ZY~1cwNJMꇶڰyOpEy|$+5l\Mχ Kum{K2;NGս`O˿sZ-XI͉>֏?2_t%RŶO뷎(V=#gK"IRtJLU`҉BLVxV"%iϟ]T )rPNkhw\i\1yŹ ¦LjB(DNyz翊ߩj2egjA/hIH/Bk}:"/6*/wQm"}X:}eT&By @d *& %`g(z_H !H@[㱸&"AHڲ6q%HH:]FBɂc\+˜z:-HK;>!dt&c/%ax4t=Y.:;|Ķ#1Bu$ۈ._-j|Z~:,Ь+M+ /A# 7(7q=?¢@k&)i*ұ˅e lU'6Tw9-[x5xo&o G* l!\} L; ].9߷scP ek"AZ)RYȼzIFI7DC՘?3N0פETJ_4!yD7FH%JfcVQ. v˝}շorPMN^Z) ϧq,Z}+I&meuwy-YD{GBY* I H%dSYbP!+ |BFƪQ[SYkkS\hU&eDl &͵+dUզs2G)ob IKml 'gvy+um0p&~_h}4߁ŮaL d'dBbѤ6TV!bL. pX2&#fm[.{D546>cC8%R,jFM(!!eX$ 54Yױ^ޑ9Z]ֆd@Hs 6wm},o ,pY,śFdz[%۔nBV}#;VkN m6"mD!v;(v. 
{WֲjVȊU6g+-mˊY)+'3NjYxNV˩Ӓ^˧_h_s>UR|^0 VK.guUu]ﯿnVyGfo9 937{uo87;oe>çV|h- z*߫K':lvi/'fZ~Z=LR6UOs|P_z{ϫ3zp>{ɋtԍ?׿, P/'u _:.~ KffW6g'+_>_t:~T)5b騕!:Q)ƠNb(q|<,(%GF͛eBڐY`!Jr&0VCAؠ#r(E!``a](sLASqYCv ,dG1 $s2RV#uJu~/Ge7+h1IQR vDy:ph4$)'V٣r^wd]L;\Sp@Ve.Rb$0PJ]|1y)xU\q Qi>,DRh GC)Mʠe_.$eI^$l]j8{jЭU6ug%iab Nlrz>$BNxȉNRz'5n~=5Ȫjcv&¿u>W^:E0PLFcJ^碳I0b-16˱6=ڇ^U۪uO,uW8zЇ+9 MhӫͱZ9c#gմnm6ޏ.w=ܣyg2V#M;)7t {Ɏ}Q'Ľ0j׆e\?# cw'^f?g):`s|@Jɍ⁊q8bgGN ''Ţ=ZGU-إ )5@oAT~+ lyyvnK039O{KuO}r׭eSVK}xasGq3HܼrӦ#@?e-׷|5g?,nzhOfeD|[},쌸| \ 9 WX#(O}`L|0m}W SMA=ˤvu1Fֶ^]"~ɋQ I=X5&ܮ͟2,$U4/״{Vt<i7V#Z~빧ΗM;y=Z䶭Z]x!yma2AvM5i?XFK1q8N~\|z 9H̤<!~"@A&kG4k"* ^*S<<bh>/.}jTC&qPAl*Sؙ(y뷾aK@ ݦeWOsb=jJ1~s&yWGiiiҗf-мO5LNy :3u#la>(38Mϧq=ߺE3Ff^WT5audv~nx4\>~!\m3lt'\pJRDH"ٵTyE^cfOQcT!hA* RSEAZ GƒM`R,dFa=o4 c)]Ƃ9&.6^o6ͧLMUզ߮LN^ZZ.f[t=MH2t-wQw+kR0 7ʻwv*6ȀW^ȐQƢ ;RB YM禚8[R⻋nKEI`ՓNT|D9RlNFERjR3kf٭RE6BX N]MʂB*:1H˖ٻ޶r$Wl7&'H n'40ӻˢwk#%Iv}Gˎn((}8#b}5Ψ8l>#^zc9c@r6"BIs#@Ip%m kZiԴְ(9pk*|_`Rt@JK98#00+<SK`V`B0)[U=˘b%/Q%yô 5h0d=u{!/}m>$gdr"1ȎKG9Yh0!S+gƫܛ%ƛBNikLR&z˘ˆSxpHNxWVwH"ڥuUaoX17Ap6aCn[eS|A`LҙBH"K.20I_MaKR8LrS|4wya`P^:t``nyI0@d62 hdXv)e9q5n deR B`1%Z$U΁z=fuA-Rv |Am5\ hyt(K!,\@LruY &{8Nn_àgCN`KeO0h&i$h$ !HEjUsK=N[T3[4Җ ʮ^&VO\faʮH:hOY21#K˂5nEmv[u]jŸ~Rwd,$rGp<{SJ^29 ׿l0F j8o_/9hڵ1lԋ)(F~6]عj B Rrj 5Bge6y4p5Ro+ h`uCVL6oa:0Gg"#9D.BJ傴.E;ΐ5 EĐ~ц,zD~$"/hOwy#svikv@)I۳㘄+C7t P;[һf޺v^C;_ÿm16`\jf8SIlSߝ HD犐ee"*~1[p\Yq)rJvV9GN!Bk&Cp VM Z€ʐcQJ:sap C8FF00[9 Zwc]h'O;Gd̮/'ÛE'v;!˦ qfo,q'OIToG`& %(rAv: }h,Υ:U58XGNsm.Ip >q1qcPAKGJW;5x d#nWuq0G47#aRw"'Uf&Wqi`_}\4k>E+~^|"%4Tk0< Eg5`pF[d?\oHbC(JVSG-22|G!W++c3kbn#, 0m裏QzIsYJRO7Za }s]]'gO\66 ڡlExu*6@B mwxwn=Bn(tبh-]xF+rKg3Z3-^jnҤyb"%5<%%ujxN^rK{NqG6At(At"KH"Dn8 56uR'XMXnJ% ܱ]8iSܧ|T؇HkH)~|uu._^,0S& _$& f)_v4 g \Yhݤsh3RɑR._IɥU.z6 gt,$$.2њee,7 "^֝[WM~o~]O{jfQ잦M׻`;6wk$V9u~B.. 2z񗧛ӻM4zf?il'zry/Z}xZ^{r<-ݝz~;Cd~<ʇ>ޞ?EmN z J4_>z(f4K<%|ớ=|~ /nn[7CgD8HMofH<\c?F50tN043 3Z8UsOu1к2Xp#^2x.t! = ? 
Ru#(6;a#GOG ՙ{!UtGꋊ>0Jգ1t %gﯗ~x]Ҳn_2(7;|%SzƷ׽YËTȻm<.dR9+ W+)(af"s+-Cwq[HIWsgmMǟfptM5?/:T4ěJ꺕t}cq2]+RM} FAniXKWܭΫkMZ@lkb(*oeVj_ &7y98$χi9K 8dWEjd5S:[;^.ď4Wr'Û40`ߠ[]iY)Nr@mFG(ryKpBz-Pݵ9xݤyfR7Uw-AyV}~;f M8YmT4QgB!,6lTp%l%kxwoH5f'slDW6oCC^gICİ:a__[&EN{#]Hi"YU506+q1XK*GR:I#爆 R1s+D@,A+,)~`L/K Ty6_&a;y]s|Le?zJ=Qfq|yi'JÄaù0~]n>%[l,')O{FxO HdD\ۢ맥I|BϳGyZUen*zdV!:EkI$?vc )\$$Jˬg&Dɤ+՞kRlCR%x:$ <RIDګa*'}W~Mo&~#*&v';l^b- WU>%4I6KUYbHҹYi"d *2өe$dcY iĘ%dzR(%)&fEt)bLk8W2ږZw[^W$㱶PU“;T.<]`ym,QobȔ dl qec2O#Otv 8`AKU&%"(MC. &R) d4d,&:Y0XVw[l?5p]M:bUo{vIj2.xT!gȁPtɐ-q$E/BIJԾ2Zwԏod2,ŪaW,b5xE4--boO gL ؚHn6\k r1I_F& @#JkS->5״kYk "Y.|(ޮ|es-QÊaD)\QJoAwRCψ'dD! eVZ4yqf.GjR`B 8hL٨tLt !LV2Ԗٻ6z*~a 37\p=QdGR8UJԖ)ew8E |,GK#T&NϜj: 2OΏo$Kia ۿb3}y_9yzX^~LY[+˾C}IzFIԺBVcRLiQ9t[/0YJv8Y>^kkU9@[tֆl,`fLE$0H:܉p½*L8Ay%b1{LeG%O^gcTX4F0LI&|,So(":pAKFϟatMrM} ɈDB@G}#MvQIEqj@bU"PMDpv4RS9Ԛc8dQ.7qNZQ,jmJƌ?ɠ@K1јmLy9)h=hLc R$.ჱGNI8=&f4w u Ýcne/h$\*5 !L@2NgYF m:g&!hREoNB@ lvߥ-Ph3}n&|QYVobj—ן.kzY𳻞/LM~M; Gںui۲Ele X?> l<1 AġL]-};L=sB3*Pv-]=-Wr [@BJƅ.RYJ_1 Yi`Y\lBk˵Td>ns hWwv[@m%0~SF],B>ait-ïVՠ5B\pk ƹp,%y=:\䇂oF8(  hUr 6 mu7<61 6Q}&[,wvgwL rV9G3 0MX-mxjET 5 ) f|L.2Vq<9#3@ .8V&NϪwe<26xGMb2}?=|vmwu@\6yxsw8$ 9l6 SSV|p[G2d;eLYKIVg>1M&'Ǵ\%U6c{5q[p+v\rW潫b$WT큟lV1+/2odA\ΊHB2j2#b|0 (a#b;L{#D!cLF̕&*-D3O#uAX Q9@eL"Yek!JCBmWMtq Bƃ>nH&yqM) !3%=d0悴X*SWTTVjj^KRf jǚOXd>mFoI(pgfT$6S(JFbJqe>uI'KxQhMJヨKi QF012HxcLLeT OqaqʸBW+*e9C:'垻4caC?LE^ >ƀJ*BjU?'"w1U@K$@:%9(&UcU@,l.\QsQG&erbil$|2{\JJfn(LfSrHe@b<L\ΊJZeWMU˵#TOrlŻ@h?Y$>{LHDz/wR=x Y0zOh<7ׇ}+!|K yt4=KWff_P r`4u8hzI7c']̈́8H23oܯ{O^?Ls×=GL~a8q󳭫sQ寇=,mvo'R(<^;҂3wjێɲ㽹̄=dHě@ p WmVsO9aeÀؑ;YwQXTN\_r=$Ø\H&uw>(HRዙFDUhq T ͽ irx,DFsL*Y%y4ҥYeXMX1L~ѵLJ}xp) gΏZ[;EW&cxv)_Iyk3zeo0 wR+Y`sBʃw#^R s61BB X4s)1i8ㅐIĥՈ|ԖhiʶRM C!Y 䖂V*YiA-f#AB!@`J9.tY M- z B8zIUHIm|@z/ ,%S˖貌L;oAGUd< 79-!QLdAl6 WRJ1-C"Bdc.aka*ܛB[enfи[׾[gd_ӂب^!'Js3)).Hb1[is]#WHxqڍ;gd22mg;V ęsM P`sZ|E\7X:( Ji?!ߠ԰kZ^rUZ"C_PJ~\>reT;Q1 kR5B_2i?7zMxq)ٻFrcW}^m}p$v4#ɞ͟O%ٲmn`jUY9?ji6 ݦgdyςg~Z )~}#4`tPe"M)jniG3{~6s~[nR$DH{9 j W U_-3#vxNɍ ~=H\5:JaA1;ܡXh!T*>r߯ 
sB#a`.NK\P-+s2ѶkةD"|R$A1ޒmX p:'\Ճ_r.RjӸ+͸|jNoWWgR9_ۦW@vA}j cS[_;)`\$dUʂЩ)VqTȡn2ëPstHQS,Q&^Wu,$.[-'YuZK ~8[ˍb%B_ZrpBfB7XH^3V@=-z3&qѹ-nS,k~Gt>U_G)}.T(ޡroY +׀|z=l9 S:;%vp2Y@'[([Rd>;ݪ 7Z2Gz'V%͈* !2_6MK3*)Gir&KxF'ZgcuքwG@%~בBA"$!]0JhLX ϑyɴh \̴T%XRrB Ke)ڌLD`{fcB5(̵몉*yQT|B9z֕5z:&R$jl L|4r O깈թi_NWVN)w̤GX!O]42t ze9(&)Y8ұ&b@ɪD m}dApXiR&Gyjd* cA,yl%:% ۔GN#&hu"GwV8+Td82*-|ԗҒOE{ 3E]qjAt>[=-m+4Z4TK8,)22 ~yeSS#!whW4Mv|1H rwZe*Mu6;5O|?FcՕFևՎk<{.ݓKG̷v;HTsumophW}s[[Wޡ,gnmosꬌE^=u$aym:{i=ChvS)Sʹ|i0rcj9\Cal&Jm ծtF ``ܞEGs%$&9z)nFn('I<>rl&Y!>,2r蔕RT"WΡ?^!e tX2Tn5r/"no@L֔5%kMZS֔5%k)ZS֔5%kMZä֔5%kMZSfk)YkJ6:½BdϤ5%kMIkJ֚d.рЃ9;jw;߾JY Lɉ p 4  >I{%'?]SNNpY#Yd!hr 5JJ:p\bA@v]i^ iV <^^OY.T.Mgn~Vh~v5 ޣ)DYtm=DoM5^xvˉmW&rJA+rỰlgO}낺|Cs?6Q|2X]s.dΗS֯to!{қOcM! J7珂VnC_/#!ӑ<;-t?xV¸_]\ԻQ̞X>D5Қn:$H?Vz%Mr߀gM$Q)& ϥ7>&EV *pLfYGy w]^'RF0]Wv|\dz܍W_CCP^^o /mf7ni7oܴi#b{3B9NgSpX< \nzӢ9KT({_6 ;..{,se6l~(nŮٰQQq<5v\EّeZїmiEX-=GW&>z:]昹xdݫ[& ".Fcaƻb)-dI^}?,|ugN]5IC_/E_Uab݁b-|5k;3J 'F?]2b4BR)f{cpJGâ̩$-"m֚ABEU%*;ygq+L4~%V ƃ l1nOǾ^g.lz_gdQPJ⟖^J͕(%'9OspS\yspR9v6T"h+Vb2E :tY@Iy"PNg^]6 vz7'+QtDYh8ax)bee,c"f^%q7mĤoCLwncyHqCAY^[+H_T̔CJ qqJs%ռa1QBZx+E h Y(5MRB I* pϵA4]IZ6C_B:+iAX4F4|`%[*H4J)dYEJME?L,z !yoJ2B( UYT4gfHonH=yed>kc8dQ.yNYQB FJO6%c&ȟdPvB[h2+4%YC4bKJ)@V)hRw!mi\Sts `b&3ZP!䨙%UA^WӉ^/;{U۽~i,|$7}׉z-޼Rvȣn:īqZ^Td.uQa_FfJ,]; v^ze@NPc5#yX'߇-˖K%5& Ri,Oۈ1ME4hWRX"Ƶ[SGmIolT &]/Tel=x(]OӶNY{@oHl33E(_oG"q4kf8e:dwV&iDm>)38MG̞׺\&{?xz K;m޾)`۔G.| {b:sepVz=j;gOɀs?Ch=s(DpM}(Xf-0'93YQ"9 ri *E!l.yCBOvAHa reZ3*WgrRMB*߇_MO^z rm2~7ݷl/8!ny_G2_?'7[SO?~^y>>YՂ6 H+ $H¹Y*d"fӒt9F:+ʘr=)`z.E3:J8%\*{jQʻjR_(+BltRwD@h0~Iq=,dz퓿йx4p#.Q2XyHbrO#Om(.)ׁW24E`{PܦP:_PH0T#Snj̀6ʐ2 T$%Cմc(^[6JTU,'OL)}D>znfh|2fnфPl9OUPCBo$Gʐ r,:Ă&(V,1c,јIK8ơxjR+{DtHge/Lo^[#xPX|*Z,LLڙ`H, umWcʺHK: kUiV,~육4` z(U4¡(Ϥhh<)Pm)Hfe#3*; @YI7XJ5`$-* 3٤R> 欨lg9ΆG."_GhCQKcā$ݦuڠGBFQο2ʄ2WAdL0W5x ~_:{Q4p³`BN`1 EКG UF  jUZa>c˓ d eÐ@V^X!F2 B٦b)(6Qq ӫN|N`@ 9Y1AQYI=Y[As줚h0)!ɖRe/wH g^)t,  j4ݺ:f j5 yuTny^fVkp04^<=(e"ezɑ_-y ݘil3%%U RVJNrۙT*Ȍ~A2H΂? 
"Hv4N' D_QQ@d:c#F&t#!\źNFw9Wg<'͓wNXࣱ&'  ڨ"`+=1q-!+>0wuLS4 ܤWiBs栴)lväPqe UfZ0gVu>*zo=L* VT'>^fIcHY: u) /u$fX g>E\#ϥp6Jkh{"%]wX8J=~ڿhL-δiDv MU#8wWz]hz6S=9дCh*d\邰.E& Le @N2^?".4=04}h+pޡq:dFxmwXw}}vޣGo])Î龸pvGv\]osTK*js96uH DˋAee㊆qK}5ze^!p)1Y UԈ$f!8J0 [3>d2Vq<9#3biPyٳȻ@Fo-: ܎/FZݮ:{~nVwB$~r=mSyFbW?)-h4Aa pHVHI@SS7R`L8ŭ͒3d;eLYKIVg>1M&'Ǵ\%UmۂG$o:5wuԉ&uh}L1!WُT62}Ǘ˩5o"iPˤT-{/{k_=OB0g㮊]ik*Rj޹w许tB g㮊]im*Rݹw讄bҜS0+]sqWEZ`H)eޣ*]cQ$-\R[239>~X|ǽ У;zzd4hY|{L}IRj Bi@=mQZ{oOseyx?QA m<j0Oa< )|hy83;Sg.dtw_Я2 |]훘 ;XDNl7_9ݠZ-Yw  h)Bx*b]̄Sk]CެK+ݛ=ʛd\Yd8A#Z p'n 53(lpW Fi3EJ틼G(,Q\*qW(f\UV>r/RJѹw^ :)9u *}J3ɗ.M?TT(埧z_󟖃Y.iUy-\@e쉕4X6K~[:Zn%Uh!^*(YzxbR|IRᬳJpe(w\?C`Yk-CA u8>7fU#/93]~Ծ)?%|M^>xH%('bA|}Vb|I? T V˵@G:ja'=;¦T-}<~=VB-%L_&itnTڥGf͖)Pę̦})MK-;huf"GwVn5&ΞQS=0G &j oCATryݸ\gomSWzŏIR,VV=`zzûI̍ھBnm< 7F( vޘG\ f~VlMuҚL[o<`6k3oؒ`KRk6ϞhFO^j] n'v[y~u'hwD'RpϚ[Sqcg|mN>O.|>;Tt]K[:A%!+4⪏qA1t&i'JchCjw'.>V ZCЪE1jJ2#Bµ`Ȁ:K = L f^fwPlvFf0GI<>2l`&YBFg}DYd)+9MQ=>8nL1t?ϓ/w3\ԟtv,K(7MzVP(:^u( )U؏NnsJkFggxSu.7Ec xRr1ko(2أmN{f2CL/+2$M!#o+ E!#42;ۼ\(TN\Nr=$Ø1<;|Q ‹94J&BB T ]+-1%cdSJQf5])qvG~\ oq(f"I?`Ocu2 l*SRohWl٤)ާbpnӔZR;L}ʃw1\BKjA{+!5k+T )̥8QƷPB&Vc飶,GK#4l+穉M޹h=Jh*gVv4lDa^`$:ɤ5R+к\ de)6˗4/ ݄PgUpfa :;]Y/yrc2^>Ejٹly.ȴV4vIjn6j9-#4C($3egd+^IB4:(D L:kR>Q.BnE؄v–Šnsajai&,/Cܝ^>L*SKo|:d'vO 6\5'˓m!Q рʘ]JFB)KVHN/#QdeXɎwfF;*1&zriBAL#% -0J Rp%w"LBA쀹;Hs ):<Y{az1KLPMzE& DRi6F&4x@28JDUk^)!k@@p3k0 |Z{ ,S?dSxCCROV܅Le=dJz[nc!I Ř;Ǖ0d%|\tp5ʸa=Wjoa$K&s p%;QĢuw2?噟fGŜ vxuUUe[#cef.ATM{cf/9n{Znn%"*3,1,ʌ[HD5(`'e6+o]^'0S۾ߎaԿm?qwӿ܎/kܵOk)yz}eѾ:즕vsu8w@3ٷ[~# { ?2-loDa,tvy wgk,300D֝n76(Nl[VޱMT:1mP}g&2ESjӰw 8<_Ȇf&|\GrmrO;O:-G=eGr#:>L3 3F->Ž|u|GLFϙu#TW%nQb0l3x&u2=޺=kbi*-ME&I^OM]nx'ƺEoC ' <>eDПJR)jcpJpFSIZ"m֚ABE.mh`Bͧ݁t.޺J]۳c_\ Vk6W-UN2$R(GN¯? 
uhTɢ3 =["H<pR'V$5h yWevtg tH'B;a,FgK.+c)1qYmvM;0{7\{t^D鵵RA%9ybBVcRLiQLԹ2_Z bZV@qJ0pх3,ѓ4y^jSQJ4-RZNƺKʁZ @0M.ց"- {mARosM-xBOv.*I(NUZ>vtrC ɸ:cLVrX L%̢V8HѦd NXˊghGBy ?U 5]H[ d3nbF3ZKYߖ2-s@1wHlJ,j+ՉF6i;(J8 *~~s)M>_WϹxҗ0< Q}o~Eo3?&QǕv.Eif]I@ʷ$ПUsa53ǢUOSu.x-xI8x {mS χl{y_3uF~}2T}s5T/ag&x?~4~zz㓄H 5I넁z44SA5 0ݗ"#K\nc6^Jmh z1ro9lȸ̜x?Ky,\´ 鄅]Շʴg;5 .ɿ~v~vy#vi.Yp꼯A,1qAXAإVTҁrň̜ÈK]2ts<e1..p^ZgoG]Xhĭ_DOIr/-䢩)⨝p{*x,b)Y莆h?M"$]\Rj6R|qJ`UV*ꬵʒ3ﵱ%&صt)꜆C'4{u$y$jkmd jY}J-x:ؠ)/\EW}xZ|Yϭ6(sbqU$8ifU5wE7m uhC8QeT^өNz^v qR9p˾0"s-LT63z%qT|'lO2Uj(up2ލM(U1h|k)"Cd@!h=E;"m##qf2ylwPzOɍc1By]a&뮍ԹJb5zͮSx d쨶/F%jr&ja-Ϫ dDb6$zAvPt}7#~M=i@&_Ƃ-uQXۦ0%'H=v/b%Y:*Miwg xTjc'-JuT|n=Xr.^]ѾM4*=A4Iif|>^]~s ۭ>+z5ջ9|٭WP+~]*nr" 8n4oNZ=aVث|t=7,;Biʼn}}wj^ˍuņߦ ^ɨ*oHܐ)V _Bzz/}'/Kz1%bLVZonT& σ\ūJ2gG8q'rOywv73vyG~q\/Gy&z>8nN"V_#6ג_W7o޽eC( qk7N6fW1 '~㑫чcVmq%5XWk0{bDhme:bO9ȃ .]`JlRȔ RҲF <|PF9qxy9(V7/Kc/yqƈ?{;7n횡^mc^_&h&??&QFΐRf/HVGGjUi){V5o=ITMLG AF7sr4ըnu2sGNŻwWc_^l|}^#Πp@cwMq뱿8~ ?sX4K9,qZKV;.}j}v6Q>/\}9<+\}*Wd%9 GW={\zW}Ap󋁫iW6xp5S'~uY< $s)p~ 8tf/~;C`nez]ٻf?}-_/smWlz/νk 7W.u*TҝleBHve {gW|{x7K ";)~V;b߲Nu(W)h^G1r;#Z!.$_b~1-{88땂oSsfs!e.FεT[ R-Ɛdt$,ͥȾr,zD78dt}_Y{=Dr^ۼ_:޶[s8 9RQ2Ze󬺒' |K*ÝVb蚖1jΚCn )lP[2қ]cĵo} Lǹ4"()9%m9Y2a9Y's0„_#JGr޻L-aOIISsIVL_:dB`e!7cTY17!S'SqnsO0Gt}8mϐ}p%43΀4rDlE  Kq EJЈ!yT(3̩:\ए"kVL800)=?sCvŪ5EtT`^^Z,BZG'9yɬͽ&Ug ͪS9בJMIFTq 7 W &u~B-+pE.ke YJc TCJ0@\/  czH}2H@Sh'>.͋5g_R6sH eMw\,d9X X:B[(Gn٥?gU0\AAw+X HXŁqG `Q|\ q*v~!)Q-Wns*eX"D_86k#ʵ3K` U;u)#)Yy(IB!CTrz"˽aT`ٰpX|.@Q%4v`[URE;5LUbYOUXFA2Oၽh,x'y]i ( 2ص viL>ϳ&U{C%`Q l{m@ Cw4!vh0yLWkՙPI1 5Ά*76 rmLQgʳX &ej΁Q".Hq{EUWȴtfeKP/u~>j?4✝셮T| KK:s;㠄a+wpNz żϳXx*9w)S*iϮNjp0A {a)0|a}L<.vٍ_okϬ`DnQߧku[wm+ă Q"  d3,\`k&ͻ&QdPj܆LusmV XOP#1rBol(h( PV Tx$Q*LFȼ"P>HRNKh-=ŋd˘L :'dqOg,ɂNu~d:*+|q\pێjJz%84x~c݇m~}C dRR̃Y`%D߼ ^k= .}i|̠K \ʨPKd0JiEi(v1_%RٻF$+_e<, ,3UjRCRjsXI$V$3*&2"hv)0Vcư@pAs^1\{XUAzP̨mFܶQip|^ANGh0Yc6Cg 0,:QM ;#RF WGnP\- 1b81΢b-C D%ΓZͩ?#t9b,3,TLkTRRTUTZ[fq/vTk,WH6 "ی_N%Ta. 
\ F`ac-ރ4j |ܔߗڜ[2%(KD0um-npD3 L[ +SBSv`)jYu<.֐D]k sѦr8RV,0z;TOs\`ʻ_~#|Aao:D,޴RcF6JtCK6ƹrPDKhLT@=BBH>"Kk=xP@ʭ`G`}fJu&iD  [W*-WEs)HN:03^+v*ĺB!DE HbQ<Tk Cz i,:azڰZ_=zV"5R&xnd0R+kЬT g@܏XH#jJҸh µ=k$Gvʵ΋:,2›Z#@K9056B z A,gfFXV2Za6!VQׅX~)#"?1p<{QaV;͠c6DK&0] 120+&)z,0 C(:)r ,Xc. ]Sq1yՀzՍ?Mf/nN[E!F5Y |p_7HY.7e㯿n}_ᦌXd7:CE ufM(^AͮYȾ~EUеA衺$^Pzu1Bdxn^+"η( :$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBꀹ:QN5x噹:0|Bgt uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBPl:ݻ `ub:Z]S$&:R1cICBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$9[N]W$Q5$b:֩(0zIBoQTyZ$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCB꼚PwŞfh+~}~/{/穀<_nwWh4@EdNww1"X*tEF+IW-z2jsdoԗeJm)\}) PRF*#rSl1fߍ~|v?x>Z0FLc_~~4.eG77܌};n)#6~s+ wMy8i% 80@Q4N࠸/mڊGϧ7ݵYTy|ts[9¢KׂI|?կxRզ`nUۨ|:V%-[iD.n{UkڳƎj%}hT]XYnoړ".5jbD(rVMrN]|ףIn?˱T&v:,R'n~@h?[ANzl,klG~1m~܍8tkoG[ߪ* q/ݾk4gm{}4zQs,?FTykhmK)#N~ralcdv̏k~t]j6z qIaqT}Hu[?k ]V8z7oa'z B +VRV`6WTU1ZYW֔b ǻ\Z ItjT.FɰNdȪ(Y݀du0s&Zs4u}VG&'aIjex]2WomOhpWTcxvBRi,|zE&z`N6iiPպ6Mn_j+c<8Eww_nwo9?s7x7qg;~} -oy|<̦#ncG?am?=wܳos4K4?TԩO]w=54WxⱿY}  1'ѡNm37ON;4٬e9C 1f641RkDu%Rڜ9gǡqhs 0Cm]oo51d,62R:NOnV*Z;D9`<͡|NޮpDZdTn0tV"?\[BCH}E  '5*zW-WeFZ ԈEt*Mt5Jxţݏ2_3_0:# `ogB6cb"k@sTqn0O-dhC Ni[ #8岨`pyNoV=#(nQ4w;êLi3ȬgnːB8,c^*Wk jYl*ynSB}̫zGr* +s2crN¿qOZ[^Ymo_iU%t1-`Oy>@eMS6:' V#-1Zbr-٩|uX4ڄMp=0C@83YI2#MO$!ULl:asoP}Y4g/ܭoY# uKumSO'mNB)Kz&{[[dj'Tj'LJ?yys3'ɖ~ ŀN]8*Vaq̸F (Eh gߊיW`!Npg/ $iU4)geK2ϕ('KRL:u(,Fu-FI*׊Z "-Fer2?K D]gJ(&&MԉL@[>{tP7C)YbH^\xY˺޷}OCqޛ#dq=.Q3=_䡸Vp0y 7BYeRR*uR*uHVwڬ!47k|kCsG/fTOk-ibyn~eI+YN~!N"phȃe *з.]X q<i6];jq3m}2ݧ?Ώ6l2WnڷNAl[!Mֹxajɞ^gb5{<&=WWoXGE+߰Otcy71ҧf!NMOE)uh;7w ۴*_vc/lB>{_IWv6]lo|a+: ׍|-M#0˟x>X,|aVF N}=}m [n:x6;hOr;s߀_|ɈdE/A`1I2)r,Ȃ) / V$& Yؠ^,9mLH-|Y~e9aoUƎP)<汏t`㟆Mg޶<^dS[ow;晾_\k.`8cO\j:9<9X;,2Ak]_9PuzKVɦ"5*hDcDezmߨ:8w.;'+^)#tʚ(}[@̲ʾ 9uP*qZ(eFbۭe)4dS}sOd px*"X(F׆l4B1y)UiLdƖ' 8O,|;OsJL8ڎؒVq /*JpQUNҹ/:e ҋ#9k;x2 4DKxLgAA?h w}=-9h=.poD!lYЌ V\Ol'}HCO"ٙ/Ifԋݮ7eY.R5N{o2Ё7Q&VhD$-TPuu HPe+JM+0 YX7EFU.znje{WǍ[ "Y `fq^ӽ,XdE3b{$E5EQ=3lvǯU_&cwIkuӂSl4ھ&=Q&8e랕e}zr1m%k/Lv&~:;ï7rݬh.ܡau5y.K&4%OQ#Q 
S0ߩmlN)~9|6d!&D*u\KfNQ,4Sk*eqN.'!2aw{ cbѹ$PXtY+&{k(F`$G$Xw.hb۠NRko+[Y&m`_9FCevQyLY<~/`'`ǧUn9~NJio;KQʻ[OSt`:3X'B0 .g;抍s 2V~ׁ6gAG PTj slpZ8l8nYpe< `, \Jon)'7vgh}7'L\E&+5Ri[ǷItEvaldIg-[UR!z2>a-gl"ʪ2VF%PVJ  sX1 `:r9DQen#H1tj]_|kb ix1_|2pA"ޛqF O0K(]8Q/zV㊌U90 yw n.ENou_XK>ڞ L)W]љH4˄%\BO!z'S{$ )e I:O#yQp^zٳɃ^̋W}ԕ%%#J$16X"*R K#s=X2/bc@ FGFݜċda\( t{:=89.򕮙>dhx3Th4:Es_piR0b,jy6VAԋBl9sW.kʥiBn})}O]OHMWVw^>cIa"7a6̰ܳƊst|#GW.e{rm>~PNmιX;p|zN1dT*XXS%I$, cAXL1)R%Yߘ8 ڐRYfKFI1\VjŌh6|+xurrڶez]{{LCub#C=Xn "8D}v~\zr~\J 8o^wjH~fW}=ƃz]DXQC>+Au}pRLT @j&K%g#xD܂:K,SsP&OL ĘHQ..N٣,N[*I0AX@v:!f3b>@WTG0c%DM0 !mp59]ǯٺ1uXzcپWͅZ z F8gϑŌ(Z#$#Hp{@T${1ꏳAᾕc<*turt_˙cc NhHX7bDmSP]U\(kQ?D,lYki#+Y]D=jpiՏQ$甫Т+&TjlQGzvGz'Iܞa,L)Pm{)b``ZEvw}?)_FKP+/tx;ٲδC^y ?} izch4[ j6uo"@doHWCl QzWJq֧ˣ-_¸GY'kG9ߊyxhE ,LƱUpiRJ(QU>O,{ѕa>jLLH {KR*lz`!O=GSo73xլׯ.;y }ݵGo/h>ɥE؉HGvDz^Xlu7BK#AD޿<)uA>5/2^BhU8D(Ռ!1'+u>ݚ\%LFեztRȧ|e.k8AT~J5ĔfF|)!>It z'ʧoj~ռًʧf"Y$d&ʽF:"fjMY67fL7Z1(_! BEhkm@`UlGWQC 1.hPiRJPjIzmfJ=l8Oso}Am`H٥`D]\XYqEp\El(^W6/Q.sa5BVfhQ$*T1%ı \DhaٍsPAV*C1Fİ ₈wZ`uo@͕X {lVj漩b`>%1!BN3ꄅ)%JH0Xz`D6݈;[͊¹is<ٻFr*}I\pOs<Wdruū)9ʨ(/)*~.q]ŝ_)9OhjFS֬D4FEV]#u@V.c]zK;޺Rпm=<@^m#6EŒoX8;|ˊ\"WzȟQ+ٴhu.UGэGՀG89B(kP:=`KCCĨt$<ĤK:K1-*)`,/ZG$Œ 꽭MJVy e?庲[jpGvlLUn)VjN($Yc*P*t=9[]k4Y\D5-Uog]H4<PBH)ٶCY*8(d'Y-e铧:YvTEylǘ@s$M (}pgc2lg9Ά-ggCQ塰12dfѪrCfn'2zmU2 ?f#:PuɧZR JRy\"92YRbZݠV Iq> @pTl$ȠY٩hy(ev)g6 %ks2Xة GcPtPnOAJS,8%kͣa@xF b2=t`Bhm3",[#/{`NvSt6u׍̣FH*q!C)90Uq!8D~lyK/'qu*b*/ H6e4}r3MgNۘK`7}Xelty^XO0f V,O|m1f+U@XkNye||ApZo# PzLJ}׷wӝ0a-33YJ'"YkIF {'z<9`=mMHBɲۺKX-I BN #{'{hrXo]1O38g:;„YwÈ6c:HIBbClʷ߱ۈ56g.AVzB-6ʈԧDO x6=>%R!NUAi(6fſ-t1*3 DECS1/ّFUY%x]ϒH 9O,$ng7P$c?n2]߯x؋ݟ [6Z"|ZYS#˿4$G}LUQ<&Bg- U;(Ⱦ$@R(P4_ȤXT>%(L:Q${tgoYg77wu[kgk_ǯE2a|S,)c9Nr3pebUdZ}"&wIRvFYÊ!=S01 —TyA:?< =-;kB1\+'nYNW:"͈y#AUi}l! 
~~ )&}- qfe8CW?q-xh<>Os67䑂G~]Ώ|p'#wC+~"g}#^uύW?G<:wPҹSI~(EȬ:zJ,XFkSl.`1hّd=zs6k2$KQ8rծef69O #tFYغBr#}` bnCCkFTd0G[7SJ#W$c҃w5JPr&+"4,)lW 4z!XpCkdUNJ늭ѢؐwZ3)F.F;J/}i}gۑ uuB~ +eg:-:RhJC$Q9?FEzxw\#vrpOj7؀MM&IN~َ@h3BGBbUq?ޥԼ /XQCVIpS!G,t6[F;k1OrF9ڦѕ*- A(;84 d$ր*u{!Z~ ߷=~ˉh9c'u2=gK>{5۩ ӗ{\C>rVcקS)[d/6[!{tPge7ǏNZ/rZ#9M{ss?n[e_-p>cǫ0u}^g=wK@W+(RGzFcÁ!rqcK5\^E޴jcpbvwX,e޾ 7:?A89ZCZ|^e ^VnfgrwCe/*_ |ݗ{sizu3xd-fAdvٷ@=yxڧ|?gql|idX%j\C?ph-se9z-vu5ʼnbi&1mQCW\>~ɥqkÅm{qͦW^߂_X1ܐV)q{k#]rz]Y|Y5ږ^YxŽG~ҵήW&gY :F|ت]K5Y78 ?d9_u/Tz⬛=Ɉފ-.v;R1iP|q8vJtȍvy|04+ b'^`5(2(-a+d)HfEveX;V*G;yh.{?D7W-u{U+eZ(o7P6N3=& dRU):J:4Q?DGb{[jB d;אP,HҚ t6Jh4҂uLn>YoC]=$DEA;D)+CbEg\9јNmLy۠i8z1 |p0!kYYݵqRjf@NkC`w {M'޸{jIm-ޓTʳ|o[~QJId7axqu1OXnR oz/c*&(QVI&`DWFBEb}>t,[Є|ak;k\0*:R\ئ[K:{$@]<Z>A۔F} Z`?[U1Ciײ\xmFb@RH |&B^z0YZ~Bsq?wu@^6:*Ф@Q)A8DG~S*[Z͏P͏Uk3dYh.d)?{GcX$ ,CY,ob䝑 SsJ NSȪb-*&1E &,u͗¥8[EҡQ4bPUW-8Ps :u# [= B.!*K څ9& jIՊXC̦Lq:vUԂPEDjHV2J&IQăFlv轢YQM簢`Xj,VcO_AMNZޅ)ﺸZ|ϱwh|;D&,xv;gˍh($B*>זo;(Q[]hf@1uEߑ3#{+f/qkrw6QF(KZ}N;Z[R:[n9land<ζg[xtn ^9NZ~]_ [옕$ y˜uU0#J8B1VIZlFeZ 4i Lj ) 0JUSgm:-vwB̩Xn1mg=#ؽɝMp(9ScF01GMR@"/%풳 "Y! 2(Qt2ŋE%(9-(,HTQ ?pu!.UŘ$99q{r=9QT)7}ʝd 0 V7XfO:ozȵc(c 2a?c )0suƨ&1Iַ(GyKʎCWbL P < Zfn1";o$&4:ޫ6G! RI wVx-ëa:qݜ`"S*!pA/I^RgdgU|5!}~=+Y+9BOaR[#E >IVDrV "Et"}Pƥ*T'b1gݦH=}Am}#'V$XPd[H:ufh<6V(+ڂ}tTʨfA*<~+6ԏߛ#Z(ҭ~(y)_Z$uB!V7Χ$8IM+.Ǖ (\::1>,ᰰnv.cQ:.cRgdPw"54ڄTA0{j KEmM1Y#"[Q4k[[#l"凞L Q7yAg۰R?B=j :bј(V9r%FF G򡖳?G?c*6&!OՎν$4[^6ǚ|w !7$Z&!aٞ )s4 eKqdiU0b<ۂXDkCd癰9Tw{ȼzg=k,:2"˜)Pf]M,A^Bv S%sm>Hqy#h>"S]'|j%,c_:itury3/ژF6)`7]H0TZImPyo;Lj V /\0f$N:T/_~GgzF3VLJ}gگ\D,<^17U"GT%1X`rz^'y XX.YUkcvC TQ"JbT3CA{9z׍vi!WKWWHv0c.Xҷ[y#bў+Ҁ`%].r[r-pVix7ݪ roϵ3M؅OK4nϋ}l>nfx+QqfNY$I XFvo:7@ѻWk%F%[ţ4 (}ˀ{=>^=v_-Ͳq-N>׏'?.it'~痱ujɪU>syq\ N翷yE;nDT?-.l.N?V+SYax] 9{rcMv@59=R 3^0j\3j\3^TCt8nU RF"CYF!S,oCK<5Ρ|)V}0j0)+Qs>L٣<̇ Y:$}阶"bjGYT)ơBʃC-&+Y*y$\ޓ2'*hCݙɵFeZu3َx@Gf#Y8߯\-_bɦcpClɩRmg7'3 Mf@p])T', 3BVޣ\65K&ƾ,u;-V %8SRrlLyQX4TsU54~U׸;nyd2úcHdވꏷzf.=F]^{ߺ_I2},#i@9CGZE!Rmlwudm7s*pj7t{m-7w|Gvo<"V5BNQ: Y`[ɠ{WaX?/̧R? 
[binary data: gzip-compressed kubelet.log from var/home/core/zuul-output/logs/kubelet.log.gz — not recoverable as text]
nN=_ \7DsT}8\mr?{w38g/ρTqjBlmӝYGYF%0mF_X~sʍ?}Lpv^]yZҾζk#A7|IRQ ꛑ[;*A@,sٕ0gyfԴ"j =KOk\-b7/bN ҟ&jZڹntNoHvtم"e{ykD n00BʫGׁM-c(9.QoFd;VڻO,ݴMzxS}l*lr9O'N;gqM83*>o?uZysW~FjEO]Z;qp$γM@?Z:#V!14e}44}`⠻|1un$Te9U%k__j1I] h 99|\ay-u?^Xo~Uj3,Z~sLbSshg_f`MGVE.@UDrPL)1T@d+́"*!\pSX/sB !~u4>ե.hV0Tao qˣ(ˊWc#7QD ,"y}n΢>rsͻe𶰷:j]t`{77jQQUۏN Fd5b%tϿCJJ:Yp,Y9O# Ŀ-3C91EhRڽ9Z_}ٿ)Y>o[Rvx3Gx}~TQsΝ-2|z{_nꈭusKu\镖L)mR 'rJŮC=]iUO ̅.zW.5,;ߙ`LKJeZ'ւ" S|5r1K>g]6^QY ]s7,RfY6jdP0!)::o`( PhKB`8J80{e'#OiHfQ_ii-'-ȆR!rtv.aMA mV' .kl ؛Xg9"㈼#:,W1A0/9 j XG<lj<8ɔg,P9~"P$WP  o6iq؟h3C1[i]X9;lyl\TOh TopHu 91S)YJWxy#rؾe&8^'?%"iNH,rjhy(cCsLq55@" iP 9uᫌ5kA"6+ (*B(6%*H0C1Xa1Ecm0V. A3)㭦EnRzIN܆; 0jiV:ױen1xk $Bm3?[{ _)@y{.qn&Tϯx'<Pĺ]DD^ w}ldƓ邎w(CJog)b(#1{'^' .Ɯש>Fs>88U[^$ )w?nH_\k~pwl$/Yl=hNq_%Z)ngǚY]U,VYbTNPMK>D1pv T;=S"6Y tLJ},]v5Yb\T&.}=xB&{Yټ\P;u@^O1b>A/I>~ƹ Wn iZobwm{(A.X_Ûe{6.kFvE.ߺς:jlu64Ȧ֡]E9_%Vi0A %2,:M-Ḵqrdw6k[KisA[Mwh *r% O cF "]<3@8W2FSөDFyݗ" SPF4#`2*BȨ pB  &$ȏGgTHBH"4VZ@(6K2%5D a4h -PYiFCcj,"š-:*@wrb0vI/-srR"~ HdB",<y Ϩc)As N: 8#Y+XcHSLR)q-^ 0gdyVm'By& (HW|9m~~\6 DB\q6 5?;dD $՜;)wK]`d"07Ȫ)?nOYVނ3](0-3;pW-ޣ3_ڣ&_ " u1ap=1G:<spl =Lb4! 09SLU_ZR*emZ:2:b%ϝG dZ޶uiݚАo\EkTU߭CsϺi".w˃թ;Gv(ufFnMh7E"oPNUxn;$YM}2 QG7k<E/wά9(R|;Sop$ :ҽ L _i-1[2e~JDF0]A#حIcح~ƩД4F,t/DLD!M&@;]*GR8 R.O(S%^T#А 9V;A㸥.&}z7~LMp6߷S~^Ih*_u(\#YH4WkKK!:XvJW8 f;R]:"QP3&^A3\sI 0ݠRwѥGwM:n?~gHݿM>[  cn%bfVzD֡^': M<*R$i^+i@GߣkH[-p^2O5y2|ލA`1˔$$YfC2i5n'uU9{AC 7Cw8Ԓ4_0: q֠0\ DU)%H 2*a6o5 J8wW`bfLf8#mp[Df>7[-%]k'8G"Zͱ+cAAp1E ץ[tesuC_Èo%}=Wڬ(EDD$g&E9ڥV1P4|5 RemOta٩ש6hHί==&TՒQ%>guD5s>3Bo$SYzK u߭ HwAU;6-QK%Da&0(EE i &ɑ9vEtY b䬣e;5$O^IW9/F-qVE o0]ͣ^}*U#r-FlWk'OKg^&p>^ <챥 F\nOћoyr^;R?9X20#yQF|rEs*J4|3X̟pUM=`Bj&`C`셌(Qu/d$FD#LQ/ 8BF.ojU[-Ίj(YȞ413mRpNէc WeۑCs.MiߊLjV޾( (5SHvL?kJf%>VV:C P ¢fhO@(tSb(Tn"Ok~D$1llh\DtP1QEvGK1VԔ>w{WwGHyU@mzӌi0B+BIL| `]ɗqHCe/bZ9T>1S4;fB3Po%cα@Jsd@םV躓k! 
# zIr.թHEz;xƭmr);]/5bhjMb2g4N ,I4MLM޷2bU4bb>?a Q7^F'JxQr:2 J 2Qf8ՋDqѽ}[2a~9]҂h^Vl=_0'іTz֙VoA"ѲzTN ~%y(VW՗j zT3ޫ;#~Ӯ">(-HRI*!!U`}p]AR-Hi΅O2j\VJ \ۥ$ZG^J q Elg\BXa԰$46zwbS8P kKʉSZ8nʐF8d})0!U"e2 6{]lpLnۅ)ԏa_5Ǚ_hG".zʞ`OC<18b&zxg=c-8+a|!'7O;fq2tů@!ڂ0Pa:BGRpLDå!~5)"ae +Gz1 o+wh_f/ z3"Yy(Y$l( lyїsP]wHg;r8dBzSluJmDxN{ǥE ܟ=W}Zk}[ĺzQA'[EHƈR\`/ѰC("3&a586KK|5iiǵLIB~9LBbjFR*Cq8X:9-ti8a9x$TgtQ_ LFR}d1"%.7c86R QNm|9@r񁗁N*pn9 #Puoǭa?VDsR1 oAH*N>0HH* J=, eTi4QY |a("5TcGߟ΍ExC,gۙYA|tc ]$n& naM8VW~ۡV3L$KO`sB 3x A~?p;/f_s?+> igde;2&)x<ec Z~er=z!l@$I=/%ߒH5~0Fk0 xLJSa[&__7NRw}"{KQ7ZQkOoC&vE rLؼ{nU+1>2k%wa1oM??ՐIN|x^r_]nX!mS+ܨ ~x|ܝ8hS^~~䍇>X'k-ANցK 3c,܁{COܓ߆bVRM{U0 WUkN$ZվCœ\'6{CC/M( ˇ 4*k?ޕu$Be;Cŀ\`r$Bj o5pB[LDTW]]ݵz;Gp~o~ŢD/h`^r10 "xcIyp)X#W~"PN;OJ߇ꦧPr_ZNG4@17T9Sy5 䰇 cA;:`h3ŊᮢAPhI%OzĩzMJe:cE+L:nŌZ&4䅫:EA`=uKo?3~x&uѹWB.:nxɨ<88PBK)zͰAdb_4]w|6tH.5b _X 3N ւIpIB#IT*Fn/e\ɰ]H[CINRr?HrwH$F KfV *S'3r$G0\oe+&Pv(!ĔM!oZ#15K-{U2l 29P#ƨR[jMf3F$sfLa=&AԬ-Gp9W_&;L].޽ɼd^g,ohza!)bj!vhsC-!͐:$ ѵ Jc-Ֆi-1;m<\Z1K i5S^Dٳe$IN+lW6'qJ$9곫${|Ƿ뉶Zvx!?6K{q"wx,Fz&!$oZdd=Y%Ig"-KvQMFCE5Œ /twYS8 {. K%i;R$'p;ҲdwXɰrszlUG1ET7٢7Ɯy c1 hwXs {kp $ B3 t8+1ɩȧUNKƤw"*ρP-1F 棊:b?yG35 pt GAu n R{e 8V'?mRQec:I rb﨓NҟL.|L'iBC^,9ՕdۺaM֭T9S:֭G0ڶnŌZ&4䅫hNIOܴncte\2Ȫ)|G2v)2RF˸АG딦zi |T]?Tc#5Oq@9QhN*n>la2BW=kZUu rqB+xϸR7>=3žs)N lnؤ4>U{; ?1Y`Q5,}Jھ r{ a|7 ]MDCo`z;`ڕ#-'82ql_YJx1J݌wiX `GpȞfe.`Ȅ0M 118(:lyAa8ZEЬcA2H\0;u!2㡙 T$vٜ'}3<4ԕOKA놂Ũn&qbGH^P0xYYxX iXyR3.aΔ*j',0Iawxw`*GKT.pQTwKrH u{]-ߡspsevqΗ+yoeu!ߙLQO{?eiK,Z0mې'-mޞ:WG5Ȯ6-8tfhMò1I›EDѰpuFc3Ek~@ENX%ipQ1"hZ4Asq'a_eߋøMB/(J4P\{<9nw&f?wu ]DvΡMFC5ZBt~pF9,hH)Y=:ýFE@R(*Rĸ^6SMqnra1t} ۪\x\z:/mU,V"˻ett٪T7vzZؚpDD/7J(HY)IqS?R$p0ie*VTH[z2hy9#'ԄcBE)f+*#sM QBcH+Jؑ#_zl=J)b'#ﶨBdVe^P_j h WRLʿڳÚ2J֑5JzNіcDR+ X{*YFֆ&x `!uXzŒǫsk3FO q6q/ku۹o$bҭ J4J_݄U.~́L̪l{cb, ,PdP곰 4QzQtHDƫJqPtDt%P(0Tdz :!C̒^T5J`it(( \{ l*+PfLzpL*-*{9ZR4?'m?vNS%8i1馽y>lMn../|q<tEo>}pd3 ZgK[{U^9M3zp@@e! tjp7|5zKXL2%}1CafO9:$RɑʦBqVDr^?+ph5 |3[ ĺv3B(Bz;ЯC Uְ 7u83lTK8WnĤ-$Knw(Jq<1K{ &uҼaVKָwwN_)g=T2{$Ls\)Wy9dG~+:3<8! 
1andPڳhP&\X(ӌX/Lf1 w͋IuvszBkͭnn] a%꺒mTTn+dVH 96\ORc8vEMBHcLrg#D+;iܽ>[p.ע[TW^=)geU)nbFkQT)^NtYypJy{w@t)Dіj&m(C-aw-&tWcƳM(T4\6mc*Af0;˶ЂM8Wm786!"坈 &F6Z;&~)8lS\TF93{~AD%ڹ kaʦ7. ./U=BLM\&cX ~Ev!WiB^ y&UuB.FpwCoQ@,0 fG4(<#FVAji&??@y|5xkƾEmƽI~!l6/?Z5B*bRPxJ>sޜXshs~9Uk+Q-L}c51$vQ*!񮢒:jA`eDy}W#Mhm$Fh-oPXˍ&XcvLS]X.cTaQD"D0tL5m<ۋ`)Q) Ke3*őnWNe/TYy8iv9k<"a{@wly7 ?oR̛`yaՇ+?,CB{*I4on^UIp+`ncb06׋j Fp J&,/w x{ C%vRN@8+%@3o11&AHh6Whcznߖk-6 #1fg:2X HIXwZ;gz#I_!;Ѓ=Zh֋ "!ٝݑ}Ūfϴffwfdd&V#hVKAD RlV|"쮺#8*(9l+))VglbppJ/@CujV&%# ׇdE xc(*3hiNEHf-FQ-Q"Q ,yu qZU*5Ⴁ|,O͏1խ ->V)Abv%e[oh"VE=OS/QzPp=Ȕ1jͼFb}Z`Z^) ]_dߕf_jukIKE㕺Ƚ({S.kG%"J1zuizuyߏTft\[?}=|-?H_瘮xbΝ+cn>}lmh {}df|RekwYAT# &4ڪlh.XQ.ʧ+aTlCkHQ6R[8%Q

b_ z8}+IŋZ^{7$@t@t3"Ө6(qXOLy͞609O V-hsdM#NgC XVtx@N>K"# NO9oE?oK[eחjt_^m (Xmρ'1Mʿ%Zx+bwyg*4?1К uNPUT*D#E;5fe'wݶՉͣ@zcxi=Mo SRRoCԛ碝ՌEcʢ`m7֟(z}eThh9>P͟BnR!*¥ UKoEiF=n]c6&<@$<,WtUQ53T3mT' 'c$z( '%ڎq:!hEw{J%RNU$9CEz7o)8ubGN]#t9 ؑu*C q8M(ؖT: RU5J"ãkEkrbsw@&L9TV( 0 ջkVƆhAJ.tOnSCD_/g(d;j385_?~^{{G\\MLwW.r)1pJ`eH:sdb)7kAQ6 zn%O(U,+U\z m]ܻbCKiȟ=ȶ(w~Į$T ^'$WזAd06z,S%,]7U]3} z/։dbf|c7+??w_};8~@h>9)W8'Ox\{{dwgÇ{ 0=t0LUd_R-կT!G* v5LV կ[JI<kF:y1eH~z7C*_*|o}\~Ho?:ZA&ȑMB9ȹA SB`LG`Ciܳi盋.~~ فR8VZ+9Vpe9'2ZF(#!Y#biGr2zh<00wQA_L9=S 7c9{c̕EJMHUK:ndX d=\ ڸ &Y0FW#+ [ yF:An [o_/l4|]RJO@pBzzrMlWuY%|FaV*/ݺ8޻'}@E^$f7oپq8  KaRzES k+˿f}y'g]}ty3pBt]}o`|ALtox RԮ rͺCizK&&1K^ (Mŧf~ ;pAO[0cgP+@)3 of_`NHVb&Mt KA?M]_i꽺3ޥǘ|׬|oJg .FyƠYMnOǨQ6W)b@ʧ!}>osP*_~& _zP/C?>IdjEwɿbtV'GvkA(ZWje͝ | aC-G#r}3W^mbѩZYzKJ>0+~\jVe.ʔe*-m7s-nYuh8=Ҥz}8=1LQ߂Dև3 RVuh(B3lɡ<^*zpUzU\eesɾo]vu~ H;QlDe eӡ?!N:/8t2 zPtFq:F}>11kRHnl!&f[]9w$c!Jcwb{?{F2%@fŞĉc;<̯_RVKHVU_X7W_ J>QO q-`WxY!]imZ bcn6c,B}"l3œ#@k)5UW qh #ȁ SxF(m_ۓVqވovbBqEJVϲeY=˲z֔յ71F3:"gQK6ɸO7&_KGy%g3򚷞y9xK> k@(yƋ˒XqJk(Ԓ˄봔 SRL)G߇ۛd`vie OOBt/*dptEH&N |1RK=w65c*ٯcm4H Ͼ١P3u ?5xsL~k F,؟ Į g䢹9?ި?ʍcf&䝌lUAU;x8^a qUaq39#FaL#6ac @W 5oTMe=2ze)#)''sĴbFh@Q 65jLs.9OM~߁z;jM0/za.\%s_>n8]8Aܤ=&Pw^/z`[T4ypՠEՠ7"x]q;-iĒ_-`f!9Ոp4ZYQ+ЬoY-!CC@0\)}H "܅n3 -i;?/くWq3WzSz>ʇw _rF;qgqhٕKg^҇\^͗e>֪Mqܼ%Xn;p֩1`lKQn6)aJasG\ "`>t+U, &NcMZ$⳥P!NiolLݨ楐,Y坠 EЈ^q@fk_ y\,OꙞ-Y8<`o~r{i&&nn:%`a]qЊ[e{1;3[No^ܥ'[bmd3&PyM`VkŇe -d_Ctg-;p@ðY^y]njXR?ޞ./jn5Q[PyeAˡdyӾ+i: v!ڻyl˟F~QEaJ+F*\X>Г͈Cl)8 Ls!RRQ9*YeH߿mZ1aaޥv#MX2jzp(KQ Ocz?QNa0@kpB2nw\{`'uqV΁E@ǹI*^WȐBQחV5`(uZ֑4`tjj(bKT^pJ rֵ"ƭ4Tx*lB ]nD4:Q[> uD= @XW n,^Jxr-jgk)JCZZAu L-y(jYm!!Zs'3鋑։_Dzs-jm3c@6ZM,jV{Q|&-W5uZv6BiPшux,/JO5 Q%_Dzs|M(,s~nl Ek‡∋7w`WW!YT]_৯V7 b1#fRփ0)aϐBiEگE\TY>|<9a"wN$JhANOI#&R2J;&r0 a3ßurw~8`W^e;xHb25Lv0qrg^ Cu$,_"S+4Rٮ 4 Lȧ*\;F9O;o<޼= >A:tCT0f hqU85cGc8 I ԅ'W}Q7-(g+<JM_Ftiҕ|=o& m余3br45f(;I&BFCߏ~|㳼5fkubP9IT{@ofcҐeW,_B2LiP;P}oI8#)+uV%v[-wFT,b XPE6'nFɸEol+#oZu᭤F-$SEg@} uVQ r⒬5ه[Am>'JR~t &)h5NfhT6>`㶗juM_ㅢT|i}Iwԯwez6hp+J2V̄TsuK9y8ңLG/ v/3|,υ<@=9ۧ1T F8-G3$;FDT? 
y-+뻋+!VDy7CPAMo*56"rd 3rO zC#c4lс҇`+"hu*z!nVc`p#_2gzNHrG4FS'0a>GGFQstDK佤WI)V1ZO1 kcDk8:/,F&5r*๥64z]j} %rn<ꉙqȸOxOMˤn ֓`=ybv>(ў_݄ݛQۛSjoxѧ˫l{ğۋ>YbQ"w_Vm[*Vm(ٻtMYCVfy,{ GאQ LWLM𮶾y5ۻ 7S@¢0b 9׭OT.mz 9b~HC̗CV5V!'pr~~bD]^oFJ4q |a℘H* DEqir~nܯUmvhF:~-L{? VBSK;u:u jhGB ?`B+: 5d`Ł>[{[ 2xB"k&7&3h˟懴CU+:'Fq-R՞6kuhuZ"})yDsU\ӓش-KiwWM5lJ(lUhV n8mv rs(:s #FG3@ f P@-&4;57{R+{R M+,PuP'B˂ΑeRg/·v;e`^^I1jnS~A{v8=Fs+ue~%eIUʪ@j@7}&<`,X(dijh"n# 8*mt5·v;σnc³._z#XwP\x^f1i_waU.yHF WyZʫ; ۿ~z.۸'Acxǻ4Fl֋ŦabY% "(0:Dh/ۦYoG󳫿?^7+_hW1H%h;/lluY~'MU0m| HNoWAࢳlFQbkGQ78۴E ~AxԶu3iH{֚V)#kc8[2- FpKF],_ .dK1DDM !nmݦ;ցV# n=rVQhT~kWkgi \A6>$ b 8v>=>#%D`։q7Eq#OnNt筬W͒pBYoS(JiFgcf1wNxa8U[(#%! K2Iy[y2e(]CL"<`,mf6HI M99$sS~-A۟a;#Q+@o^ ŅY l FІQ; fT8_Fd ;j\_܌r%z79Xev$ NMW4ן&8 &9(.]7<ta1Pml0p4mƦISv*> ěM-j6ERK'tLlBjlW7 O#v}&7>!Yx`M R,<_Gwoa'S!mhi:ZzMM g"Q&zp_Xc p@x{%vPnă$q;@Z,mqZ<]|7~*=%],c!Bc Nס=z/jR(~t>xR Oz\Z0 @%`4%~7-갅> '@8U1l+ֶQJN\` %K F9-E*"=os{7:vXc=Efu;R>b1j&m\pΕNI@Ӽv0ήB:!!=/aVϪgBjL<ľlq4̱${/>+ cS߻<En H5HVya7T/G֖U؇8K%͢Ⱦr, rQpnsU'me-owb${qA%jq }YӚ|:@)Ʊ<Ɠ1 i3724,˯`bd4M Yiq'}a܀ zb=sfϐt(|*6Yg|=hT5+V{OD ZUכR QmJWQP{vU. 
3W*M~-PځaF1vBQ_Pv8w6ū@;[xZˆlC5X;8Godv˥PH]fԄӶ76#\$Ѕn:[19pN;lU;LG*& =ݖأs9+Β NLpM$ 5qo{wYK{Lr+hm2V:s~-OmIa;:/|vbEwjeNo]L$x5w]VSF4(5OC06շ!л0}1YV;"Ycoi lh~FX%5̷|b۠XvFȺ,(B6u}*4w4N8TlY1\l6$;ߠ5SI/')Ę)ԞHxv*JqP{^mNܦzFH`k:b}%Ϥ^<߮q1i9 q W+HD9#IHUeҴ@t644rm*oQr{QCuRZ0^k ijvru05c 71r}dVdaU$ ӡ86+ƃ?gi0 Sآ9?J|/ z;sZNˏ ;;5<ҠR9y^l-bjwqr׫Us 㚝;}XzNh>wiɍq#b]?@UNsȣUۣG:6[+ܯo}x*hmdJZ 7 )i%-˻"ӿn|B~_jGo>.Plr+M6ZV!H8IΗhDk:/J~R [ۓ<[Ε-gBѹY_M IYËF6Sх7/T_joTD!92-hډ7om:&}s,j,4-v6)Ӫ1Κh{ jk(6K)(s;QBJs5*#U%YrOݰVobG]^H2Շh zk^#;K=5Nb].#H =iu{1 k%AүNrkё2\K|J<ܫ0GI QNT{?o~t,Pjۗ,|^a_/UH~Z_!HGRxq)'AG搼LFFiREHLњd>;h>?>|mGeqmhB~竣#I$Q[fͣIn-z؜-Żߎ?`d~j38~| vV4v08;Шwlirܓ=: ZO$_^;ݳVQg-җR{b9GA_au(@BpYvt}̣'n丛dL|wH ex[s '.;xHz{qB>0U;bc2Co{4ȖprQ9'wB J܁A#HA׹XOaBtLNק'c*U` >)>8D:Z&q$,٘.)$`IG8ն䃀0Xт`3Id9_\8Am(:s ;K̭g\g"i:r]6bDdX͘s,):$)} QVCLO.}|5 7X`FsZܯqC"_b=^3aSu&ikxe5xNg}QVh,Ҕ"5YdӥKΜ.s)2xԶa=>^EOX63<^[yÑ"gOⱓh0" -cQ I,D#!d 9EQe'Ņˎ:U$%/-m"~HDvr<Ѧyk\aO"lB(KzCDN&$~T]. Qi--.V *DW1$'TY(.^5uQULJ %Mm\[>h|E5L 0@z KPsnkA´6V[jv~=KtDCHf R"Tzq\b}U?:U_wBE᧣Ƀ$ݛFn]\*77JFVn?_Byy8ow3PiL*ʏ\3[| {'{  1qhcML鄱7G1;'a ׵^2YD[/cΥ+v/m~s ~B? m\ѓH} [fj/nEI6&rrrs^ >2O_T/ns΀bj.l y-ɫ4e?W~j:4qڎ;' ۗёi;e3 gsRre';')8!sǸ0y+$86VQ *v͐ >\5t'S }Ql?P:s}N"? T[swM`}~uǾd=|rCir'_ Қ>2RuHV,ߤmxyg>wSor/dP3j9#ç3,*_y~k Ǖ_;W#nͱ5-;1{B1tfcg "BHxg9(Z#B#ᐍ!8F !$Ñu U . : V"Hܶ( gEw)+٩tn\\GyFPq:)|')C-}roo^j[ NL0^!.Jb*8ʤCbfX HŵVYCu&DI]YrAF"HQ@8'Xb? nޘjT4U4M8MS%NǧJW+ }_ !rC!âu%E% lVXAKU+,ũCa61litZFxKq%`/M xn(-b .Rp[^@,L>h)X+:ZUmD0naHIìkA3DT((FDx)\h >F0peH )mUH)UʐRR*CJeHiu!:͌,(;a*{v{ Eб P9VaeOvaSDKhɰ0Hm$KdS*m&;\إ]إ]ثh>9.;i  OqB;4#(5܇TaCM8ghrRĿKs"cgմT$^۵5Pokwr.wWCc̼ѿ` j? 
kwn,X\gn ڽ?/n7,+6>3;uK)ݮ.2vݮTԨR(4᥽kwϓH)p*̓\kh36&mLԒll!Ii8?ՌUh9;lnk$Z:/u_Z$XKqN3H "Qmގ7(<b$ՔNyX@wSwKQAƔy $bI"A"ǥ cV}iQqõ^|%V> 4)hc F:ÇJ؜=9y(Y-8fQA3d˘(LRp3)4e@S )l0ZBg ؋RKvYJJ)$XH A s6k3w_ )Tb6WT"U20LL{D@N8 ߆\\m"T6h <^W9م(]h3 )Q,8p ].k eTLytZʴi6"(ZDt|C@`YP0`&qGm0"FR*m]IQUue DTZ/׹A}_d ^j`j]ҁ"Cu*M_HXdK <.%ytix Zf#.8"@ 2Br΍c$9 O9hl@ZBeUҸhA4ĽTzdQg.LrrLn6wФJo9ٺ<_8q#l\]NT+VNn_A4*0?9.mgo@`nO(|N22"# Xªˀ;Jpֲ.grbJ@T̻f56g[rxp&gR]L%XvI.d +r6c.iȲ.g Eu9s{ޜ} UM\4Q+Re]B:֏`r*z6R8pjK JO ʑMB;׺޺]_L#NEIS% l]C1D)O+RxeoM1Iͱ@[Y`ƳY+^k &c`41o?HH)[yV%v 9a >$<9^iCw׾Ni^.\Ng$Tj{f;w3t$Fa̸ ZCTrM-"X{Kϻr00 ̛dH..F5i0Z-F9I4m̬*=3FLym3FRw7LxOs3)ifjJsZ\MIb.{+/.{{U?.z˗cj`}*J]T&ٴC.uB&q i ̶2@xl^d$ _S'-} 5c5WkϘ)„L 2cсCثμeA]6{}T5KBhғ/_EU#l (zz%ZfIC,$bX=IJ;;úkrF֮ 49[Ÿ(K09CG6) \WG;Xml15U Z٬v`qcDfl kp F_Fri*͍Q%%Fg`i/ pA RP٨O-4 k]5ޒlT‘Bio}jK3.9 ߳E>,}>^n zgڵsxh䠭8׈X$ AB#RSCB+ BSN!.B55y䩶r߫ڦ}/?L^WU{^c2-)dvEq!uƘ!gTx"^r<1Z1".汢jB=!Foh5g[ _tx,|i}CK`2HN)ɻ_7/^Bjq 譧HYqS~c1pBԔ) /4VjG ~pxg x`O@;>o͊m>&:_8܉kK<(_5c JXSgK\oV|X@%7x7WV~LQ(ZNjPZMz|Ҷ缥"`N-ƘdUU (`%wy% [+sw$PP{pK480B0"a.*9H:O+4DӍHX d2cR0.*S02L(S0V1a,PR٧,BfP0#[E# /AϿ{Ep8+8p* d3%[ӱ o.[ %՗x>\ uqf'diHqdFI3KHKvM!'E"%vQ Ȓ.oTsn7ޯ/-[iJv5 Sk-()g-M/`z;Je볻{A(߼r6X.ABOjYd 2$j.!BŽSTo-bmIʩ 5C%(,^AJ JTU":d[^_%M%d]p)z% 5} ^__Csj?MQlOU?o?=;WE#/HuӴ漅Yxxw}U@w/R68ow88a,&xs6ɵmG-+ V7l;'NJ{pxu?Moֽ:}h 01Xc,e spX:-"|&D dXcg'O6#oEρr#.@}o8蓚;5o7u;|W&R=c4!x 9Vo\Io y5X)@}pby`znkf8;Tti dHt|Q*I Zbh*@xA'X KrM>}A 3Mq$w\|zk3 3JL1p6JI !fc \٢nHM%~aS$N>2&W?IUgcx)t:zA ~[?mIiwö=9"эnHb3&EgDj%6h؂vf-FG3Ev,$ 1Muj*wK(Q>L>:/tl%PN6a`u 9_/~@S MB%f'@hQjǂ6ziw9fO0Q Rљ>WL<ڃXa ȷsƋQ2Bj@GEd6QsXЏح@ﳈ`bެzf7upVl *Wio-ȴ0B}Ae!D8p/M1t/&s5>y|%璕JU ɪM_uL|HAF){@5iqwO[Nmd4de!Zh_"LZʞP6Uz]S%`E} <Ljj*9Dzx 4KQJߖӏQm{5dizkw0ֶ᩶R#T (#*M|A-y =kݝ !u ) C<Jȝ+>Ua aFVzWxJ;MFf/@Hi1ȴs+e{GEvF>iHl827?L+։+KT_3/ P/~#%spV6!"f..o;K&(YqH"4N2QH o$-x԰FXKW-`iސ'f]Q>۹wXA~}f~:]Ji x~hfsxxO ΄'8}$I=D1;4j{=iuй`2c|! 
:+YB)@p6354I[a#NK_2wm6$#ZECЁD}VW?]iA:sDh*@32QNA'ңFW@M2;h_LztJ^I/]80P>Fa=7a#B^f I,`q}G=yAÇ 8;5 ;gYrҀ95y7 ȰRFb=逩/>2O:P<>KPEÑK3G`si1+o3,3}xXɲg>:4bD8*eOkg9W`qV_ɸ}~eWzg/дq;.cJv=du3:_"e0QŻwy?⽎#}CBՋ~򫭻__mR^l<~ b~r55_'޴wsiC %opƟO>2Ջ)zJ~nK'xtWVZCuO#`~Mω#I6WonH[Ø5/y?f&=v s HO6(GW񼤋UUG\~|zs͕~> _SwtE0?Ɔ빬e2ð~tᗣKyh{|oac_/DTGq}w|uv_z7\_}~8\oȗCEbRsp,ɔ~;s$Tj*e \E}f\bࣿvknnk}vz^n׼~<&kYe1?"2K>;ẓ7C7wS5:rb;nI=fc'Նedo[gO\%޽?)F e^,@m w4?Jm[NlOJa,^` ' 4X0^w ܚ ,[HeKЛ.ڂ!OZ]d}Tk!WJA^JIK͟ogrq "ܛ.w17޶-lbpz}04FKhgݺE;,~\*=MڟvD4MhgVBwc,/KTV"qЫ:4LMBD)!^Ԋd^)k-֮)wH{NAd5*kyS3ۏ@dA(o֪ju-|}-H3TU}WW#0P:amQ"-;\jxo'wĴHnhg>ͽ|3Ź-Wg"_#;kEN~1~`*)Ov"7L]|m85vDBP\;[@.=<6DUA86]?t?T>NR=&{Z=1}.Kڅ$9-l"2F3]<~1 .vǕGեRw]`3g|g&a-5:tp77]4RUq#;ΩN,S;.( p:ܑKEn2UYOnQŞv Oca1pWt8_7Hjx.\6 6Ƌ'Գ~[uў Ro!qeNgb$NrrZ.^$4∗vB$M,6ȼE&y;JД´)z@"P\Զ'6ϗyy͕`ETӜuIB;N:u+1G ))~sԭ\LJ\A*.tQOȥ?ik&s?M?FGuM46 JHޑG/&F^!sWZE 48 Պ4Fqhy x^dV$W=!{d⼡/K[J"ڔ6vD7r M&^ҫ㕁21NѼP`N1K dQ(NL.vc(4~Wt<)(6qDѾϦo>[̅؅Ty L]Nx!VI;6Oy'9"uk hڈ"ܑT`T>?'Jiy8dI+$uYNWt}?Sb(J/A#5c]"*5%G}Mݘu {^txT/mV= YQB+[/ƂqJk,ʜ-l`ښ4R!%}#% l ޡ/ج)%g}D7'ٟ" wMJ.mZцAWKb݁j!\ْ^ {}7Hn,qkM1n, mK.*Ej3ے*BSkkচǖ+{Tl);f 5{3yVNڊ6 hΡC"GL{RPH)+L .HFPHW pWL+~{jzǜ݇,ar<-J"wy @\iJ [b=S410_F Pi`L!Er{2kw>汯C*O^r =C({~|7ND&.P$nk4߄M)0dq(D '^n_M^ :!8 ұYAa#oŋ`06Ĺi0Hy(Β+&޻[R5}F@L6o9a..LrW|2|+F>+ϘPjS2#GS <wT8 7*fZ׸y8U"3 EYl?l.xEGr0Eg/QS#G`c@3~]P3[`^pG֢s6}cj*9FĽzlz0uϹj$Q-W(1xOSM#Dp%x'(uBJMkƝ<տ/e[K_Lv")<4}YY}!_?ZUK/U{Y3D\RJB ,HC\ 'ʯ)u^_kg~Ywʾ]u}m z?{ojwU߶g^moC^ݹ咫|_ubvT7BկSvc̴GdQ5w 8: =QގVdixuDbI2o߶695+ *@1- W.*>c8\*F %H)C$ȉ2;^ ?mYS{V!k-O7Ɵi1EWe/_gG1bRiP]m`6WQ8^{J|"sφcIOGniBKqz eLK'P)שTERl,JR 8;SKzn6lp;e.ѵz(:YBiLtEP k.՗cf /YMe,Pwrf:ƙRLXi Y. 
.KDp-z/^ `k7IjWD2&x.4S_3!šm A3g>3>slU>sKe`Kq,DRg{K'h{d'84)"PD)v~~2>ڙ /gTI`@9.0&(&w t,3}2TP&evw!k-&J)XXVيJI|'{&Ndse{"M q_.hOIJR u,xR9ZbjU*τq4CeJkHiWQwuRAʵf@a w۞ 8P2c!8 tziQrSG>:I4׳׎#Hl܇Ų`HNH=[r6f[i = 1}^|Ύ$* l%"|,S鲐q3ċv2R$\;rGRʯq~{lwvS]oCL=$kWM%5wGXWS>N7P=8rgIT\bkWk* BfEK;'r跘G~A@2 \DԦ}A}$py3E#SrMAJNs/5m#WmrN揓?7鬐w.cl.zǧM=v_(ɰNw5d$/\ZNi}zV`5NqN^BA(&]SUuzrt{|n7b5M#P`yٗB=W5ڟ^?@#+6Xsg6 B9d^fXij%t[}ݩbO݁ Lr!"u÷™2C)IiVZ gr]*5!#˥7i_?#~Viq]9rV d3HVǠXk4A-˨dIF!"m:rֹPau"m'2:{ EW Ъ-E4~rqA_WlKGy}U[„} ֶgy+Zy1Իj}BT6{cpA"lpc%65Ժ#ǏqȽf>J34!ˁt" rg"bW? oK0{>˹?v+-;l{C?):tgs3:v+ EुJ\ҙ l8|6sQ`ä}6cvE1$3XhbC1>! u$ĪX+‰,&aIhw`u1N06itgҏ(`X{8 - p==c{8)u, Ч7dӧp!xX՟788o]nfWd>o߼>>y;4C`'/Fp?ju( >eYyppzFCku߁M˔/??gw="nWmd[HV:8m+ pX;Zrmn_\gnǪ\_y߃ 9ϮeYzj&Ӈi#_\W|N|'"VF6pg; 69vÁTxܖ(!W7dʎk$=1SCB<=b8Qhó+.+d0΄QO4-P D3b"ƽ!>[:):~\mRm<6U^S2;0YC)% Rk'@Ֆ7%P\j.9 p>n;pā|"b>!5?xIvU( JNj(WD)]9э޸^¾w+#)Nbco#!8>摤ڋẃ?$ Q۟ ouO>IPe׾^#B4 =z5 - 77 v ha#v5bv́/ *0ghfx„ؘ`q0td; HٴPn~8P7 I[eS:^tM|~Æ6O$6 #.dzZ{|PxĸN)b{96ʻ  sTy+hrx9WpixE}b@ $IXE 1_my v"Dm@Yq 'pO&z8BװU8?U4a3%YcDBJ"퍎2҆:c< R.lXIŜ&9EEΡKJ"k,0ԑtL63LHu"f)+~Fü:zgM7a'yP?^O?D#Hr ֏Ӟ:;ov˃._2 Z)4|駇Q Oo_n2՝Lu'!SITwRT7yuKhIb8%"N&PLj:ĆBFRJGR#OnlЗUdB)7.s[!48WgW| 7ָ f_m\ݛ: #*;g yTvO<">&"jiQo߼z(yǟltÈ˼˽{G{*LG8yW"icN0@?.j+9BU($3+02"w (NxvG8Fj!/w#&m5p&̀Y$Qb:Q jF'<4;4Bk WScuBD:ñcu&*lb FY*R1xy#I,AL0 2aBJqu\Ж&.4Y92). 
r*fofAb6"G -GEqHhm["Y\9"1k  ={[0H L 1Bogֶ?E-VSeEx/1+ް&[ n[n1F]YțkW7c2.ݪvڞ18.*dn[.67Βj xyXYI23#13C&;wWS0^Gm|o$#ps/w]tFHa9PqeBĭ:SW3PT}oc+[Hk)@Dl-a;c#q6<|| vL Z nZѤ.4ʐhwR֍#uݿ`4gZq_q;j-?wyExEMxw*fkvF|[^e ^:jm wW wKn3--c,m_:rq<@[0MsG|@8a#(ng]k30* 2P~X؃XGt*A?-er^lIKvT'4mFPgS{٤-8qE9+?|rU\+7}SWnר`OOt6qaJ]z`% pj҇zk-DBTRZ G4Ң :{¤"KVH'֜pנrBs]ݢ8ڋ8ڋ[+xl1*wbIJTwިpB`wjxB7 0սBg޿SPg߮3Q6Rn Uc8Ɲq෪ 6;ט`ikLRB#b)4'tzX 펪jL(&Htnshߪ //r!ht.{iG9uˠFK+1\ **uR|_퓵 -8fIz6TCW<>߸2.iӗn3)if6I(Z?{*~1_CV`ҁ:+յXddGאָoSC]zN'w C5St,V5R lD|ہ:>TLJPSVS90zA,I{+ 8/@ ĨyCV5f `rԎNY`S>{R8VJk!}qnrLm8uZҝ-Ucg(2bޗ:z-=BZӁ h(9zD@KEI1ho *6:" :4Q4e+:z--,[+"3eT(6L6 Y TjsU`bT ɃR0 ='Cևpu.k^ń2(_:vWv(/Ak)Fv%eO,JPzQ)MqԤeG~~C$n-]RF)B/O1Lܱ-ZǎuƁێ-#lyyhyrG4k\"Q 9^Lxe*#m,whl]mo <Ǖ6iEecg/:@wE'`ºT&ZS,nce[T7)/I+k i^91i&~zrS ˬn"2uֈԱ8T?)1A7B&0Zp,*l^mq$6ʓ7ESa ʡЎCͬZEhbw]V:le#d`yn,w 24?x3wf:}u)?(:Цy+K,R'F(8PR`vSٰv~k`[$v4#,֪&]>E%Dc7$B^ם;oG;JM10%5Aa[ddk@C`uVf9(d͊4슥>{%D4?H^yIˑYDNCL1(JL|&^OG SJci`xZQW{YY}nKXGdz3#^8\6ê1vBbj YQ!y#3#4~h: ??1c&&f#Zņ}HuyAQ;~}A[/I*glQ"c`b1Y[J^%xO48D?Ȁː+[nΩO/ ͱ#uL,AV+ֶ(6{&قS.& .!A$X(D9 J{ck(N^ G \)DQ$YAqlfM瘳v$k "Fjo2y_R%(?rTk-PʮW3FrS[qWup!u|o|Ÿ~K%r*g̏R u]//yID^__};R,14ǾϿ}q%v?ÄBj0Ƽ%{y#gXxUcLÍc f[?nߠ]'" ,fΡMڥgm?LZG¢h6UB҆&BY%abmu`ie0V`6oJơSrYZinkAfOiI[A`k77{vޤ%cn9 s !f:]e^$$a}wкon%u@s{3;^ZHa,nYRA.a[]N.xXl: K08@guDkk9),/|y% ݞAb*ݦ5n*`Nt1HGeQZ$ϞYN2dBNs(b5=$3qzyg^e/0Q$Z &@|FdA`*ZeDc9'kE &%-`m*9"]V!ugSĈɘ6gLGonϘ"zt]!']ɌP$hT xI|bf`I;'bqs(-Dd=qi̵ {_P &naXf),zR<8eX)`HAđ5ƯԼƑ.MJ 63 !ǔƺA3VI5Ҝ9-ꘄp@k)@2܋`9͖a,R)4 0:&Pr 1:)5^{zȐX^sAVgCYOGfm:vf::mل@Z-b81龁7hM2,4[Sﶯr!ڞdaU]d-Al7XXǫ*ke-$ͼkŸ́[T_YrCM ޾AIe,pƞί?e} qM;s̵$ŧ,X4|P" l%eA !G!Y n19N\S. 
)3aU,*:!c}8[սP(㐣-}xqJPٖd R,ȡQ.xï1[Ժ4A8u-4+(.ĒM Ebܓ ́->kcL׊kZ[ȷ5BZvB/ٷ7 }&ieZȁ[(ahެifB͘(W԰@a޾AszZmǓx#lkh ܍M"cRUi5ɾ@S] RJ5`4g--gta$j/K]kRfsCWq^)CB̅-QX0.{ )oX)o*;3af17n๡1RJ'ƃc!gD,VLؠAHVQQ ldZIsFmx@#61fs[`/sºvuoxZ?cKSϿq/T+K=d!Bz_m7\a?h?`sC_ox:ݳt?\|=JV{Rf/[i1żW_FQ9 nxޏ춓v1`[ծ^)ӯAsϲ.׸λ:R-`C4@-t6xՕ ߮Xlڎq{0ƋU!ioߤ-8zE7eL VJ}v Ձr<\'6QlgkߏR_u5prUu y԰ǭ p=^MyUpg^Oș}~"kŌ{vK9)=++9>Mf_Al&f8 ͔wuK!\}fVA$䉋h:ulk7U(uBVѩ2tdi nuH lX"^,=HYy&Pie/co+$I<{3g3y6gW?FvaԳ+LcPMkj_gҦ!8ΑJy>H>_E'eˑafFqJM|Q^nu3f-ǯ vr=]oy\Iś@17-Vh\~*VGY%`|T|%@x?l+*P̭D$,5O7VwE>T~M oxFἏ^G%AI&-QQ(5b)-|GO҅87a(b7II$QŌE/RW|!q0;/-ul,| yǷDy1:>ÙhcuZE;Tg4Lc|eEv^d,k͔~RG |ɝPi%),qe-vhVG zA҈Ͷm݉ bȾ,N'ڤ yDžOKA٪u4lڤ*^*蓤ńӨI'c}ƚ ^M@.0Fɉ=!g^+WY/qW8PDPS(^ǹy?J#xcADUCbf3 N8 ~|THH>#S`4 BR*bc&SR4hvt*J4:!p!XVUR#[xD+ k炢Z#-ۺU8"hVu˄ék^tMaG4wnG#2d)uòI*8QM\}+8TV\LQȬCO(lZLC՛ck2O/Zէ[JPRMXo^+ ,[̙`!꘯ǙPbX5[,X- 6.S#bYI#xKA@vieGUD'*(hpbD%Ȃ_D 1OcD1c& \(ʉ&R57 ;W,)QC!>f:DcwsMcgӔi*2apN CQJ,Kk%ISE7Lu BTJ vAmyNr0:º!b-{*(Ja 7N"lrF8)^Ekn1r0`a PWTm.*q@x)5Eq-kwд)~ᇜ!E{q!`*B g_? C?BJ4fc^ ܚ-?%QdF yI@w~!\E fq~XU/{:e 첋8bzV|:^ rtC_z1 aP'a^$[3K!+XSiߤm][ ƚߟ0-Kɩ<ߛMwȘ6IY>o;˜vTt+Ʉ&k 'G_"ˈ^[XY#'>j%g}$oze[mȆ Zֲ4{yn'k 9}4MG7t""B$*p8? 
c!WLܽ|p9o;ۋ؟z9э_ihFR}CF1mE@RݧZܴMrⳖY h&w4)yG{/T&"}b4]aNXԚmgb&v+G,BG|НO8RFbVLKXI_>XTQ4:/YPpZ TZ.1`'&7[|=b`P;-KixB*^~ x#fUµ-ɔ.QĖʴZdDҟ8Schr1cshQe{sd͜6kWSk/ bCK mb67  &0ME寛G)Nɩ3{Nᑗ؞X4Ĉוu= MqzޱuX.p#^*fzо%9 (⪸TCWtIn9*n8Kp JޘkV`[@ UӒ<[*R[(il|Rpi$.0iQ)^,pՌaY û6[íM)".2Q`p :oU3*TJ}p5@J珓޼Nqϧ&⯽.|9yy$ӝ('lwz1sEj+lB|s -=W4i2E`k |B tü[ws>F'pmR[CGcEׯξtA);*Pv`4N{ܧ8Qo߅'@{= @/QYJm1Zۘưi%KA_*lKIu$J}h!GH.,|mE| %bmV=z:LIN\tLߏ QHHh{8xUC萲ďaWZaЛGSO~/AhyYCtkϒGy&3V'c쓤8kwvfrhny0s]v޺*;pޜleFhN[&2WᨩVX I6Sjy< Sl6WiÈ7WvQ!]4rfnUU'EfB8фiJyB`9o="@tdj81½a2$f)*NF8aZ ؑX PMcm93HlVȧn kb,2rNpװLfN"--xewIb-5qOբIxg]|O;xee"/žxR)5j!(Ħ3THZiITI#2ix01a cݚ[tȭ ~ofmaj c^H:r$_LQےRޏd`fqu?u5IFz?( +ĴR;/gam|OJr*>j\QQ.D{ceeԏPs˃Ut^I:#giЭ'vW>/ 0ƞJHyYusV09U^z>f11bN,,5iRP)1rƦcb!d`pHgȑ"EC> %/wj[{zYagFRKC6==2 ?4WEX,~QHOE6jԥ/*b.}}AESqaDb  &xnd6EHR{l6Ij F$5$5OQuc4= 1@eԐ2h=}ohFdmA̝iN?Y1I&EqkAAFۄT;kAXb-Txz<؇b\ [+$oJ_TM(}yoS#ӈr@_Òd"$:ʍvI% XcD`8rcZ؜9Z2qa2ʂ&E ,Kۛ:\+">R7.KcP%@rSXIp\ bVzz]6I[! X%*ZehD9H8Y_V EһV"4Vh[g[f[@r,K:@q=hMmR d)vlm-t7ҳm_Ѷ׶E#$^VDfڶrommu1_dR58rG͔,rt l,rFa xjMܹ < #xiVM܅M,PrIJDcgQ; K \p::7 w*$JW$LEH+p;LUJ "j([DNp3ny4/ڄk~cWb(d`I:- uuky;թs%VDF{z>%DžMKo0<{A?d]gf )&blђ䳘A3 Y}4CwKi7 /"A.PLhۧ[m0RԈpXI@JvQݫ zӦ~>$ǧ]4K52c9<Z&5=9O0} mcoJIPQӵ{ڋ֣+sҵ34lhlFGa1F%#EN3kmqʣVmS}nG Ky^pFw٫ .+6N,ǓCmARV=}$;Y8mPI>j,5GB-x9\1Œ!RRWj8LVy#kYV[C1( Ac!!/z? 8Lc9炀E ]5}4Pq %\X5ܹ%N ̺UH,;Lt{Pg#&8C`~P$ASܻ|,/D*-V:)@| kŽ+f wnEDGLYxһUfʼ tz2i4lΒ/6off0y.]cOzD I`.ayA`eni0c6 gZ3~m,{zK c`uB '\䋷pCd:0*^ς@vRʧ|cƅ+̩3L־ PCPb8)5NpeхP㉆b%[-p0P+R0S>S*FȔdR#Y7`^HZic0 ÏS6Xn_"cEP PQ^I4l>m+Ѱ=S2g,`m\& p0!,ksCQk`MΪ$oXd9;ƘCS*qF_n0\X\f``8F(`VBY9Kb+]GxO\S ք@<|b-jqA, AcjYֳ-Byu"23(,'YH6Nõ`[,BƁMԈyGܠV,\^1.M0ZjE"ukDDd8Ru`723 Ɣ:Z+ N²Q0TXxصR_j@u(m^^+IL^y`[^\^C "z3Fװ>9wymC(KGa%d,ϤB]n}Hj ̍P%2\2ĤܚJn]jLWvuqUeyeI4 1-ycNc9uC?E]i~‡Ȕpk?-௧BbDy| O{:픚y fI,*<_ _a\?y,s1^Xϒ6(KD帒MȦ9N[))SgBJF݊;:հ? bREsa:7{r !*YIȧ(z/9TqSLiL6H=eFR $̓tH8mO 8GbʶШw/H2U'#ߴ^"c C@[Bk"#Ek\Z0CW>gh,y!#? 
eLUw4)!?pH8r?u ܨNQ.p^\q0zvpo|şV=L&Ե4ЧXϚYXչu^dʧT O!k1]~;xW6UD:W*D-sZF*:*`E/ADZQroUT{뎩L+tdLCCY ]1=+lt+m2k&3LVOi==i[EctB|o-b>P" h,ӥZJD|6^rF|;+DWݠ^P8ZzgsqsllaF{؋1ArM\3mPuΔS Vq5GW(@b1qma3*O?>}P<>֏ɷ$xq)&Է.J n5Te!~`БJVNΪSDu;%UuʾN̜v憩:@)% >8 wC޼}inn?MtAL1#b3%+}Gb)|gQ-ehI ?pcWJ%m6 }$Vn5" 5ucȌS*R& ~-8mps̤Y6s 4#1$$pg{9:wY8{wx;E,Y+.5g5QW\.aM9[t蒦|㢦5_sof?yk,{P~)0E+#r/Mײ+\ˮ kwr/0ҀO? Ɉ>}NGG;FT/T> Bl-|;)qz%zEXܯKuFy' /pPɅ6 j`rD\W(hkL85NÌrmK(]E3| VKJxVs1VOiw-MĖzwvIϳ]6 /YM4QcTh%9mp˖\ vtԜ%sɻX }^|IO>,ʶGF՚qnY]SX~_f,rٜg]+ܒ2%pm tDQvANH%g,YN gG? W Bd2*i2C)URTHt< 9zztenVuFQh%$8I3 Oaǽ΀NgiŽ+AbHKxY~?i%9\4 0¹3y#,ii83 Ә5JbܴNխRF\>̾^ݿ=}nz_ӱxߟ4'R rv]4'/'>F/doUhpNoff_Y ݾа(*Qm'b76 U[.3[)S uMCٻ6dW޻~?^ ubv6bC̈́iR5o EHJ}%)Vw6!wO=m!CQjV˧?=O󢪒>Xf:!wQ, =k$1wAaR4ttk̢4P4Dŵ1(eJ Y49Z ;mtk?C`pz(ʶ6r_6Ƌq],WYTH=^FsR 1D) x'Á=7}LyKLӄQi* 㥍@…0-Xf`Aaq,⒙0Oݙާdy:"BHvy>ѷ /:w7doGvߣ>pi>e*68g! hOn Y7}o`&EbR!)gBhhE r;SLSƝ''@<a"Ӗ/MʨSso6Yli}༣ik㌔"ufg'vpqbǓw{r:9f 3=u4Th2xoӋ+ |!Ϟ=>_~qt6 ASO/KYOmO_j__|8=2|LB:8lp>l0M[t^_.|N:3Yozp x9N;Q/r,Sӓ0s'9ե<ͻl>>1tpHVZO}w2̿v-b48`ئ/?qkTc8spOIvT1d6<W}X,O3%,+fׄ~9~vm]]\z6}r09\d-ͅdv[/ s6_D Ӑ:̲>u҅Op0cwϿ{ y1\ >$4/?a~]! K'4>G-Zy:{~y&BŁ42 ?g$_w!YDRT>OGW_̀hӐ~n`bL|6OyH0kΉL=dscSYjjM~Ba+^&lbߗud>x785O'*&@eEi3?ՙH=G?1|{SqͣHGYi}hš_G2\.\2f}B +o'DU)_ts/XJKZ"LZRKtt*D *#Ę6m-Jk,J/la*MцYA`'SfH)7*%DhpI# K U| =FǘRx4Ϋ"Wo޼ :IЉ+ >Eo82L)=2}Sπ1eFz,w qJfL' 'W .U #) 04c 5j&CʜY)AphRdSw8nyMʆAgaJQuNpL"@(FF ֜FiM_V>`0Љtzaw*rQ&P̕m`V f9d*A7E-t#V! LrEY] H)K2#2`kŢ0xY;}UC+` AnZ0X+ a "R-& ks*:3T읓z;޴6uT] ѠR+7ZEF0Vh#hGθag,85%v %b Ӑ)ox[j#YC8BZnuYW u#/Hϱ̤Y٢t┫-Lg Ŗ뺑=`"9k`] )YA~:! 
͉ \PB"{OniuUieP%KLGwD걥*`\RTkcw"UK),w*!ZJl.BU3G ZU3^\[Z@kն}c&&uI5iU;b:.V^Tm3{g;+g WlׇkU2<ͭʽ\]߳gz{5ƶǙ(nF^~B+usׅ>Q eZpC7N \#6Lowȇe[(YhgjFA D˭фN$[-&xUHkJ7k^:liG=XkMl㔮˞ݹ^WbV_iF;֋=k`5e;}ۚK(88NWGG)ż{Jin-'|ژ̔a1 ř&rYwu:W &ERaRا YVƵ޵gQ,W0H1tO\d>:4*G%kqyөƴyۛw^i!{d 7bXG/ʢM#-_;u4CGٸN=O_{6Lg!߯GYgi4xV'1C`ǣD#0fJ 0^j2Ga)y1;E_N) uTӣWY#,JgY4~1<F 3_%%uY)L}BY)MB=g'+FY1aÕk&V7)6 0:**BTzCdxjF&y5DFgxGV$ʤ`uHl'" VSVZIA|zY:W_*Uug,P住9o: pNE$%!HGCMN u fEP@2Q/MPl]Cu #bbnL FZV= 8}LP9 "(=x"Va2i_]sgYJP,ϔ/ m؎WO1 +X$bE:PāǀuGw9nFFKFuQ]^PB!v+GoWՁm"t.WmU_܄{qكL]&)b]XĢ; ;Y#v.-BEng+*TI"F0Qޕ5q$鿂˄cEi$oLaO0(L&oFu7ͮ/¬f'k;ӞZ4C;h|HN~F9wBZ;z]|9>Ҝ^rB+G :QI Rx*ƒkl13Q@yVtBcZ;M1%ZԆ ]pBbȵ )9^r}1“sN0iͳ>3^%G7VjDh"ʨ,Ba&$e gd PD_3nz)ySuT gK+DښkWj&l dT5pIfC?ɉ] NQ ![ޟh)v*8Lsk5_ MU;|#|al5"0*:Cwn/~">-Ag"_ڞ!!W8ޖ1^JDIOXEN,hĝ )OZu2-)xO)|Ǯ"(][. Z (}ZB-bSiKV.^U\i#eNi{j *S(Qj<.;""m"zZ*1g)sB"soPr33Vw]2GK;Q0 ΗnfkRH} !JszSRAB*r-+Iš堼9Վ -9EUNU 1Uid=y]QK9v{OB_};Ź-c|?D/nhة_oʜvV()}|J .`'mZ8 _11)2BZ6v:r^yYGs0ʐ*f|+BhufXMo\-BcX~&`_0Koҷt+4Ӡi&瓛k~lLCnW˴j\U2UcO*VR22`$QJKKEyɕ#H0M*5݌^$-e]d1bE}el4`򟦳1G6?G׾x!,|<6XL .S``.{= raj3T 8Zf QGI (8ʉ +$60G@EQ:6/5|mWbf jHhJ?M\T6q&.eu C E2*JblCZ#-%E! XYKaiX 0%K'30z 5-^L.S%u5Y&1/҆01*hJ%Da`s R5HI8q|"~żߪ@Ω>av)`@qr[%HV21H@pʜ9P~ ![3ƨ^mT1gf\Fl @D`J /CAmiY.'o;0NP Zbc$FFJ Ȭf{DpX .6p n;q6.eڸomq DTT2#LIhpP'+vqNm ieDp\lwt"wt zERG ^Kd\sirͽ D0y !bA#\EQQ$L|wi 3aQ{_U6Pp>z~؟n"4Gbv#ݯ D aQw(4V[mx>N8(_s}s+zֶٌ #z ͗(OɃP \!Q"ʒ@'kqDj_޼ڂ{UNPtZEk@٭8HZz-as` ё;O.0h! KxMDD vWF@5e8—KX qX,g rb[5M N *2(Z*[7@}E^K]P eHA9ɭVhV@m@Z f> "谑 ec e)ZF"wU$p{9v6ۮ&7%  {⌀CXn9s9~2]8b?>†M.UL $n/ $bpaP@L?,<y,R˵s@L<-@ay@g˂fM/%LH@Vp%uk + 2ʩ2ä́:HaH ʰ oY,:Uyo֨yͻ9vgqR-Qkީt?'7()>.Fw7a֯LJyts[1ZznR?C ^0 Œt8~ T͕f>`x<?zx0CKO|*LkgZo\DdJ&Y7X[)9S:Fvu^K{n͵nmH7.{˔bձ mOǃ9T;Fj]ǩ?Xm՗dA{U¿Xߠ. 
-c r]룔|v- [ +xs1\m.dzӋME~ùs2*oQp?/\|n?X<\3kM'7|+άŸ`Fb Oy6>a!$pƶr8 uGڝ[9FoL)^xk)8-Y<]*KaC8d&7T݀JjPT@TtN6^7D`a?ꜙTL.w?{8/{R5}9 wv^ 5m/spRжdSKF_G*GŪl> <}4REׄ*`Mia)5K]]4ܛY=2AEk}Rt;r`0ZJ >;5@ܝP>\zB-iu j%{ `C~q0pCG"Һ<ݥcO>824()e[^6M( 1.Z9fVl>Bt npU'8 >yӂHп.(%33HB$$_ Hbsbfy0TWb?ʘ~j hv OZ .olQj!ҽ)„Mg~ӱ|e(sP9'-=(P1zTIzpFUFͣ7kL* VXRS`%T.1BʠPP4[+,o9Dpʓ[4c1ruqϾاYS>;j]mL'ɴrp)\?|xw-\Φշ#jUw{wKإ V, ãQJyBan]PS+SK,qN跜GAb\ pORT {Jq_IcYFԵdz@^lx(ӷ~3Xq}/~1Dsk/F :MJm&QBKTa*PF 'X@a %ǂjř $ʹ̑3MJ+bN[UuX iKu:3S1C8$C1R,E*B$$XF,SXbIPKyuVUӺ}ݬ<QҀF.׋NPD$ba\ $ҐEX3b,$4K)pRESbj(O/^jp{hyVŝ9s8L[rВS^lɩͯ B7̟)lygZk8 FJ': ް]bWN֚;pz,xh"JE:s`6Đ]-:RP뢍->˺hlcjFm*.`]n9iu??[)U9pDs|}+њocc -_V G|}UէA"{Ҿ_@(1J:U|}M斯ZGVǪF;VM'׮|jA0:[)G(iK(IX/Z10aRX8 1PJUD""}昡D0!T!bajJSV f9zaF9]>ϵ+=mj\_ (Ho΢j_|!Y"ߧ#8B9e+[B(ῒ1ꐆ#]yڨ!6lz?L;$[#O}9oo>o|I|O܍ ^%}wWm=8)[c~gSr]KhuZǽЫx9}ZU8#Qg2>魖t<8)אSQE=Ɛ% DyTg_Z#ɯ\9Akg)ETB'#E&f@;dKN$ 88@F 2 I"NTq%оpؼbn+SDX+4n4mK]27HQs4_$oZuQt;$CwvsP,bUxx),ˬPUYT=^cUڧY%>P-Z7%*ҎW嬼-OVy_z'H*at:{k.T#F0k;اD~|VK80:Zf9NC\97["7f1⿡y s3.R]F\%~o-|ٞ*:]RW"1 t/N+:'+c0R }j. )%eU.uТ\y>F)O,n<dΨb@LR7yqE LA֒gs *}3Ȓrw' J%>Je_otU{ϯS¼vr>0d-Ձw.R#QڥT`UZyMi㰑{_(pRXdGn'fRr,!h?D(yG!$)$ Y02}(+l='2 O!h}>1ڃ *? &hʟH60ɓqV$g։KPQܘKfu )d<@:ga5nEUпĜy;ɩc39U}@q QXǠio1g@wmH1\ w (KU(i+;kQdF"V:+~ti̵jxV2WX 1LQVxF9?077xR=HB/=rGҾw_ҟ<׫~wpAjl*Qߪʸ~ Ý^zou;]eZO8/\{"?.Du,dIv3<[k|?gd#OCNgt~z^xZ[i[L R(y{?Il&xKlmF*UNvpTQIX/ ,GOʛJbg>ĎSm[d_},8ƭtԪ 2c LFk'^pD9WGא];cɺ87*$-GO-Ub2b@F#QT*Mo!Bh{|f-.Bp*y[ P9,7b"2h]ݝ ֕l I@H-9i* ͦqpFaJ.8DѰqZSOmV482вe"IxH*B(B&/6X XZ]`O9JwBGR7rA] lX!s6+ oĀIyw LZ~ kBuwA 1WCXj%\L 7j(8E$r.YcC&0b#B.߹YBsy&],Qgn16rBmv(|Xyyb}k5~CYkpQFy; HFʍǛzWB:gOaP X<;BBBWb{k(kT2Fࡀ6L92F4 YܵsJ(lj) &UD\wkNfЂw-mޝg5wŰjsCSƤJԉ\ehTXV/#mowr]w0|' 㑸"/v0z!teIFjPXghm΀C=|neOD07sgb\?u'fYIE޻Cv%].osĹ3[$PłRiD"AHdR!c*ɔ*Y}Q-/zK*(aeԜweqI! 
Ic'B2׼M`Fvu7/vVg]mC2U_/xxV}vz&2U kqQVVP9"fY=f؇z>y 1q+5Ũ-1(dIXri@AFzED#|%"_0vr0%0f9 fD[uˠb2ZHj*NM 6ڑT(օ☳O`05-\u6ɂrC9`ׯ^c=o3tt'r[63V`Jc%,Zzl X`"2BXkdF8JB90NdL 籉xD4I7H`ш,8r'bŠu :g#5S %҆4&Z`j]:盰>^%eֈ y$̀>&l q:1DPpR)MdVX3Xu&re ae(%bRZ51#+I|FHDS&5W3O(%i-6[_R*_R~)9nM 1l[ aȃ1 a3\N‹5J!'/VOiAӡq#?}LoaIݭ<h gU^W_ r8bL?G޽]}>k [½O ^w}w1#֓o973ZK|.f)!z_;Wf?v ҷ770>$H{qʳ닿aB9h&"}'V8{? "윤fn,F<fYC09O KX@"rn!rʀN4G!:Hnpkl%Fh="Z\)G("XIř`QknVt'o<@L$uҨd왒J+ ZQF˹Bd) Op*&Uj[x.ޔМ&YZG#,8v)z6_t򱪦 ?kS[ 1U #&KD؞߶[:R< }o7RgGO@cks*gZ?uu.h/Ί`T繯vç m|Ҷ&d+sכU=H*9_wgx 쫛m:q~˛1{P~ڗs# 8ʙ@ody|sLZO[& NIAfm i[yQ>a7yq'Bc2P>ȵ烳kt).ȧ͘ym[DKP93Rs^|Y?0Ip{wLrYX$ȍ MG`ax^sBkٽb$.|uہnonnUY2||SqVf |L'R?W9S8BWp-m)=!gԣ{y s˽"5hJѱ7q\FЅ_ťr.|J,E "Wʯ>.2˹;8}:F  ɷsXSݵʔP-*(eݺd$ *2DE߄3*x8QwKՍS Q.*ol1*Yn?. {wikRctn8Q֭jC1S&" NP`+)QHLWT#bD #'8 w3kqZL=['D+멚5 B쨍(ŰHx/9B ǘ#ȡtS^h I}qRFjy\9PEXrqYg"3ZjDա{w[TspO CGqNYڈI,߹rU2^7On<˪VOb ~޾\Zҽ/JQS&:=߆W`rFnZOhY~<٣'*A9$X"HqYKO\ה׍EH:ͺ/%Ts؟I:Hy'n<|aX-ںbE*A7Lj-n9 RGGg i|hкLqo((N)'}%,4]u DrYo[KG*4 Uhuh8bQ &1|dA'旵Ǻ=P~ۦZcΧ_Ai;폷 <% _~߅EH(mt?VA蔾uiN-MŬ4S_x*V%7+fc4;+"sA0 I%=q%,sqW}]Rc-b]$]`nS Q#uu <fWu/%fޏv|!0"`br{,WKjWiz|w5Ǣ{n9O84y9yya#/<)&``YD :a I8 R8͔!B1ЮmDG ]I\ۭ=ϾtG كQG燇;b9 kmlugZ&]^ɬ31l!q"d#r^GIS2 4E 8f_Vŗ(bt;U5-qaY\GC.q0Jg UC6pPJ%8g:,=QZ#>> LhZx0{ q#<*SxTxλfwK_)w^3njl Ay3Zxa{[wT zu~;Iѝ(_VM+nbމ{)7%.m`㘣N$e(S;t"9K=~7+W\^TC#c Yqr$E]k4eHV\WNhE%Ɗ{3.U"Ȁ2_iX"+?񩡣K-+5|"uZmeTcF͜d 8<7 M*}Ch;dՆB,pfy\H$(\Ax7$(8c[A3jkfs_*9'Fcի"^ywzU\{`Ul1݇W9`9l`7 [P=lZprlUB: "M{<?[x׏2-õWTЭ}p{WyٓFY tYޕn^F_Ct Qr;wѓƘ׺Q`3#lqq,GlSH '"YuGHhIƴ9խ5e)$E)ˤq Kߟ &4Y3!N[ٟ/rǛs(ߦb\y()/^h$ҌvǍ24"xǔmdJ69 =6yj&¬7)iCMZ|k:xDŽ!ޅUJ.k/kf >Tt*x"i+psbO^=YytOO^^20/k2:ގG NO;78 xy5]sNi4%ŃsPD'㊃lz18^=F<58.V1k}kchOc޻>;:Ȏzsaj8Ӝ`˟ףg~2z8Az~~2FHL*:khdbsQYCxB`%[oȽ]YҍЦ>pZ4ϨA,;]~vi~yj=f=|hnQL&b' 76cƧ1\B\!Jnpvy&KgW29݈ϦkR6cNK")Q&" Pg5JRp"fAQM0yBD;$s0,jnmTwݧj|w3BsZJC1!Im UbuVFC=ܺ2\gP `ڄ=/?N',y&߾ƣsQ׷7<~: kXfƳ;:aYz :nHks> h}x0Wpo6=kbnG"+aVDԻ0|u[$o jl|~82C4ڛt]S;flƌ!eBjFdv="Ni,~\vILWZ\T *#VT!Δ.o.JXQ5Y/n? 6jю]F4:`B Hř`kCy֖jE*p! 
w>oL ^"QY,7'G]wpVQ)g)הrBǻ|.ѳЄvL?^I&Ph!5S:0~1>(X?-t&H /E=*TǞ] ){fw3;+J=n NuodPDao=Wpϕ%[c͹jLyQ)ʹ'`ۡ!t4ɒ5] Y 5}*kvpY:=k ?κJ&^ ttǶ^R[hٛ%ŖA;'A]C`i" mM%؋IVCb v $eM۲CUM8wNSk+7,4(:%aC%+vdhfv{z(kpVgW錣-!ʹxxCxBH8#93X[ozmHuK&hgEd߫,=(8|6@9 2O1si%U0$Xg#ʇidHV(e@ .> * >X"FB8׫Z+VQ"MF+l8&/M*B(ZTNe%7nw;mYP(3=7Q푆ύH&}wg?ܸջWU_ hL_!Bv1p+iY{at;dGؓg>6{(*<Ÿ]SFd)#@' 'MyC,CюGMe[dA"&:ui߶z ;jJc#NaxT9]"@ SE VTp̻(f+3Zݤ5 `;ŞRl}ܕ- {4*Y]=0-Td$ ARkw ʀ Ԁ6BjǂI;D*nwxLм8+r:vVc[R6H1SnƐPhoOƉ ^pEeFj ) ;҄|$9\Q?N  NޟϿiDհ@L͍d/X Ggst39=q4kp.G7dQMl_܏n)O T1Sz#/xR`NiԿt1NsNb`Hc:]̂ dBY LvDATNTjb0 B hdwQcc4,Q![%lF7 J`LO]t`&ڜ{hhm3HC[z~%M<ฟ6%g x vnւ0k{T &?\ D_g&w1凛\|mpsջnh7p\9֥'?ќb }7~q]\C}9~_JV]7-GFNl'}чǿ? YR?7^?2ws1\'s졊ti+lDiwVnįq-)I=nR[)9S:)m|R=ʹ[Bs[ y"Z$S\nŭ N*,FrP;o Lp2v4K_7c#됦5M9*2I>aQD(oIprb:]= 2}qY\?8.J>cb c~Tkb\-*'" H 88::LpN&`7J 4{kHYWIyHdRޖ,fW$7(s NS=SccgU/RpIy;D&,k$TLhqpLRxUBkbMaFM;6%3qLa?ӵ D*בBhҏӁ+Aa/L[Z6:ŶG!E] #YSwm=wJ֑ڴY& k٣ y#!١z`Z%3)z#8bԧ2vt4)t"MR*u@g`EgNA @XJ^*ipS{ף wwodzgݤ;/?tn߀wϮUM8'͟鞭a=q(wKM&d (fQ? ۻmRvNgˋppHfC@g 鿳˯i`2GH+NLI/o .o1oKՂ1~$0c;/,#~;jEcU2ʂ+3QmQLo,'[tjQfM_DòY\GJ~ *nh9TrOn/([Z\R8I͵iqQp(N6k&ͣyR ]}s4m(84 /tE<#'{rY& Mf洌pHE0|)Uшr 9N"AZ4:: !җP +Ns@'&Y_u*luV XxjN'o-M9( A}Vyk(5J"gpJ #3erĸK1I*.㔼<ݕDE/47fq-)ƻ#h7N')+jR rDtSۊSN/4WՐ7.E2fz@,hc`Fdh#@l6DũC%BVpj JZ@2^wy%hgu,X\=MntZ i %bEL]|DŽ] /BZ䄪fj 134&e=]ʞ́|?͎cK7AYtS0BkCR1 HF5H/)[UJ t4oi=˨f )bAyˬ僖<tq*M |,) rP̚tRn%IGD$#/FPt|cupLk@z TaapM,0'EYe4֌?I`jшɤ\L^Z5xMp}3dVLTK:"\v"A!P;j$XKXȒs, +zB.ZSoz3s,Ƅ&3>cB ̵Q(?eN0::+mvq<gN -_՜iœ 1[㌢NXm('wgsB#P=pbpP *Rfk>c?+Z3X 蝧՟>9:\^@,3>J,OQCkER`Z'Ej샃|?93GPQ_[XSw9Hk97[Jp!YN{;.qDĢz L?c&Iӟ&IʘQ?lNzi?+^0ȕS3]>&W꤉ VN)%JR%ZR^Gok56b2@D|?Evh3=#>b$PJ-cɃzg>*_ct9$M_AZ%\oP0Gאr{$ъ)[[$dx%8#Ұ3vdἯmG(&YWw/v>1ƔKhm\,tC3dgu;SGMGW]ْPPBVl7?AlQLSe7[<%e뇺i"V=8 ID{o)c>M $k/.?xjo2@O{4li`Xsɪ"Yś^_?7^ t^GHMf C?$D3[g1 ]wp׭?2ln]j^oӴK .gQvsQ|+ !0ȠQcvB$D'6l ppОGW9y2]:#sxZ-9ƵV#nIp?b@ExEFSLj{wT{  O).>!7DĽ܏L+& uyfcS(hZ@']-mи^a"zE: -_GJVǰC `hhIɑr>z?~> ?ml&|?e$>C&쳴~䈰6J#Hhn`n6A !#es{nLAqLao`h`#PQvQLD0"n`80ޠ "0`^x)A@)?Ra}Y!Jgt7 ݕ8 4m;#S^,"UW#D1Q;ϥe 
GÈVyC`b-C<7KQ%IuJc{RQ58jj̤jXB$ǖ68vJFX rRfFEʖeS]T u{amO<WBǔ:G0S5y u!5>(S .՚2<iZ:FddZGJR\IА\EKtMOk)9S:F6:~r1֭ yU\r| J]VFEn}ՕYFf_7fKpW)  ?wO.XlFTE L%hP&Ž'bQZb#ZHYbgVs;_߼Y2(wϺv*n*m}ňWYqaaS_7M 5i]}Njw S')ierU2W1L1(V&b} {e _o"։HƧ$zys>5oJ(+XDr Zۋg?ܞ }sz8ydc0Vx\1%iU0-G %k k\)FB 7*#c#=|Fs8y *+<3#7"49=t.l?yMeWXhi*K) ԂɊP:a~dYQO2CsMV{ A0O[* )F)szyI5ynjV}E%Fv'淫uļFb+p<+\]Q&708%`Z5n7Ӥ%sK[k+j8TL(O=|N]0ϐyqHxecc"j);''`^y' 7'VLcpSీ6kO3C's:f>ZKR86WRJ {C8lrJGY_ت7zZT텯k0`80AVqzd'3豄zhSg`%ECƻySDKEmb9*W>uxN/yS7Tv$LP-0?APaC_E?e/WR^'BSP2"%xI"-ugSr֫^j͗M}Wb+s%!IWV㣕u)pUÉ*'AjtvqmBbkw0~oZ.dtD$x-]9`m B sMIGW?~pa#d`JSc'}ͽƣx{T -ȔòNJ,s'$.`+X2c؛j-Y·k[%[2n1?Vϓ]G/◎U\)ToO^&xőSx} [ ffUxayio@CAdog) ' 4V k_ܺIք6[ʇ4ڪ?q7p2YGg  "QÏSr 8WN *WL(O~P t+ p[+:wT(^BX}.^ûkQvأ`wvĹM(BC4ކYo4F>bF2.ba7ÇHF>Z:*A*gT_V~DUk)F&f0^ gnZ.daK9ǧ:<>Q!BPn$$?}8MW|fV04dV j:fp^f 'Xl'،?JXW^xNoFS3\M!?`IPSP(TȆA#?ʃGv(㿖@Z 8Wuvg !mOktL;;IRP'3%|: /d?C>I:{QuR=mV5 aiX:hy$KüVR,LíH Ԭ$Vr_+ƹI̐V/S]2(#0P!TMPbqJbWLG-=%2Fȝ^[稍p+bv#g8Ѥ`.0dwbw 3Zeh]@O"\1m cbQyfPj"9IQYw3s%SӳX޿ U[Ĩ`֪ Sd@AN( N DX^JiՉay6 VYaqq&ZF A,BƁMFSLbT"`0ZjE T2.\^c))Ư`DHuiM5AD#ʺ?{^V.7W/~8؞,,Guyk0@}cZBbRM{G| E{K1ȩZ:FIӎ3[ yUHjY7MjFnNMۈȴ^gV~iݺА\EKtr4ng 2֝ީ7Bбmk)8 !Pm1B69 ƢB4^2hB2| g>XQVUDZ!L*9**:E7VGNٱuw???&Ȁ!r9Z qd8hLJ" G%4 +*V{ӳ0ٛ' 3?rDu֫5W/i2p%nF˹XFkipIcv=sSL: ߃' bbf:x@36 Nm+Ip/c|d)C X, $L6q~3}$qA5?syJW b 6Gy@Xa2\ QV(ϻTY Dlb1\2Cp9Cg# M*ԶsGdיs)Jzg*pr/cArFLLQ`iW 61ܪg'ܴ\CB>hF=_m"0):ilIfC6 Ǘ{T+N]AN|lMB \M2Åp BCs-ҩT eZ[)9S:F6L4`nŌZ.4=W:=Ƒ}Lù}dZo 3 ɬedG2Xu7sr$u5!pQ4R+ 7+8ṱ"|z$# Gh5+ k黛?V6OzUMFz@P8٩ۄPv0{6FO#ӥ [EiFjG!e*墘+9C΂ x&zt/g/{vqfqV1 w%P^?ISEyn1c)}zhc_3 | ׎r[y[S7r3JrnAyq=#o/=d]qErdɆq,&E,gfNR$|[)= H_}>[R͈ޙY*62/ i@io ('TgϬV?o݅lNi% I0->HerR?cfiUiWq0尓 UWLk`5<|{A_S[sh-L^E%ۛ Qzd./@tEɚ}f3JBR굖gF[RB[=7.D| "yIp)d֖\]v~Yp7{󣄲 ;eh)}B$ӂӄwem4ʑ/SIn+˼LR.l&.vRS )p9EBFn"V0DI1,;fYYlIjј湀N.nlO>dݕl͟xl Hw|Wni!uSz,c NJ'x=iޢzo3]7eN>Wb!*SO매)=4|Cp!VlQ; ONjxbto?q,nn>QP[$ͩL7) 7zϢMW \l'OȰU-˕ԟE!~>)%Gی^#yk5(U9ΧVZD.e[GVqQW4Up4a 5jk\D: jLq2wZDI& ,W  `^@B+k D/5y~+/9a7"pw>.}er2W0 (oAf1Ŵ$83)}T Ok yi{8?7D֢;Vo?rT>*5υ\^&  7G- 
.Ipc1RQy2u`#Ǽ=`0^簽-EH㽞- o={s;IRzq(W)&ݽ!~ו/2Jɗkp0]b nW߽zqWvh[/Ve<]}q=\,P*Ji*(ἣ`nV / ږSj j,هu1~L{ːE8Y۸FSL<`R!&-aK?HϪeUǣc飔nI%@(GN4 {) qd)q]pKK%K/x h[LeO_I`=dFN!i88x4Bk xP*P@oi%:YaB` Y^2qZ}9.GG6tKF 5B0@̉da D9#kEh!K*:eAP9oZs ܧ,*'\ `pVRrKb&"S #d2񀉳4ђdʂ&ae\ uT" sBZ=ӔZI FkQE8#J˳@$u < KBPb#Z ;q6ZpUd\߉,>Rě[Isk7[>e' _B9`Ayf.FL EX@IQH࣒5EҢQ>`d =#k,` lkdҔ(T(mHPEU sh(JaGT@:묤LV&Vz/•:[j0Y#eQ u {9 &}+Ft(`S_ K ^a tpikQ0SF%s ڧ,*7Ht= RY V 'QVZ``̄ N' n˺;ҡeLGoh b 84U L9>ϚNK$U|W?-f `ݏH.npk .Wgë_}Pjes9SCPf֞5!aP$<gMY%^}{D Ly7fZh.DZ,jƓhj~nG?[=ܑ jG29b/WUxy7Fclf z;Xb9E<友nfo|\0,]2(}.W)lkF =(mmXVX؈_3hL.v*>YXd[ ϞZ\G(}+emqѧъ:>o &$ Rw,g-9oIE|4U];mnx4cJ#[9c֠G2`&cz#NQ,D"5w)JZeŶtXGNd6WZ#G*7%ީbQYHSӒBqn1u->+8՝ilBS[hL=.ʠ5c1Et4s$1# H b8"ͥ@9V'±lj|,q|3 &~4߭?mQ|&~󚜔{5 f;6AJ7۱W4M|R'a& Y'ƥ.Ie>,Vc--7iSzhxXħd&'醉O{ a$& +EaqQhCa a 3k>&4:3OTg'x 5M, 4)5%`h&!+BAa-DE:S_܂3],˪r>1/|<!9)g`rQ1F m,})f ׋L‹ث8^jbRn\sǤ Ph{V D4L&,X0d^Rhc_ss[.csHhynq{3Ff1>2Sɂ|/x5|`?A?Sn%MKƹB&/C80,qC,JHї+lUwaи"IqWT쮷ϛG9_#hFNK|YQ^xhx GS潾gaB 3Yޭ͐O&6~r*,Ql\G8f2Ow?x،G3^x@ *8" `G>nDnk8Wl^hE^9X!IՔk0ÈX0`;g&0rpE~l2z\?D ڰ d%޹!Z{Ck f170Ox =L}v=l'۷Mb߯֐$IYD[ W,tҭ ɢpa\4 QCƸZRiWv2v|Q&u4r퇮p>5B_Szg2 OI;JTFRy5Ҵ1| ])ڃ1p ~`/9bNUˆl3@+͘?aɞa7o@w_^OVoOz۱YTͰ&4Yn9OO)vPiLKX,]O iU)5gu԰s͠k ;ѴsC,DaC UyԴy4Y SgwخsNc)z ֯ai|ͦi{n`!^̿6;<<4xq^*&DgcR0 Ng@buӚVi=K{WcEWXyL{0Kw1_G5pAn䷼* Z`clV)7.ܬ8hbD!l6)I۠11dx t_(6^c?d$bVvYWAN yzdOEc &/4Ɣ@ykDќ9=qdB3OBsFbE}%jVV 4  !Eoޛy k$ ?jg&&n?I| K'J!ӌ"bìR`8DxVzTɵj~MwKc,贚fENף$Dmey58G{hFoKG0x?iXZb/bl-!cC5D^kX'%!\p]ATU۹V* $ػ8r#WeG&po؋}ѫP੡l㉍Dg4}._& V!1`c",Y0xRLџFPm%pf^'y~UU %ę+kF4g+\au5ə>؍s5TR`l~\c0խTΡϷb;L n/{s:iU~s5DA|+9'.&K\*Jr$'F]9s1Gsjw䂌={ߔ݇Dx_:)^xfw^j+-8|pZt_Wtֹ()f _79c^7Y)46m;j):0g4gނ<=P6W-nS- pևt`_uҁmI2z`;"_mzAÇdqS?oa^-ߗ?k3m)|MUߺfX-}*=lk 1}0}jX@9+_>kD^l"sGllω?JuSv1(f7; ib$|N]uy=Y_OOV| s8>ƧǹN'' SO~^1Yj(] "9:gpI5#0{lV&ؐ=5+耓f4okO@WُxrkփfcosCd$NeRR%ID"O Ij | 6[{Wr{FUث{׃FAi;>w)Sz0\]vR;$m#6_<8^akXHR.wfn|fn|݋Xܥ^rQr -P';N:I31EYjNtF7abkZzbKk{B!3&?zlce-/y4)}Ԍ͊q^JZ-ai/mcS\צ~C!q@Ian7%nFon8$c :_mꅴRu 
-TXX5VG_X#ܢ'>+'>ֈ-s}!~qaH;!4sm}E!aO=(G/<[ԑZrVy~:SVr͛z9סBu)iGbG0dsiV9xZZ,oW{f cr)Ol?v۵Hko_5gCp}׋2 ZM ܯ3+Tvk?,~yDֲgv&=Şڬ _gIl̉\Y6NGM/v '/p\paxHԼzq^NTXZ䬗\YKڕ>9hŹ [^f92G+4,B J(G-t'DKÿ]Y1pz~ˬ( LuYtgdVpxpN1+&n|ߢw[b}٦+iM6n+ pq6n;e{Qno>^~twUS<ص7ս/i׵ m͗ofpş>9;Z}^q8?+Zh10EZRW@q"2FPUgn_RP䍴SlW|^<4v]vF |j;0@ʡdAUaK ])sFP]OѪaTNdPFbpFuQz3*d&Գ'XlDévݵZR^:wHl@SIj6u2EFXaʡhc<Ѱu eR͞+j(d#v:TQ5"9]Z^T0U>$u6;abTT=v}R(+@kй*E QE38#ۋ].5=<пhhs٢4VE QAB!vĺ90>gAx B3FPMH4pwA$IEPfD$ }$ ,) PWUmQf8RTDZL蔴1"ғ- A/ 05#$Dŀ Qj7}IʱźgY(jMJpar 7Cu= abKFY5,&EYv5GbTR%Jl}RND!#aPעEX9^ͧC5,J&D2;*{A0Ơf$éU!:IbϴB苁gP_Rxgg .HI2X85PH75Eޔb46?RɥVDVRBy,n ՞eYtT8VxJb:4Q3F+FFv4P` x%QEyH ~]d0[5R< 9 6)&]=`f6"D^0=\K Z=Slb\TjRbIɔcM^\ 3GxOF`:Cc6T:+ R ZY;p8Q(!e !vSTOB)B +L$lPFFR5dG%H~*=j \-^"Gg<Ȥ) <YIi^ e޵d1@S 6&]v^(Rri9(V*ObSͨh]ko\7+ }`2c@`{d&8/ˇՉ-öf}[uN&f`%uvUTHXghdHc\I@Lxi ))PuBLH^`nx6$҂+A" sGP=rAd`Qe0ޤ:46idl3,+DZ1SI;(u켡|1B } S)uGCJtd7A`g$ p:' JJ@l tp.Ϙ]\' Q mb;" 8'gS)39:\Y0!/z,<8,-Τe_%Q{Wak+kg4qdDrx\dzJNiCeeZ=Sι䑧nВdAm|t1򤐖ETg!B֥3&`-&dԚ2z`1 OX AT $&kn9L"k4"9C4R6KCٓ`1M% EE8FDĀT]D8EeXZZ48ept*r/4R%Oc+duVkI9I >N+IVNZ{=ܱ֮3ĤSDN҅ %&сU d! AYb- 2e'd>S脥<1f$JAhgW7o\AᓳDK7 ,.EJ R#]b\k&=B403L.=VMտ$8)'Yg+lvpXRdNKDQ  3>xmC#ҮObaLH\98(8 k#ѝ.uge+fT9r-{ )P\y4SAÕȐ]i(sH$̎IϟB'ܜe\קꁽwWח~\'hQIUg{2 #VD:bI |__^-¾L7tٳ^L_oiG}0LoKH0r)Q\O$](z-Sf|e!w\2&$EY4X "UĩŤ5SztfuF8 #0UM!q–uO"9LƓ|X$yiR..h9#nwuj8Q5Ekʕ&LIDiHB#UzFV9``xN' INl E}se Q ϸ &YK0CBcqG0Q]kNapwu<c?F[:Ԯs6b e@xmpfud4apIT#b%gr|% Xú Wd_`}|lUN8dQb)Ge^:( >`a[L01x e-"C `kb-J+g33W1jcGewquXZ^fb;/Hp̙NHh0HO$}Z*Sh-GO}?ӧ ceWΌ'tuԠ֩19 *5J {DrdePx(ڧE7xaѻ0,#i)ŵJ/O8K'iY-` 9N 6Ya`js/ \o-?^KŅVlK6V\bDQrD&O1˙$ulpD:v--׵1mז4#I .1}D,+˲pO]n1JuOU8\Sw)+;&CaebZ\v`l-JhE[W}@D^S?=l뿏 7BsCȦeNPs;D+UR uv]q[Uq<h{G1Ѷ> !kұԕD:oJc*$X/N|7o뀨? ȶk56}*C^@?ݖ)^k卬GPceڵr3eu`^ʢՋ:AUڮm ҪQ73{x]K7h6L^`XSgVz,2 A A;1k|$~em6+r*Dx-"GVatDc9Xc@*K ͸7 i>5ljEh\Sf*NŒjDbDwhߗ5v׼Z. 
@R(NRnzڜՅH-tHfZO$ż.g\X%`Z21-G)dbZ Ԫh-[45kXWu\zzl:bY,up9sXjK*tNuTPiݮ-Hlޮ-S'`k݊>iSBLUhGqNǚGpiXaȺ u\;uF_(Bi%I7:y^1֛ ?ea64DL 撉SI396@s N*me^SyŞ m(OԬw(׆>eg33J!]Y +lƟK.1aD mN2΍˪/{ ,{_| Pj-<>"22 .r2x-KPNҨix䐐"d{8Obs~}T0X ;w$>&05${r9l^yD. V˟:(WwީtޘP/4[n7W9qi&y ?gX􇓧8M}X w lv]h$̤[#}yoԿ۩WQSg'w&%hzmos]H#'4BPyegɷ=@c@3,Hq*gXŴW|Y]uᆹ㯟j]04;y(U ac0#>?}zQ35 +UqV~u}T`wb'zWT(fz8_!rNFt/f&k#/ J( ?3<RZ YFU_WWUWW0D@'.kN``̘]Eh?HB1ϋ #3n0~3P.B5w$OI剱QKǾ2UE|K*b,da`Wd~QfA#6 6 DZg _GK=18jU"gi3}ccѻNj53>vT3'3ܐ&k%Kq5OI+JW3Wg]h0120w0}w ѳ!T.`5pϫXMDi0=wףS}s-ţPѫfjza_-~ IW&/NpR[|nX&BdZ7-;2Jw*A9L,%-4M0Ղt61A&X8+A5Evͼ9g J.H /B 5Ukn(`dP(NMAq=LYwUQ.Ȕj,+p8f3x5 aƌ `0{ 9#lj "bx`NZI(H ]D{QՌ ޑ.aT3#%`"*)'΀DXB ]0ʂe3F3A`+0V5hwJ`<kPD/Ma%-.4`Pq`q.H3֢޸jIL#X.VX0xKz+K"} _|P-iȸv5Z TRDU!p .2,iXZAeP$ջm G/р #qB÷! "CS`z6fT/՚̪d'jcfU;YhB\>H#Jh{eLI55>*DfE?&uZiFpt%s%̴k)BP^@b_jjj.)oG_L1S$jZ̔䛾5-T68I )Ǥ哈fS(&VLbj;.oG7]\_OGYuegGr;[͟|mQ7%b'YQ%]$`1gOW9" q? N7gG;__7.dj\%]ՙmY@3Ѱ28laQ>cN|l Ef1 !³#gA:Up-8,ddlfwV@5⠰ZasRƌJҧ4V{Rxx&t(}I%#J MC)QJ2 Ғꥆ#Ji.ժ fqU`yPxs"F4cB/\AVm7\+*j%yW%"@\:s*ZZci-|1@d;ܝjsJw(Zx 9x͑!#6hBi4P-ˠZ "hW Ε,Iaa29Fp2PAbL !,jVZB BR}+uAe: LGy ҧ4VJ,}F:AIl|B)JzҤZJXˈRV[%JzҤZ+{P(%: .%: %p(Msc*\ 7C6y xt2.gP2R$;bMYJ;5픒jEjkkV"U^.R 6>EIFkzQOPIp|D).}IjNobRR PuJK](%< Z*XPu߼iSHNq|اr-4%/ /,%Ngv-nk3OX[S/{!UY?zp;277.˺;c0N#rYB__ݾPOH8JiJ u@ۢH"_:?Q lTpf;(nB4ek+-ն_*0(V|6?mU!<8)GE{w0ǞXmPdU5 Iyw0d0Z#o\,^!IDkJEqԧ:OR,"-):VǍz $@nJ-%W6؈|McpT>\nasSY?~L2y-$fgti"|Z4  ~T9l+`B&T(&7OH@-;ntti%w3I1?6(c=N#N8^3=qa;KA~b'kH& $& XumQs< Nn)g[ <  -Pb-X(\A^$Rz4&n'?-$B"ኲZ7%)+ 7`՚EKv軇DVXSqq~ձb94*b%Nb|3>2,VV$Tpl`v@ct9i ԫ jAUiёj Zzj UHAB[2JG[ W'>la+ 5T  -Z@̾kc}TiCWþ+ԂӁ97䶑_,袻mew2VDރ[ Rc%xw!2]am  ;P.C5tYm%L(R&?eR#XEJ&f1ׯj|v9ZTbr"]-jE_Znv7(ؾxEg<8;:ifs~fpi w8r? 
7GM72x끲sQeHfyA֍yst.Pd.Kp[e)B^8D0%MMծP :cnsW@˭i*ڭpfa󏴛n}UdARϫG9J &'ǰ|(Fڧ4VCڽW+rQʻF{FSTOT+Rr]s7WXnwC t V\~k^k%*sRS"8_j|K&6p\wGƗ^e1^ԍ}\^t|{w];58jT&}X*V zwig7K4-_C5S D 6|񪸚WY0O |~@*@bT\l(H|m R*}4WaS@҈H^ڶgV aQ}g,J%Va/PtX=(/[.3IOvA%l /22p 2>A@N{_'Z{7[[(l୐\^?VgRQB.^o{/C(6ǟa2/v~tK(}ޜh4_yP k(c։J}>ŬRҟOqwᾆ} ݟۣ[i w'7Jn8N+cY Q4ivFDqُh;D%!.?~wƣdyw$g(Wy h+to;9Oq;$t M=vad.$=(agӟn"_( s:`Ch/!}4KLP"#撌TBN$+Pxey!3kRLs i*]סdiks 9%r2~YKtw-p\y2.&a 1쏋:O,-\n2v <ɵ7|Rjb"CA~Vl"m&eOx Qqcx!i<}}_Q~hbRzEo3Rp܌jA*x#Q ?iRrZ}vhƐ~ƍcF^nջ/~[vR+| ]h޺w׋ѾP]D9$vTso!*&5-R~'_A+ZRqp멧"X.fyGJ_]g Y?խJHuwǹC%tz2+M:wJb2vHl&NwczwDn@ *MP@*wJ}^4?ݪ<[2r.%UR+ tUB{ңRKւe.qy0Ք;ɖHl :BFd&rUjCҊom*5S:E)ZX {^JW0'cR fL:bvϫR3i6}^ &KaISNKyLj+t^D0%>/U"KgRrh7~k72˛ۅ= +H9?CuW%11 )3WA3#4A*ϯ uE@cK=2.4mlac*cƄJI!E\s(cϞh+vevJѴ|8k̿$/>߾g=E/.6tM]l]qqb1%ێCNK}<@;b38p8bRQZXwF:HL8*(o4 Hײ61nUvQcjlbBy3 rK( ʤĚMaih@]d;1-9GnWkkʆcʾ"F꬀L%#WȤsyia-H, IVȑf䉵`$Ԍ fr e(M& z(r:MsJ#% fdQ[`Y5#83AFZk7z&ơ jD&H2z# 4IKPEY >2ΒM@42WK񀨍Q0=~ ׏\ Dߧ3EQ075.?G- VO??Y~qnߝI!Xqy'~g>O;aFhջ?I<m?j $łϳbw"ϡn?+(yI~p WIσX>LG?Ѽ m*r\T@XT(EOR'+9{껳NVx }YEr9"`Ӑ֢}k~n#݄D%Lǁ /X*%fίۆ%_b`p2䇕b_s|4~L`9\_cy8+͛c\3=Ɋe&Ti\ LհaoK h65!0V׮ǭ-;-Lǯ6i!J!}̌xə-4$lznRQ7qH=zoԓƻx%\ G3[z1:a6ʈXҮw#5Xz\)mK6 Fz44Vk Þ^P :^l᮵khbj>#l,!TtRK:Zۯ)/N$[1_p_a$_h7dD pqv2x$ ?_Wη}2[}y~c|L&\:+F7ۇ黳S? 
No4l,}8{I.Gvb-~ O?GpgQv|M]#RճyKԛ__F٣^=Y䅰?9GhH,0`m@~OO89|]id.g%~Pg?3orFLN}Zdk3,J$ j|&7q̓8ԳRj7w{ A}3GhϚ҉?]}}ZP].oF\\^䨝꾳b$J͔iK2Z * 56&WWa}L](i^sdM5-ДS// FJvKdr~]_}"&-,Q$Oa9,${5BjӪ+hb"euW䟓iiKNͮ풞W%E%}W'-,ҶeYUH Ik1}:Kn**D@r CҍhDw4wC.R Is4MEޞo?Ue?ñt9QP&9Zt*h?3r*SÎ71r U#^ZsbA0 5y PF1ё`IJ0/^W>Ca| CG+(:{QIY^߃`*1쇯4D!A0ЎW xaIk>sP@iE6F%".XInҜU.y +GKd"mh&dByN+Q9rspͤ]ȋf@Rv^g#AysJwNI+M@ P-O6e_YNI"J+m~uX-hv[ɛ"|j^rMvv~YEL)Uh(WGb0Y>r>@fibA՜-^[r[EM@- r'5*J~p VV#T)H,RA9Ό.66TDFɪI߬Hz#&ҷ o޺w׋ _-DMVԨUFiz{3Nh,Bdj쯟~6GRt]ϡe䉇;z, /^fڿ0Wu@jKA6mlKBj'Cn mhoM׹~U+{/nTƽ{&%,XgOLLu\.ybfꖙ#)qlR42`kH,dTί89|2AqrqxF:^M,X:@|GĽW3%kk۸U Nh7Kſ%|yDXsbftlT|<>ͺۇ0}{?΅7+qwA˞,n#ǝ1~g8_ }q\յ(loؚ"ڊfR6vmS>ƀM祄T,ax6+R0%⛓K6R!xvϫR3$O^z^ &K/ۤ>Jm蓗JRomKCz6ϫR>Gqx, >/&yUj4Ʊvy= /#8_H =f& Uj*JӥոCdJY"wr,1Y5YH4jR^J%Va)/ ˚Ar}A@#K5Z:tY(9su8i^3[ ~qH8"8ܻKpNeo]Vq̣cc;!Rv!ݗk@ІwPiD~RZ{NG Sj K@DF\C 0)"T&H10"-ʘB9CEo2$B')P!dkUҙkJrrvR;5mLhф+H̓$Osd#*"pAL$F1Ҋ8Ats_ti.-PqJCQ"UNȴH@Z!qbRzYFڜufK_e$8ژͽHMeZfy5Kxe!,28oVV*6if:~FչOcK7Ft6T1cc]9$oxxKS<\$L Zt}(p63D̄ mUA{ڗ?2uѠ5)ZEޏqJw_cX=ɟokĄ& \Ucݡ 5Y s 9bB/d6:sRI*Q-[|侫h:f^f,9?WFq?ܒT)(IRV@\xZIfy4gz#_ie,IfPs&@궭Ւ0ݲTT,U_! tPPͥ1\ Oi .ϦQ,Kj-dJ13!^>PE%2S byQrUEefM%RUP * g/?`ݭUKh(1RM&Fh`|ۢqOdpɯPḟE-aɍTΔ)ep@gFi733J^}8׺y?T\@nzSqTf- 灩z͜mCc.7zV*]n]dɎJ}ø9t B'Cx$b$ yGSd8a{;1P\A:4y+Է'o䮐g.\̘b`e܋>pP3mQMX}F|9G RuO#̊xo(N!8oKo6Q-]OkHm _Α5{{n;w-nZ!o۽WͶAPlmàCo Ƞ!ctobυ~]hq8 ^YyV,vU.aץqG:I|{AoNFB8hnŕRw/ZrZ+&3 ZU)w.F" ğk_:n&˭vSL˅Mfw?Up?7 >ihmD |+S0y:B~@o{_ECTSeVN$TzXPo흝?8\(ruLۻ 'YGhMIࣟIֽa(Vлbb:ψnFIϻӸz.,7$"9sûqFR11gxW>?znqޭ Mtͦ /. #2j7Nx}@b Fr 0̄^N A`3Aˍ,OcE>ͼ }VNidfm6ٛ?77g>Hlu\:!@cj$Im\ϸֺbׁ\>00 td"aD!"!G8јorZ`f|iP#=fgxhL< ,miPSYvp޵]@#4 5%fCeO0 NG9~Q {6 !rƒ шFpl*ΝYQhMk8 Sî"1lt;LU}{_ÖNz-yRe/K6wD$eeRs2g_{0GK7Vb&>#*`鴹ut(4 M4ɦP~n(yzT BL'1mhT,ΗݺoDSlJq>sû Qbb:ψnC"ܒ)[MA #tq\,;Ȃ@&G3_>76jUQF> +'ԆyңR Hȳhl[ޱR w-5b#dg_KJQY)=zQX)0+ fB( ~ץz-QdfRR Kk Ju#R l|> #/ւR َ;X)ÇjF5X2N'j,ۇOWuyA&8{z]~֮󻬘7YRd/+{UlL=%v?}'뫚bAlE7:PK'dkIJivRiӠJ%ݾ)JM1dFb`H? 
pH J%:/&֣J:t^8!]WCp+!8o6$hY)GF<#?v cː?}>ȚRoogE/n.ݟ|?_[0ΚRfr-$m&Q`e%Z-;׹&IgsjΞ˩O]mXBTMsjiO-LǬ"4}\oU?A &^Hw>T"4'F7Z;He(->n'>p칒cOzMбN~)M'$rg(rE: 3m3T7fY4RJB)%(geX(g>bB ,ɲ/Ů W@;U Khlj/nqydFOaU, s#L&u*F+]d rx AJHz>%vSb3g[Y+~˫NVܹIz~]>iirbI?-?[S)f 42<IξXYVjkMZ:3# 9Ch,ӼTiDA 4溵S 4-z%NTeTrV*VT\)+ʜ@esRgnx`U90Iрhth˖.3_?/~Ŀ?ʂjy|XmuU?>4\z?<~~ -_/|326|s*wF~znڽs'Wi`&T½lrS{n'fKr8UO :,ˏky6G3"Tnv_ټ> 8ŻCK` ֣w}~ E A ܒB7!e]fFcؽsq`Lyw(}IQ$a>J-)#Ts\-'+GI(9n-<6lwS%``}:G8i|0 uoV[.~lg=6[ 8eBNZienhYݛ:nY-74zriy 7u'p sbdsj 탴ӟ-ـ-[,3hL+r$ :R\(  { vI!,Eɪo/Ԣ-bAv{j .suh^jG37|Cj5&yt K}FLἪ!GM򺰐oDSlJsL(ڷbb:ψnE_^${nޭ M4ɦڲgqnz7~лbb:ψnC"PћwK/! n]X7nm0qІwI汦kGGVI6D`MJJ ZCtqBA+AZ (1T,7.2եܲ@|y^’R|(["%قJ"X; ,(_7D[ѷd~pzyZ\M)r/ԛI6@Fs\]d'hD6Sd[r;-*A6Jq/F*i?-*i0 LJ!R) BF΃9Z*; F$~`?g6F9ys*vM?lQv)ripޜm h D6n(ʚ.>OY,fmLGE\̶Wm%/$i.K>iƱrp+ڽyyV!܂Ӝ@9m!3<6H;5IG6ڛԷvW6J*mPm͚nvz:ґfP:: ݩɘF?OJ=F3%:Vu9$3uHz64##q]f%sq䢮"-% qz){'clh @ɻ#Dp"pKΫ2ټb4-5˧@OTr҅_rU]5- d˪.S:W, r&#p"ZjJ3 R3yn-ŝ2ݓnfH8G .teE׾җQmjr&[bnq֢c_pg'7ۇs76߾}󆃚1o 1|dy]Ndڽ O߸Ux} T.dxV OKODӿy)r(s9C5.ܤs2xYsUQȓ|,F5Ԧ6Bzj+(aHl( 4$7z%T;aZ/VZvu%wbO.F+?'%@TpW#[^%J+ؚ.0P8R!v t'B 9Kg-+B,rW˽㦚``Kx rbLWV$2UYi䢨X4,TC:_7StD2oټAdo,"imjD\="f;#Y|2~on&0z<&C`{q뭊" -c1 qvG1`v RJ*1 nV^ #KH Kuw E\K뜩4y1f 6k.r9=#K.܄-.񴋆rISYu}aY*y),cHE)SN2( giVp͋s{cEARP-!Mf*6ˏen wöW \ެNs,'xV_gv4 0 0b@ٌ#{!(O12npc;I<9KHX Y9/,N&^Q+*I]L|EDQ$Zii̋@Q \TOs)i9Tɧ V xNsQp[0<Qc6PD)2цjh т 0DY!UަT4@mS{ l&mp.56k;꺅:AtMSfZBMsZ,LS͙I ;A6voT*- HN3A2@+Oqn\XI/ ZL nuhAyxE53 6sDBWzha_CKbb~./\r-D NIdr3qC'(0?,B=v ~ 0Zl'!SInj#8h"̪O %K=lN`[d3d6#vhWAX,Zb87^sl?wM!dɶ!0F^m" ÐP2 e~Ij 6<jC혮aأNVK>fkŧ14NӆV* ڡfYg0, W[ߑo!\w_oG,l̟|餧`B9P^⊻/Mǫ_/ohɷ%WֽEޢ !`$[mP~_ݻϣQIQ\2Ie5&׀UHc1x}FH}D* iY rFqJqLBc^ݳ[,>#$>Ԋnѭ rps6eCtCgD}Dcx9'dtk!92Vjk;WP7`"yإqafZ˃Mr=qJ|Ls h@]EgM@`J+NT W]~^xJCpu5x[$GC9 alUtUavv[w7X]8qz`ߝaoN=}wu}zwEEVϮ7yN}0ە6=h JpEo @cl`9cn#$wGn0K{˝ s'`+h}?݀xFX 0ԉ}FHtpDo-^2`+hs3Y<b1P'!mYϽK#[ yq Y0U[uޗÎO|JKǑ͑~̎ W74]j>[{d᱔. j,oyj!Vk҃f)nPΪA9WˎLXZY 9Y 8d>768{:l c߂Qtu5x2GKAj79~766Kqu.jGRjp,RNq,b%ju5!G6KX +u : ñZkz\/=l T7c㈥>[m6GKጠVuK%K[}aXz,RZ'`)8VVʎsf)382;[VXvݻ şٯ0_? 
󺱚yPuzTJ_,˹+Eyfb?YkY+|V{q]^/Ǵb㧟$/jxEPZpIjPSʉr)7b-ܛ )U)M V]MaqI; ~ˇ*x̓$xŞ\b{ 'TѸKy ѭ7ho R; r4w'hc&RW?.hb2/rnO_,:dv!O!4͉r@JJk5^w^h";jK꫖p _ۑ./F]W`kETh_0ŗq=ժ)➊/z2*;</݋hpGM޿Ⱥya;tpRZ7šý:ǫ^,9^ܞ7*睻6*!0/a0 }1pvC g4[-)Q eȣ[5xBz-5p!)܍BCVD'm5ڎ׭`b)*~EUSaQ?BVk!1MSoD7b1P'!mїiSѐѭ rpJ5=8G7 BQT`U6AEQDW9 s yq jZ[zWsm$t>[=m+VUb,X*m6LYz, RZFFRp,6+6KXp EGR8VVsEUC2bU%{{d)5#K=㈥D8uV%H8h-%ĉ^)If&LM3ueefZ ʤZ,j$@u%0^َYY.kknFE嗭]ɸ4nLN*$5ٜ}8IHL^KMvkiPM]h/[ |l4_v?H?lT %w:8#B<="8-B1lLĮX$/yP$c)F#{6J9Xj#>x TSMr7RL2Uf6B6ǭV6rW;$mȃf˞T< -(fpHkCѿމhuhM e<ю7D9C!z)c~(OA4F[J.4F[}6 v3!,tV9GQ"3,sZs&iV2kl'b >2i+,uF30YcڰY^ZG1B)u*J4YS|4`O6L!A(>}kJY+bϪ|ꪋ D nj]=,ܴ_TVm SS0 NXdQh42f8*ׂpRn74–2w@Cs+e rj4sqqk,甩B+ϗ,Ȣ;2Y+%OA͝l mw^ sL@Dɜ !-Y2^M pcAa&[~h6o_LswWTO|vٯ2GvBkY1> 5j×Ink4q; Cvn_k3 0[e3AYu(05`o _G;R@oCZPaTַaBΔÑǙaԊ<⁚ I=\s*\`yR?N DITQ)휋}PCC 1=@PNl?1{K(i %GMqkhaMgS4 AgUtШcԦ8)oI,QC-XOj&P qy&l:0!7uD5bԨmڏ)=Ab!aBdX=y5jY/@6g0(ӯweZH2v!\Etpku۠o; 2gDmw 5Ɛ_C;АwI:%ǻ CnN3bۈ[-̺WZ.4䝫hk0{ 5M { L-7O99Q )tX`RxN  +Q;y?) $mұoR/)izosWVQu WU+0F1͸j~3rPo6 ПmѰxm5!S@dH`lɂʂπ)y)9?"7sSDTE"G=7P*+5VJYlKSt2G\gPMd4P);v5Qׂ\~/ntȟXLq*qOLHE,6կ N;J5–v<OaHjc"I)jɪcdoxt; ywc.|>L$L@͠F?gwA_n܋?z&e?x@ݫVwʘlg6+ʟ${f4m$%쾛-aDד]q`&a9a/~`yAgQ[i)ԔM"3luuu+ #Ct j@+c(abۺ/|^^$gqI8Ov;1g=d;8Yqu/J<*Au# BɞH,]W9#f5Qk!.0bDI(-/%- W; ynK' WȨ;pDRl3\1ҙV׎Rr،M Ft()0 6c]hɈʡ䦔DQА f*@2ۺ3&p3H}w5|zF 퉅-{13ABnTb Dj73W xͼ\WXV+Wō\iprJN03 b{ ̣zpIF<SӷD[͍bܼjpm>%-TfP%ҙ"(Дw:+8\(!~'VwD+ju '׷K?̷ӷSg4ql!pz0x&m~mh4btjoL]^{eAG.˕!wWyoƣJ;EVN(qs+ %'$/Y) UOlqFP]7ch{8K;ꄃ{X\hk3/zShzAA~zQ2zJA@-0.=}9Ǟv>/J9pI)Ζڪzk@:ZS;%eXNU2J9up1UNov򜽑 _ݹw(DԱEK$Ncߪc%~/m'OI $+NdJ)Mُ';=׺}(CU'%j-qfU(GIWoj[>^#Z,2z~poq]*=)n&k:[}qtmylE|~h$Z!ٌaZ7'aQbS8)(LKL9wKVPe[ä*s9u)2E ̤6F9h4)!+3j,S!g"YWFI k jLN?1 #xR#N0tPƪf9MQ᪗ht d C$RȁRWsToyﴌǯ喧- v}K6cBJVW?}}rO8%tŃ>'O_&M{?C@&?34*$c@;SE|Y'.p plC&/Z!<7 #1Joo^~b#.|\ |Ó|5;7e3B! 
̸>cz,X2j:/~)&jN~qhkMN8.ƌUGVfS&Ӣ$# y{z yKc^=o^ Zà h Q EJkt 4)P @=poxI@C ¿_K09P;B@|`;FC:B"NQgC>IM._u٨tldlufc90OD^_IJ䫧ۗ}} 2lKbxmaa؈ [NJ QRY+  ˹49ͭS1p_l$pA}1[V1aD.-êݭ&?2 C\~u0=X_͘NbIՂNb:ZUn 0idls3F /A*SD J"9v6̨41J*(rK95}rY QB#@jƏU '@LUѫ7aM cJE@5A-+)ں\(Q0e52LJ[ ƍ nMGCc?SLhjFQz`J 73|ʼn>I1Xȝ~AQ$#$"䆲Hǩ9@}UrmMB#+^j+tq_{krzuALy OX$JJᲽy f\^=*F' QUTS9Ghoꮴ7H!y|: G@hJ7{]hB(^<> ƺΣ°:H UK.dѧZiI40SkerP¨Fmkɿuߺi(.v UGmӨQƫ8y}u>Wje;E_NnUŋlݼKgd | _. ?~g s{կ/f1ԇ^'%3f&zc-|Jl-j\F;Ґw):EA'cݴ,hR1QgX;"0uK/n]h;W$\ǹc ЃnN3bۘ_d;uK n]h;W:Q:i<80@̨ n1T2]S B$/A'`/ /QהRo"#UQ mhLr \+s -9rZХh+|?E4,jueqi|7mš()SqOYT+{v{0kJY8Q8[ӄx*bPʢJTi\wnGgFHcJ9*tT/6c@c35c'*e5c׸H=(ҏ=%B6s8~͟WMnq{5+ƕk|1_t_Ug]V{\?.. ~Y,/ϩ'ſBR/;c=ꢸl^Q9X<5qw Ջ5+2dV w;So.yBh+ 1ib -$ze+4n'{{W7c9B4[,&Vp6Dj`ym؍$Vv?u?W-tXoeh2uCZ" /gZzo_^t~h.S섾Th0уgu*{Dp LRK[0Ҋx$he*CU1_؋fSgF=*$-T<+~R X޲},\QGD8{iO`.G1@(lxoW:̍!ԊJv6DB WsuHㇰk(ɒOl%ݟN%Q'QRd#ՈFtr2U¨֬ĐPE`P_!J 4$/cri䰥>#&r@h6fYv!\Et˃X.}[*!6Lw4h>\L:WGuBC޹& X7J1uK DuRc;2mɍ[UN`+N_XފvARqu3#5?;;Glhi3 =gsi=)|G 'Ub2o%/)iV|]oDXլ g<Ͼ]]ks7+,}h _ճxa~Q<:j[Pnr*lILVY!cKJ S06dĔ/qc Vuq۶ļ7.2xX(fWl!C"cݐ ߮'_dە̧f[9o_F77zTKMF#JEM!(9UPYԨ*G (=D C@@/PIQ g,(y_*+fϗ6(=hr C)'ooB n.5i!߿!aHM^#'ʾ?ɨ}kNlS+v +N,N<,ת? ew=CʙADĿ /j3*`ڢ5wV,ڥ2! pZi팾yfjGmJt9PXD]f5 Omgbe;@}# ֲkCMHJJ1(w@yq*gcJçHEZp 7`PDP%6Ô0̸d$5h) z-ݓ##FAyOny!8er;]qnwwejx_ %`N6SC\+囥nR0`Ns]r[u?gHʹbnj(A o[+wMQm`}kWw7aR±7!0&W)ڦw3NmW66c4y;+L+`7 d,N%qG ݰ 6KY=N[*KXVw>.t[k]I (7> n@MDzҲanQ35z/Unv=&zZ|*+DL8X{L^v:3Ԓ&Mdnl63̣ \fz ?Ϯ>z8x?it5|3d|箌 +Y or޻WɢZiעƓNl㵶_w+$=Yc YbV}OoG}‰O'#@r>T HJBރr@!LWs]. 
=\ZIf:/>\}oCehGI]{e`$жNco/QMX WpmȢ ,>=5ߒ:"j{QjnzKR;>m 8Mȶ[M+vM $n!/1&{R/y7n:N3xv_ަ%[y”q.{7]C:n:N3x.${nH[ySbj}V<ތmsw3|._/QҨdɫ_ɫ(}5+J!kJD, +I(XȥԆjca,@#5'B{RH}4G4J9{ E{Ra(-֊aGjF"Ĕ/ >KcF)PX/P +~)5)sDaW|,Né(G2J :+H>KMRcFa(ܣT+ Rj=6Jh×JR9ۗ9Qv@(X5(J1zSn - PdfH$M"!7,eL ~(Vy9S`$OJjuL9, ]pMl9VZVzb]fIRʠlbiyLG{²IDewӕI646]d!,D2,YM(Rt:OV=v qf1bpWw,&nq8M4Iq O$@"تuUiar'[og5em5AwM_~1W~>Dkq؆BSd6yY>kH`zY$-;[f^WR/LK/QFP54CSiۗ@CTa]p^iZy;ù2?r 5Ͽ?c ?x48cS="/<"/<"/<"/ O\+LLQ:I!b&ah:QۻtVy7nMo;\1|cc fӇv+'gYjo<@CIboTLAPk 3fo2rRNlqw]4οP2ŷ gFo坟(ݥimo ;<t>||1~|S7TVF5B57PĐ:eՎL-Rܯcs]̲aVl][~<"/8o\2{ 0ьx'6pbC1V *aL2Jnh=,"jg~.65 %7q\\%YhVd %ʃd9 J1u;̌q 4qCE 1`..pmU\VF&*) CD Ƣ5\PfEɴb Z`E>f5D昼mǩmoT.Eg(O)eL)'U:2ud)&Cz]޶ ul|^L+Dm2Pa;n#.efcgbۭ,1: ~|:y7l̩id"O^nog51uS5(VXtnRgk(MgSSpARE$ (s,Y26j;o;&*pu@$ִK?Ȍ +$m)RܢeP45͊LS$DJcFTΖa^P6ۣMU Aw6;(pd:*ê4_S6Au36߉}r2Ut; d4M-2nK`k޵ y“v)g"ݥǗc=d5x&檲AT%D+^YeR/׫9{({J]XSh&S2ك=K]#Hvh? lba EPH-wd4:~   ͋*4vJO 7E!sEN4H$4ώ+kn#Gٝ"llttGx;uZǽ1}"%PeF;ڒ V%2?$/39B*͐KA9l}Qe{˛ͣ:JԾ>+5R<Ȧf;:^X:Cs%/KUN2+6T)PX4ʌeVJUFP9}uQ"i,Z0>Ec4Oj̨R T,/3U X B*_\'.nJNbuA7Bâ-}2n)-g7QYWtČ}>8<_ X1iZٻ,Ȉ&SNf.*AUffiDY=@crQ2F5HV|40p]BJb6#BM>gS,ީԾ$3%S$S80o5$(ISA H+tPe9= Y娝LCʨXpE{m1}4lgu0K n>K; WV}0+kTcLU͎PҬO~ժbW~ZLBW߾olݻF}@rA)lEV/ϊ]T/v U=%Dʃ7+DaB]mvdrzYv EEo;w0+k9F,P ێoS!sj,e|ml5M\jMH5D\F)9:CcAƮl-~JV#ؔEX}GH&cSug6C4 S!XM8In:N;`k/cSIH-:wk!8D0j MP"d@ľ#Ļ aͻŧnm :#SmҽߩJ>OZt3ѩba}h_`J8Az8J vt^<~ZϘ~ #MLG кL8}]*xݿ'Z lLĦy0cT+4bjHz1b*WtB s ,PjK- `5@^w%Tv#E":Wc(bB w M@ot5' |t3 1xX@'!m՗)yDrޭ |”T_qnf<ľ#Ļ nI=[!)D&*x7 w TPfynjtM6IeԫǑ-pu(g弶M4VՇHM)[MnS#=qKZ>OzD%s-hì=,oی_>.oPSbJjfM,"A{[拧z[_7N&gRb'П=N7={Tsǭ*gghYcEQ$\ebTIVeK._>y%zr2g}Ƶ?^,ѹo;JVn:<ãC8AJVItpWE\?Y'%@]V[@]S}~V7ȻcU@+Dʔ` I*-TI̿RU!2PuzY^w?\LM2P^DVMҨ6S&nިF& [Ll6sHiQ3Ҥ[̢`p絙"(8]xa$)-XVMAzoy!|IN0ebĿ}u-Ðo̳anu!sXѽ|Իc#ˈvb"ʏ}ϒ"DŽmΆύ2]!_r#2xuً-7mQt2J/;[8>$~xY"鎍r7!r%7:ꅊ{9sob~柿мJB`JHf:Ee;Pmuj1 8vnC\6I,o9Ntr~fx+7GL0U[tq՞CMtJmHrڑNK(L,(|r~~{ʪk-[k3-&uj *AKN!"3MX6`= p"R_lJ͈~=Ƙ[PJF}R_lJ~(+&u9tR"G}R_lJ)JflTHPkTf/7ePY%.\BVyUa` r!5sd-|OM5uNcǖ_\ea|i 6+5Q'3JŹYxxԶ(GR_lH J/R]F}R_lJ'5JC)#+!%@)#~( 
GR~(uBr(Zj47JQS^א(ER+Y/=fs.JK})b.=nC)+<(P Ԋ^QʘJ[×2Zj씗Qw$8h: =LǍR~(T_JQZi8ss|F)0?y8/ņԜP'K LjR~(eK QJk'SF)W~(>#Ci-t^z(՞f]]{__M'_z(՞(5A9ؠAdnz~ ~N_?taJ3XlD` ¨}iRoxE6!]V+&[U <7g$p}N*J7wǖ$!5DullpL:n4-eku[y6eFV񟝦w|7B`ZGgmuِPu:b$7CYL>͍K -)xYT'`̠!K RUcHoSfN}IG{4#b/+=( 2{c~ LgpCA-qJ[)`WN.hA랊؜~Kpll3gZs/;ZKƜ hF9܅?iM%GZΏ)@ф)K4"'W_+msYZUvۺt?V?3oWWJGeVmI=ŤS4:Iၶgn)sRJVf娱eqA̩JTgT<גr& n$ T ~1S :][/mr1K-ėcĶj=) )Uv&~돏_~ SB(e|. j^-nV^ܛ6DB qW'2$ۻ3f?=\_A;KA ?L(blrY<-2_j?v!@vMfUw﷕ yFJ:SoogVT⽕d*P]i՝pMͫ:8^wyz8O=T'T(^B@RW.ǵl;J90[2dwaU Ӛ63Pv ۣ-9قٌ f̣SNm*n_IY^mjbSX-M 1maDל~^ C%;u  `Au8x7]O.;o?@=@JB,+$SY&B)A&;t Kn{Y\}*67-f~zb޵>7n/\Լz۫JUĕ/.c$>!) |+ݕD=g, jr4i̩r)5Z2Q*k ^tJ0pM?Ucy %{>JF3|@t%]IV9R?+)^8bvG=ǷZU^i)Pe M II1kB!5 Uek]bJF ȑ? :Fh!B'$O6$-vFF#Vt^ʜRicd0·ȱ0ೀQ( C\0tΙ)c ƹËSh`K+#Lq1d tZn#G@[ioeYv*)wJTVS*8AhB/ZܯX8]Uq-*n\!Cw΂O CO|*/wa{ԠUܾc/$X?km F }Q(HC3Ʉufpdd?bnvܪ4-IO1l%8SH7UQN 9;qތb O urkU)fv \FX0BDžI1Cxe|cxk A$yŜ~Fi覔]ņF ;782Ѝ9 j|cT\9#%Qxg'%Dך˻g Y?#۾q9Tbvjj^h JJKȍ^ sq%^,[4g/@ FF\_B ɤK\a!P?ؼp Rxh,(粴փ,)Fz@J}a2v!~sgYC\hc9X+X1jUJX;+ )a)֕- D3$Z5,18 gf(b*QCB -hK )F83 gX!xu¬?O! h33[jHH.k̺t(Rmݽ v|.rB雋ގg~m$5v UtMTf tyy%oyfV*Zᎉi"2QTfxJ">ynhUឥq/r'&h::^&2ϴX7uakG'  {>rTkܴ٧6qX 76}@MUXay{4C\Yé/LP󂙢,s뉡V2[rS(>Ya7zWJ;Nڻn MY쐰$P7ZᮉgaӅ,042ShV$gL5Tm֢:dpY(8q+qS`2l~SKOi}(p;;A?b:P^>ϯ~ ӊw-ź~6žt\NFCpsmͺTU{ /S뗞AI_F[veܹEY ~f_͉ ?u!C2?\\;S7Rm*<=x*":I/lWWj|>~]*\8W.I2%OT#jT bD'uS;n "[E4I:tmh7ZvK FtRQGT-mkXڭ y"zL&ghM{R:/Д(EE>1kX?4N]~p:q5T*.g}6Jc^.ɁuJʥQ07nQBOcNy7'~`Ye,;DOdzx?'\jV(pZsq]L kp^ 7{6^/b nN>̜Lf|(\2h(E}|BNrY Ղv4RK -qm! 
reR]v7znkvrKY jDk}s/#dqR %63SN FBKu0YC KӡCJ*uL=:b>YTn y~"*t\NFCR58!⢪}8sQ!iډ٩~o Mpb:5d>o)dQ\}fy!mM,&|w((%#H3K104Rxz6fIu1ˋN`377CggΦtdb+o(]гW^"jĊsLQ 2}ոBpχCyRn'a"Nפekƀ+b ۻe92ނun~E'+(0޽uy(/yџК>@ *&gBJyq1:n.Fp!OKXA ']ഝ8XIJr)3Q&3BA~ьYǪcWY7(1#Zn>8g(cQyb{>̐*Oĭ⮥ȱ֫xO%Sv,Gri.9SnWSa;^rX;^ $}7x$ hJsكw jXeFnО@:ڏI eHk&:cLRfY¬9* -^>Ѫ(Uz 2H~.ç)5;q݊Qyl;V?ɫݐrSe)* "+bHJ\^[ }l~q!*J>8S5ac>ٙOhvEsXM=g82tKdlO21:d\l됍r]FY>MFJNyf(-rrRHȝsIY&>h 5>vE~ =Vg;j(X"7;'H )1Tiڶ?6c:NH<^|ox U")J:Bar .Y1!WK S=URI}QRI}Uv>JSƓ Hx=vn/%iy7z"EIz")o(1y>}y)*# Ais (ZU)]2]YNf͝MO1;L~}6֨7s1LgF3aPd~oU-.xp'2@cltStNr=#9?#6X{91E7DXQ#V}N ; AT쬆_yohmV}|X|* ۟NGRwu8<=:<\&Y6`nG3[ 4ԌҴdʚ>ćd_?NgaPhdƿeo/ںY,~>L#&|y ]= W+O>+f_-17{h!DET3)dIr46N•>~D(+ [˂kf_ I(Oo|F=-}ST(8i~#O_U:@'9'-t3z$oދiDrE)l4h„#HhQz?Bk)a kWT%ȑZ(d@t{/55/ =m0*z1g1 AJh%{.@J P4Re!yAu)! ng>'S&e%+@4*)@ LxYJSZm)r\(DUBlJ0Ix0gKKO3lإw0&;er)ؗWuiJI[G\.spDM@\'" ~|.Z @}z6BtKyBِν 9fQ7tVOly1mDn*8[FvFiDڴ~$@ jjBafrF ō4PA{},7]J$U.w(R7mFsDžu &!}@$O`H8!]68,vDgښ۸_a(6WR'슓˦T:JV; )qċPC#C0Bj+a"TNP!).stcQix#lD%t)Etl%BzftL2^Rm(RFמ0qP!.k6l;4ZTe<5y*+-o vN)lO㌼\єϟ=Ǿҳa&{B MOG3IKWro"/Rw;+ծÎs6.[w.|@^f`dCݸrP5rzWj` sT7lP r.K68Ƞ^)4h2Kq7fyOqr`yȂ!<!Pָ~KV58`7% -C -Sʁ8JhDp[h !`K "(Sl`-$ $[p6uCy[;BCٴ-b%!t<;!BB4 v 3-^@*mQPN! RQZqGv1c\maT[?O.B%]U]e#@ݖS)vr!t,xC GKZ/@-{%H?nMaAC.To ?:\ 7[=WMG([\wE: qxwg=~~F=YsT /aZ7'>Lf';8bVT}ȼ]S ޭHڅNY\{ÿnQQordõS9ΗW_̚d<߶WFM"^LLH>`xF|cDonƟ&qr[WWTu ~'9eb1&Q?s)࿀/xw",0g{bFB'7UI;q!R+գ~ۛr2. 
k?7Vj&z¡-@a+<2%Uzn(- ')xs_]'Ah}$@+kAVx-=ܟbW;)i~mCXP^jge%+R$;$Uӭ2.5W%-DȘ`T)/nxe:D-/QHVYgEaY+>PfBK|{~Se0ն-^ٚPݪWhIwIҢᱮ=Bk7\&y}~x7Fs<n2Of_}ߞ:QUFu:I<񓿞Uħ}1[N;+4Wn(' H"8ݨ*KcZ, ZM$DBR$Xrlil&'1Ԧ?Ʒo.C|MZAPbT6>WtEm5mA|klTyv{mq1k{7=ӡ0zf^h3o:+xk61wOjLx4"bW aPV-ޕBޜ_ݤw'TBh5oDȫۛۿ͇Lw.Px7Yu!8Is]0N6Fiy2boo}qM1MЃ$ϛt}LQ٩ވ9 ?p,AG7n J-5x!{@mɎ~+[ NF)5?nk>ŀ1p| xCwoP]kиoF0u:kg{^_B[ϯ}wOyjQ\zkE<1QIIdrƁFr'薂}v3@ >.d7ʹӛ9zkQWQ?.Z7(\͏˥ {'yܬo ksOx8~wsv FYށB10&kU?Q//&&T~\ h#&NRLrKfS%^|0Zd[@¿/IhfE4`~͎gQs,taxw9(MԠgOoq OZyA_,W=˕~r߳zy3:XLZRG„d &a-Ïh T@qKJ6{~S_FTGFm;z58} "F4Fq9-:2ƕ8+DmYfTK e96:qr xfUʷOZbУ\bҎ|=FvVQ͖ BI!sP=@ :CtdPlH}rwu hD$iݗPq@"^Arѥ)P%0$Ѡ)e`% (g0BqՈG-pm A-vX?ǝm,4#c CRn4ٗdl l}/\U2Dɺ-$5B"wvҝO-{!m(nRև ?-B?}ge f iKzFiȢs\HM2ɚh-iqj' .hJtĦ^ TsIOolH4 qQ$A/W'\22c("QM!DT E눩Dtk!(.#|.lFC*ioN]|~֡|]Q>NW75;g]tcFClӝ݁ Aק݃J# jWaF`"Ywu՜($v`.vs,ʬ0%DQ6oCgzGm d*Xqa-X=qwwQZݜהKe'ӯK_'% 14 yAJЭǹZ9*FZ#X]=K; R ATA t9` 6'1 h̶ $*v2$ 5>j>.V>&$h$HX)WZ%sYp HH pnkTWِ' ڐȁc"0&iZMb)qB%* I^txPFm{Ȫ+IB0r}i *cWDԅWXSF22!tʘz6Pl8)SNN V (c?unxVfx`i┼\좁֕xG_Vh1o}10Լ6:SdjLh4ڗt=7M &Wc +4_ ^**Vs~0;p) eVn$3vJp83Tu3β_#.Nbsz*i_NhJQ:qMh[B }l1u.}ƝH9%61/d%LWPjFL!/ҵXO9~JPkXqj*xjC} F zўO4qFEAFJvlB0;څȿVD9Q^ߥ`g*eR՗/- H)czgro\~\\FR*&{w[MgwY_͎>i#f"en (uSˁ|_ZTշFwTyK g^Ń!I3? X;ǽE{*Z"Zaf)y6&_=;Y95jzZk6A 炭z]ضSk p8K٩t9Mp};^ìԇY|}CZ~LPZGCp pJaQ KgX]Ys#7+~p$.EÆgcc bE[$նw}"%:Hٲ#ZXH$_&8U+Pc! 
ˁi 3@Vxr*өo{6RSqq[ A*kmG.Oa1 zj+}QE|: `6^Qu2z@SZŗLG9#mˢEJgm#d 7DH*"W^/bQt) L#P2L('a{<+=x CKyd@ x W&j[M2[ I^;ЋO>cdbs moVkV#I^KP O>]|F$^q61R@L{Ie!Uɔ4`R R0J@ :1fNHȃzk+u4yḙmݻV=pX 1@C_C X٢&V>adQ1/m>DKyK<b|LH[ރROTr*1t;dŬX Olj4ɅuH=z5ZOFNX$0#9AݳUCTc`QMIDAA8 F3$ :rZl° (T˹$/d,?kF"]㏡0 #;'Q 6cb(@陊6C2"5ey&Ʊh-9ax8]TaX+Qi c/"-زt9B= ukKҖΔtDF +x])yFOun*'y|@Xe ]$)C+c l O8h}(k>LZ9T?p 8&xA"{T{Jqg Fc"Bca83\UEn*W<,<$2M &q]O~._\ݭF:(w[UZeht.s=$,Clzr8wr|vn&bZ %VJxw=}7ZhE }pfMjNS_S VK~wEWgսدjwOԧ!V N7nmBI(xZo.hbwC 6ƔS< i[A #A4 $6.kQa;^I8,㇦a%Ɲc35%%8/C΅弤I Ú]͘m19=~b&p(U, C .Lz.,޾'T|ʳwR{S?1=:K%if;^TV}mTj|ʐ=% !ydOȿph4E!]+u]D)uBқُ~M6ϿL$/<$֙'80$tp@5P/rڃ1W?VG*yCѯѨbUŘ(xq71Ǣta-??͎8;n츉>;O0ɴ@pJ)/=a B=(!,cFQG CRĨ!|F_>ӊ,g7!Gm_t=X \s|r=z[TYV-o]Lm1G~镴[xb'S(mkR"HbLq !܃j~F54s42/c|z^c g1jM]UgsB<_>NH)IB5ue@Y&,XAX. n9TzW,,|Y>'w(NvC N%` ENw4mϖv2ҒƛF|ܪu!=GE\päP>WكV`+̓PP !3~!Zv!;_WM?*NRӞw~eo &M oJ6qm!@uPyZ`DmOmސ+E 3oUj,ρ6᭔18%b͙D^ 1DJ2QJEqi%)!De v5IX5!VKa.1"n(sOb?[y(RXR&kfDNsWjRJe<2HEO抙'&=KB\+^SxBY04В"a+T{pa8p-5UCi#L7ԆkM I^jgԌ9f8 7A"V }V;`4&R ŠNPmcּQl7TىIn%\"F/$r!&jbrUui]s8qjh' >\! W?{;)$8&”^ jg`tƈj={IyOqź{g yA=c bƊ{(-6^ eU$1xi$JBE(lD&#s3^m[oA _:@I;ik b#{ O3br!Oq,g3JhPNy&ʩz)oi#u;bWIߌudWJL"(jVӰi*hgb@ CDnz[TBdٓ²<&*1n>1UXpNNMod 0GqhS[֣wC)0!`+ɩs^"Ľ! TYϹw $@,&' 6)VJj%| 9WT"CLB)ҍo?|g3o\ДHZS"u?ޏQw.3Jjdc[uou ~6;N$&8? Vm+T/ھ"p;uWY],/f-:5K^*GnU媗}UӂU9l&>eMmG#O>D[).~۲n^֭-)C;rD[뎦Z.Fȟ|6TbJFZ1^/HGj~I>xq fB}|s/@NjC ~9ӟ'.=!K.t0h²p U4w?&]؜yEC2ۋ/c[Z2Ny zw b=l cx7| ҟY{0G]:TeQLvX+"wRY@16`ER -E8^-ڲThE#<1p_F,q{~b˸7tr,Ii$ˤHo>IJ8h0Q^m5\{[3 ]0a8Nss ?o✡6)E:6NhϿ盅ra. 
.'/;| o*w'GmLmjk{BWw;ÝRIͿgiuU[3LM6¼͑ܺUW\4vWg6u˴/Zg]W*a[*A/\_ZQ{fMW~ u}2'yPHah{EA0>᠘tyFCxkx  dzZړמ1B![+!IGuJ` W ':&4m`}9 }-RVgGm6Q?n?2 Xxl6>?tW -fi@]<=tSF8k{53svY!<<՝1q6g`p5U# ]}Iy~U̢N*t~LTɼ1~{+dAk73| Ģ1;nܨcbZVxuy x9Zb*Ԕ][o#+_vgeN?8͜ 2'K&gH$ }ɖlfw5R$ůbX)+3k *2Ć0]n'd5חڣiUaM3 }vܣe%򳳦cCin0q\Mh}"HD`05Yuj֖Mpqsd*ܮ~_>4XH z;4X߂uJ m7XH",4a!GnI6%Tײ&ET bL'UⓠSޝwKhwkBDl H,ĕՠrŁy|,0zqލ݆L7fR@dwve7'LFp/QE)yр**&7gcy=pJCW_NaJeqiI`r?5*)1EG4=(ߔ,7)u?񧜭bX8%8y${sY>Q1RnGT_oyKڰNv3>STnӏC ƆR],p>_Hn[-Q9ZCQW!MI*R(ϨzjEc@!"$_`j**0j,jQq΍I"KE[9N?NOHNyǿf?} Yíjm\ZS3Jm^r se \!cYK' -ڼ),N]^N㫰c nx!TRQ&w:(TSMK@ בZQ*X@:ҏgtj>h!(Bt8 j]06?#ylt Q:պ }?PByDŽ+KeWٳ11Z6S%!9B5W>?aCgw^9L30\#x(RJ EIڥR3J(|gz0Ec՜ņZAh% ^HX)|J؂#[^ }D`-nj4mxG!:ZߵC2h` jm 2*J(U k1%p邼 ٮt9UͥN/TK9E/Pqԩmњ':m@O8UA[>HN5lP?-c>۱>o,(Nέ(Z~z2U1>#wtGVxow =4N*ʜ6 u2NA`4uU^yF9fmEڀj:V;e=sP>qW:Xu5éA0e{Bt/T- FD Ygȋ(V?E^sJlgI%ͯ7TH5jVUA):Be_=c+ SOŵ5Vx%-kS7X)ptS[dyI6di_Xo_|=(q𻞚OO^K4(̨QAɭ˷in_}s>&nB>&na0qa ۑLPɉVfaM1u 9T͍@47 ~P}\"&xqr!D ?OvNx'J&jrwrnxgWاV~&~~; ې$(hx+RXUpfx֣WJ%0QbK ;+B3:ECrY!ʝZ޼5`(E iFWb*[]Nb>\&ITZآQla$IJ7Wx+xyy7WBAY  #g5_|(>Щ! E߯gva+չ|:+a%PzND;Sѡfz{$vcw")hxCUa{*U_;epr8ԲI݁(HɞFieP]*t?a(C>VC)Jv9CMm@ h: vhZ}hG+Hڎ,] )m}ꔛjJO ᎘NWTGu KS^գK1ymE}"<|R]^IP;b#y ({'PeLA=z:ح{AX:lK4)hxJ"ZѤ '-TERvH^* 0Fr^rA`aL#wehF9VS"uXgVK]7mql]!A[8lׁB=j4>56p?Knf2).{ͭաa: c ШV(jͪOh]ߝ0xIj nӆ悁 C탏sJ⽺ҫDN[Ϯ&jn)r-‰7Hz+Fr(gѡuLBH& j!Vꋒ:mJ:1Jm+ıdI BsʭaG sa3q!GfXF\w7Jf1pJLKh @@Fn:VGaӑ|8Ɠ}oXq[^ Ra % (*:#~ |#vqt3)4y!!r$ƙА0`.M&VH h{|h|hF;34̚wMXViNч'NiB$ Nss*hc"u d6I't ݈glgd![o ϊִOOVN?}:? U4iBh?V:eWkx(c* ' >yvm7v\Mٵr4780^fIDejde a%\E䤾( '嵭YQ]ukN%s*hi>k̢`3);FOi)ƹ!"ja!f6GҠLxh֒or{R # z1.d4JP q&b J -N##Dn :RV3IpF+eT&rHFry6 R#f4+L йӱWNF\ (]%;\G3_SĥKGw>g|!bWSzq׌j ^ɴ_6nh!sBh0pVIGd:2?8^\Bzk "7#a~ő K hG !̊(JAf|_ ,#AЌVt6J0q5ml@$iH$!a6L`P;s1m1'w[Qo"w}6^u-E(mk oQs\ئT(޳Ϳx.\t?#* W\=#zlfn (A$`i  {u$ZFAܙ\p1l $LL1,RRe3Hطzy̘Pr0FVB@961ኬ@ _49_? 
tRb>9R_j&U#v3/ t<:ϊA4 WuE΅cJߟB~F#x{ zμCD!9x2OENxߨs#+׃c(VL67^G 5WJ"@sxqP yp `ͪ39k?@EqB9_I3~v"\5ZBC CXp ;"$yћ $g䃼pG^~h [:O }X1y ӧ<|MF~qU8<χNnv4}|^N}i:}D#/TŬ&헳A>ҷy,>Q$}piC>aJ{bx(>b%5~h-dTSj C<2RC5`)w`WNr~\kb.TS_*|XE\KgV(c-B.85 ǘ/EtnPAȡly#3v5u7o|†,І JSV:j1"TG ԉ9 AZF S*DW2훝kTWU)IΊl+3"jkF"wgI UZsy}+\=`,{u_Vo<[\<1+iKS&g;z1:ۯQ0!g/l3-t_E~3kWNn+xgAj/g D&99G+}禒jZOӵ 0$X|6}!"7Qm$t3d#/5cה:]}%.vX =\"/0gɗkKD~^/Ky %H8(Wnp7g!XtOx|u6~4!hom^#2^C_1Kv9Aw$ ?I#MA(;ԌHv? Qa.6Yӟ+yqW"Hf^:|Ӓ*gh YȑhMQ{Rz7(kߣwK tRQŻ/Œcyޭ 9rM)sŻ2s[*1*m]m6+*ٻPSrٹl9"Q"{FxWPF"j̔+Fn٭9Xֆ|&zMQFzERФgKA\*p]yt`Td=鰥J\u! Pz'R+h HH"./%B ?MAJ%r,YJksN ֬dkuQp}޿BGtwRs5h+gvLJ*w|KOIrM ȿPh38tG6q!Imt! ]lV`h@"j0𳢸ԓ&{&/rn"lX,;@1 #pp#ڹ;yz^JU{]*%B]QõF1bT4l+2`/Jz" ܔnD閣π`lU۪\d*تQr]](HnC8G_.iƭ}֋݃{ZI 8}+QN3?eqNOn9",.I09}{+(f|9&jbkfyXʫ0c&:Lvk8Yboac?䨚[D]"!IjVҔ4`zU IE'VԫQLK 6(THb!BZeieĦNK4QjBX 63A>gp8 Ygo$3rc9USDS2 JSW(!Gg_H`KJV3եtyntV,桥tRq18x犏E^f7[K~YYYYlW;w VZ:J@(B3l92 H\PZʋOa-e\]M62J+E0?l VF4~ɆKd-;p+| sJn@S$)~K R )?jYT@yn MA a?vgHBAm[}%mx*}ނ8UbE[1 άI#)Y քaj1Vi' 9;SA 3ZW +!Cκ{?*[޽7` @)K33 /*I")AR4o$ :D}9 uq 盻:)%(d뺳6~Uuݫ޽*_w-]%< [Wh`wA~CgG_R_߽:UM=:I碜hn)'B8ʉvyj"-4fh 59آ`K}#5}6iVĔsZsR_W~Jࣕ^epRJ"5ExVzVJNDPl=7_h| s`U0PoΘaմ#IDq;@^&:W{=QDuBThhX! 
F>),4L3B(Y~~qE)߻W \1bevp->:f:;LWPE3c)8k٘6 SQKK4h.r,v32jy[DINA(p>builD>FpD$(0&jMAI:yi5#/mt˶}@r5 *Z01'XC.r`ZaCvCrȽƖj|erw@GƤ'~$~7F`q1lvYe;xDTyJ3{YR>nh Y6[yO ;q~7(l[Yb+yQ%L}VLrLbRO0"!IL &}.zX:[*فg]Ky Uy,fUzD%tYC *2gD$En˛P4ZG,|TΛNM/͏ij^]>I`/?[} E'+?OsA_E?n\L~硞$dՀH) V`?U,k9ރ7 o,vjH2"U 8v99sS`A~B<4yW5}Ɇ~nͳQ!(VS8k VZ p$ %ZrY=x_gI ϟ6|J2?2-X/i r@Hwфp$oA[U%زPxsN+}N\ns)S!ETus4^ֻP5fe)Eіx [5fʅjLQXtZֹG@h[g ٝtq) Pk~W,緇5o`ŧj-z2j$,A †?fn^,It čHmQ\/كUVꟿHzEIOֻ,淫FX%h{M|bU4F9n|54hVv&0J~a ` բAb̹DkdCw:x.E/INɤOۋ<zdD*Y j&NSG\FkX軏F VJM0f D@\Lb(h@\v1 }N}{W]qio32 w5Jq1xPb6l<4<,Hq-tHۓ4s Û$'-=+u; r14XMNIv?kd}NxK=Kٹ|q8EQi?u)K1n qH]aM_I/ߝ3rrT\g.ogWR:[B&s ꀒ-3ã(?n1x~<ɖ.cԑg<xM'ԀkRqfѴt6`zj~>,|h+31bbFČML]>`* lP1LA^9=G]HF5:GہZGWVmȸ0v@}vuxi~9ryU Io8etB4|>ww 􉓓[ y|u1%J8|3,'D89>#uyIe\Dǫ!C$FϾo7Vs&jov VHM9D̬WS0J$EW] ~+TvGZ&km)BTΪP)r,:;rr&NqqOI|)Jy*뭭h%==s(P6Jr7>b.+QLG;m|1d&<x6zi"N Sﵶ!~oL{ 6 &rʦ%G+b 墚 -vD[ПtH?^E3V{])%IUV\6i -hк444Š:Y?=N-~2q]BWTMpbcl}}yz72@`۲ވ ˮ#\oiόevxيegĬa| FX"!ޜ;/?4wg ؚ ·g+@ρF{b5ǹFif|͙$%d}JEE^MŢ"9fm0B{@d߼WǷo|߆7T(.'}Axϔ{QI jqgOu_q}W1ol꜑o^;~Xr^(w7ug%כ\ awD N7Jn뻸'ˢ谼$c7vs![+mҩl_ˍܰ ltWmmbߔ M_tǮoV[w|seL1ics+Ia+CNe𜃵'z!{qRrTZ/ y&>l.E^zt.޼D}ɍߜ7G|BANpdIB`E%GJ@Ŕ9wj?ΰ:Wi<.YJ~<&'^=Ln 'oq8+2y4c ~^PXLSEG\gXi)?ĔosAyl2+ #x23NϽR̈́,OGwh֫b1JiGf)$!y3]dɠKZY?9{ҟ~xD w~YkOelJqq4R}Q FQ!hrNqG2ܓi$W*|煍6*^بxav0>Y84d$C" o x9L3Pwy.c{yx@stz=(24&;DHa:s0=MAg,|e54RmYEW,#OH\UN " qL x&#ACNDV)YhܓK bTMk+Pt-5*&Y66{)TLEgEx)UnZl]Mdd&y{r-9{nzC:q|P zM.#@L]=*aݣvawEQ2W*Cvry0u&<9) 1Ja?MpہZඃZq4\*`TKĐmb:Ő76zV(QK9*'zb"\1< rlbe땮/Dr۸06$(mȎ 5$yk6Ւ VxjX!;Lj5ɝ4 %iYd4ថP\lRPDdY) yA ]dy8[#x0G.L;ym Utd(`2ΊIζu+ĐI`)~}qt2: ۀ)H,r{ ^$t ['C^ww}Xa2{q]좬_|< r*&?37o܌'[y}~Nɐj;n2lr;W/>ֈФPRo.N #5Mm8硨~+ edq"#͒ 0A&GUD9|JoB0I*I!X-!flb!}`I) y-fw8g$;Aj1nYRb(!/חPI* i ;O{2MB*9ϤHș뒖B&g/.oyQ" VȥXZ^qP)(#njݫ!JR> ;&3!o8[Z\QPBB"+ӉFZenl~y5АN8*kyoN\ۂxVNn9zm!ZbІXP:}"ؐX,5#hNZ-Ȼ> V|"!kO[e5֊uYu+V'kƮMWMZA)& ^w^ڣ/'nKE?+Z *N?=ٯ Z=KKMQ[>O%鴟oK 2'~vv .(./ l "ˮR`Ap|}]ɬUʮU PZ$zBPH܆Bgג=_WzS'>cSh^P6fk2J$+ r6yЂzw|99enGoČzΙrŇ~:m^k*ƨe7zˊYMxe<~Fj:=!ފEX:"VĶ D?)TO%=':}/ vǃxq^ jӝ!IӃF W<; }i  
B58oolhdrP{P[hZg& hLΤE&-3IH9b$4ސV>ƒ*p$A?k%cUE/ _lĒ&V0Ldd'Xlsxl`]zs9H,Fф5 nF_A{ć bIUB+sK͹"I.fb=aa^+вӖY+ۓL^"Hl:W~g)`;΅ir1.=ʂ,AfJ šAʃ50M7QFeD2OmlBcBk+?JfMrLHg1җRjQ"@66-jƦ+EK4Sga+d)??W Bz-2Yu&gѵw DrpTƬЙ cByμap{6}{ k6a|V<,%ߝ1ViM=fc ,i.,y=)"rG7h0ٻ6n$WTrW;H?m|[[٤MXx[$*$MjCJIC 6*[GӍF/0SBMx(2 %NvYzPK1?k/ S, K4© 1Wīҥ6h4چOUGkOq]Qk>Ck/,t3w-T"zj`HnT *W!c aq . G#@g@>Tha7X_!i2젗Qo`:gxAT?}xyf*ZϷ'}1kOƧv̧`z|/Υ'`pc[LFxlp5?$>9sm'8kب=\=\e0_Oe ueO`CO8|pX8JpF]Q+Ik47P=3 %h{b@TRBp}msIۆfE|w&{AWK[؀>6^ng3]ߗeQG/ 9i&A_7^ZCEð1}8UF &sv[]3\ QQvdbM27ͷ?LP4٧`?^[Iܷ``=+簘?`O6H*<""~oê[^g Kl4 ߯TZԯ#AA /zj }ųu ;驧 EwžA@ShoAl^2PVX?PEwuG+]4l= ̗y蘸xs$cd c91:@<7; 6`' !$.*}4@',d,;(Utb@L P+@[xy7Z ~P8<@j0AűXnN(X &/z)F$ۭPH5 |Bd t1Tܖ)\ -$ gaw2.kB 6Aҍ<,3xqSAč"<ǫKq7峝3wa ?XyV{z!j۲ _bhjFփ=#f竫5WN1j~-Eթ`6Wg@ǝ+m ő,.UR?l_{+Լ 6^ UN-B8f}PgU營[y+29=ȠJjbW~0*?ˑ"BT((PӅ-\i 쫛aNB"HD#Nt+Բ`!I[j@kE ^jjo4aC=rTt/P{xɹ9Ĵ.P@U&N%Z#k +qн2OP*c6́N:_XNsSn*X82DŽ*+QI-CΈř"rgsM@ZrТ w%v,h.$Y{ $v\4L\eh'4,%0V ׼\4b&TQH/2&3 _׆>tIv^&!0ΌYFt0uyɢˆ[( B%ur΍2~Ӂ`vA;=HA6k>ѻO-NlI3y2#YHcROpAR e& er]LEWW.i; =-T"Tojg+mfCgJM/{˧Q˰ iȁ{dcS!Йq#T" LۏF[[-Iגtd!Xxi4ښI Q`hCWOwQVe8zӃ/dtXMB>'0=umbp+8:gcXBSo/Ůn+,^\C;{vYD.0F>?[U³g59F'uĄ`ަn I-B wXP0"ޫU3N56w e8ΪܽjGTrGd8`fЉq.S̠*Jd?IǙ= _!jPZ0eA%Ņa,;(URj.>SF8ۅF MMMMVf3( !23F"SbG8&PRs!8Qq{@DO!5޼.N9){B 1d ZH45sF%V"iƸn3-W2p#(>$w^Ha[s8Cңm%XqW[OA*00ػiKXmn4QzhZϜ<‘n橷@2rg3smXԊ "LZZsIDY% 4VBPQsUЏLaq, she>4m]N"*HEoe;Ѧڟrfla9'G36>li@s&'^'뤞SzVdtӆaoAk\\e9BOl\,|ϻgN%UYQRZ#Na`2 ֝Z鵃S9?ٌ9zrB]'lXdֵOgtҟ ,oTDρ lvɧ#WC](7d*nE[Q;J1VD3 hC#Akm]=,"_H]n8ë.eqy/@n:ۘLU J"+t®߇EZT9JyyN4cF=J{c?>t5Qw.}! LƲ-?(C_yyV2@gy(ӹo {!Ǽ{kY?:P0oE\0#+?1 H9Yb. 8YJՑXl764KOlr"9-?ܓewh/?C4.w:D1'OTX OcTũf̀L,]~씍OC4#v)L6 = ǀ_y{xe8s>ZO?\]HD7NhEuK/<]fEkӵ<:Eٵnּ(yo.kQqBS}jۄY2GW<}|OۖJ)eɻ:Z泪Ck_gu޾9ĂġgH6kT.* dJa+B)ܻ pku[u_Uu[u_.+Ȫ#`0uz1/, x"E P+޶? 
C5a/WcƯƯ}mV}mz5RRJ[B_)L Q0mT+eI.jCqۏ7ꇎJ]BTW={ݕX\`hǼ|nd1Z*`*ȬҔ(s'^9< b],rm;6):RI6DaJ(47UBi1ؔxA% s)'e@@i'AwWy+Pd$ZتpN ׶ojM[Ƕ)*" j{w{E kN`|C]_>9 #a ڹSc/qCŋxû_?܅x C D8zOА[w6d:q!dn&s|v$̤Fә 6_wffXz2qC'b"#1WUY`߬-O>FGXIJ8Y~FLS9%jᝬ՟LJfKj:| mQ>eCG)K`3V`;vfh_pw hХah61>Qe]npŅ>ca3ثYҵ)MӦpfbе-AUǥ}\r~~i~̾z@*!aWoM+]~ 9kK=|[T nwuOtwe&%D !c 9|K_Rو9v˘=0|+eQbKg)ԔZ{3N@d-9Vxiqjlb$A$#ՏedZК4MMK_#D\"(4\̱2P@P8CVX+j1/U(h&ĩD4mB=us(&RS&͈#Nu$4i"lIHٴ @ ae7N -va:ŜFNͻ̆ -l2&U26'Zv%>@'ct.;U9RMCd9DR&|m:gӣKD=$FW?ɺR('X0 6=D W@ u=Xa(J#Njϫr=X9PW+ZWjhJܢE4%X]u]J%0 4VN* a{cyH OwG!Is3Fna0Tˉ2O`:;t(wׂxM+Zt ^0ؑ^Zo>_<}JNt6Gm3!vAͩD4lN%I kۃSٷ/@hηT Av'Jbqa*vB 9 -/!x`v:?.4=uҊqySĔxP3u)b,p&#dOr's݉)b+񷭦bz0cTbCu$1En%GTI b##Ӄ=6H '- z# r:6IaK$[]/f  H>ǮN+8(?~;jOp귖>:f]+ r ScVx-6{i`-J Awi\O ;4  O;i;k]Vz` ]! QVG-8 A>@%3Ia[N9a܎qT &`2ú6Ft>՝֨%;5.:0䛡T63+)N))0!%-f=/}6i|%nI1zFno`Ve5 A0t#)/VR"+Nny 2UҡP&^n\ȃr=?eUw0+fneQa?NԐnoA/meO>Uw(:v뉟O/ӊ/8!f")':\胧~=:-?s SHW>ޢ'OyLO 9 k+AOLF)\2T>dT<_,&+}^{} C3?$cZYM' oyF+ ӹR н%q~;J8ǁ*<hWiD=jT':̽6%gcH+܋7@dL250( qH12ZTG+y圛5T#MLюAի;I@$b!:vdD.7g$fةV!YN \-/ЖB@@ɗK y `P{"ڀp[CbsV=Dss2FnIAdUO|+/8ny3h" 8M3jm'}li$4j龨oQi̵Tk| җB_oX`"e/EuP~{X<\O?[+JqhQ՚1m3{_tW.` Z.B"GƊLRgyeyCVm𗖇fV{w\$woj{F-fH'lRM_Ӕ#g?ExUw|NHf!j7~ ڔ5\EV2oGog%dqu3?kW@ѥ ˤ5^,ʳR'&]X; QB;=./?k&O}VZ[ӳZC6P?orW ;mKgMPhL.0Es S` }7eڴV*ɣ y6 3O=C )<3RѼk׹ys̃Ԁ}R'؜qY_R{jE~qK9:ʜ$ @۫$TiPEQHMRآ5 'CoX&T)h/Cys 0W-kͯШPKߨE2O?b Ԙ;17^5ojxռ)t06kM Y ^?K3/N,ϑVpi㌌~y_Pp-a=m}Pz}loh`}5ռhs4JQxv|rD ܎v%S,0;?aIع\cAQ+EgBΨB\Gϣ sïMtْMB8pxoIFWۯ6-n'⯣[sbHd sJb5s^6ꝶ_m]4&k0Y{t4%:QiQ;X=Z3dq~t`3h w)8?]4A'NQjҫ3&]Ak:Д )A΄k$&>`"azLSy%ׄ"v^I  h,4dYK xwIJ$*c`0_n=TOxM :k\z7>J,ZHF`pB4-9Jݼx(4bFjkز).}\jJʳ@5e0P{L & -E6+]-JM)ZA,R^]Z.JvG~oܡU*c-o3{\7}S[7aqgڵ).߾=Džmc}_nܬD)pw%?r-}Omy\ew ,{n)`<{Q`DEK*W_k򝫑>@ӣɣ+ᢴ :`s_li/x˽|}xe2RU7 gǣ3ٸ5̩.K6w}\ϓL;ayfEd@]ئD6Ο(zJwa%)YG-GIi҉Nv[feyOX*?L?yeMKOwUfpr8]KM9oAUEL")'" hgMwCq*8ȎMiDKaHY="X)/O/XN6Gj=ȏ}g$f'ψ8_YiCړg*?eW#iEИtTlB牽&9 c'PV]<kPxEaN!Ƭu?OM`c[kcsܖGߖbnޫ!y3m5Y(NYQ0D19 j$5T;pfin7*8=h""mp&B)Qsb8r.r2#T&!11Unnߑ*ٙwB?!U cZ1sg?~fw㎗Y$PyK[I046]Q'L]Ծ i}6snzh t4s4J1uC:/剐ٿ&:Ruw.Phw 
ud WpJ`@scF 8ۜ1[@ٮ*P^\]v=R9 !i"xZ4`E{3U&9 UWg`g&0!3 0X˶__:>OᬨҽvjX ,!b/| `C!-;@GRv1Ix8W*yDv[p8[pwuz 9Jl%X1GHtCT s]-ZR~fU$gU$}KʍTdCJ%vPs\;ht$A+)i&EJN]ǟ@Q}\lU'HztdB!$/4 QOML|hLYu>w}:E^  V|`qSX gR y=U/sF]ES߅dvp>j ,s~0vo7] al44e͘ ;ak6y&)vyL. qTG90g?޼/B^25cT3 g3}jwUNltUYYGr%,`8$U 3aSp[LU\86U9A.™>LGGҭ0: ^ZBpc鵠#DγP9:AhRKjKj llb:G>hu)uTs0$.*`)ה=9[`d8()\,&G1݀ib.|/wťԍ&٢pvV_L}Y]{jϟ.zv0)\"3&' j=6`]) ''Krb⎣p@OY{(",eB7V.?u`uղat( Iq-ʪeӁ_rcD` LHf\r.ei)X_Cܨm `Tw]B$S&l5;QO5;zwxxHjJ؞+xS CHP%Ҭާ! R|z- +sCf^f5) nIBTg9P}}(ˣVaI f٩#{~K_gcf6 3ZmWb'!s]3uJ3qoXHx7(G =YonVlB)e J2t. kw8KB&uY\g" m(*>* =L,TtZ Jٕ)O7~SMyiKvy#̊'(y,G1Me  k^C+}Q8k/>\7B<8X qVud\\=(3&r!prqfu ֘^ )G0GBrsF3LTn9%62XXĴ\(cHp^]f fh4$*%ʎQJ!BwrmΥ[Ƥ¬L51M- 9<(e@ ءM͉BK!k!$uUjE;B:ZH80G'  ܖ^5]>7vf…Iw ; [ֽe`5lߴ6k 䄲 c4'gǨRsE鹶\% [s=/9ז/zpvZe6) ۻXQQS(al>M\>;h)GeF#GF{.FX ڭ$5Guu)885! LބY.NNue5`S&]LG%XGxF|-xmQ5Ikzt-Gڷ$fBv9,+e)ϝ+~dE+D.]͝Wo.\2Juyk:p}79ƖBbe /%.NsP7掳֟w\i'vO. ?x1 c3<D H! Vk cPBOa3[UUXft3K҅\!\M'KU &ϚJl4y]o0gxL$&f+NayGfzm_ 3!`tzME;Yw']?,'䲷ZMLQ -eeӯA6ሙ B _(nP:iŲF=kE@2eM?WHTiﵕ ĆFk'~p-]UGrh6i U( TݫMS-FVw9ݯ7{:Iqc;0đLfrsxvʝ%ZHMTe$bpy6߮fІ>QWΖJx:ߑT&逿A\t͠!.|E!U+e X6t[ -<8+,Şzutէ D 3NMB(ź -.,z 2L[J1 %B~K%;4A!4ky mbJtw{tׄ=t{z;]>49y5un:IiZ[STχ8'K'#%pTXI6k1LKI,Q LrU(+m!RbmS;̇? !8TY<ɺ$6p:w׫{;ˇfO V`w~HIcj>D |YG`l'~pH."_T]- V:#4dB& QpV[q{ ?x~,9d7IQێގR!7>GPr4;Yiڤ֡j@H'+|5H!i"UiHkEujl^ mEJD6 qr,b )b@",aٚmG4CWe1;TJ/o_c Ҫ)"32ic^%FJ\8ϭ# -LXsAC\K! X$DL197p9h{΄OFT2*!.RV|˶k, t+-r yGk< s8S=O"{B cу>s+U$FyaX:+Z~k<ևw  ]YeA|k[I$<d`~äCDqL]Qhx0 DlaVsԋdy?E0'6A C- a0`+A!@@-*YCp:47D=5`XS1MCC !1 Z"#HU{.hc{$zTU,` v2I%(.Rpb61uBiU8'鑿LO%8?C{HI#CZȇUM|; G?afDDHQ=?}{d:[+? (G-߹2w +"; ~u)>~{tTt8luFeg-DIId=8x:sΉlYk8cveLazǎk#(9[I( S#lЎV3Q)Ǝ>┎S{z9!Eam"zHq lN4x,MK6Uy?ڞOWTM%#1$OnpбSTnz.n HO8[ŻsMᵙ"вR[61Ud6f[;D"׳P?c:)LHa3k}ӊې  bM@Icr:У{,J!1'uƀD^37A\P(XtSzZF|:KZR F(,4Պ9Js) WpiT(hݧe\*|jJ2$停D&H9+ḑ-G(@V* <["aZz@:f2警SfzR4#N(DE@#D8QDu{8Q^r-,3eh;膀QoFJ 0Pdi 9?Ry1ϰt3!<}T[AłQ )*V.0!Yy l{1sM7oXVk"⼋s!(H$<-%\Rk|ߗ|GvRɦ2u`b[ ?3yMUG*x? 
y?7gϣ2Z iLq{_nnVv}>fq4ocwsuO@UՅfȹz2N9[+z]h3X~$->|&S;iAшi10 m}  -3DQ)ⅱg鏥x!Ff# "H qyIU'+νfy@BTj$\mxЗocF(5sf >}u-{lkhrD5}Ñ葴ԤKq͘t Cp[Q:8F bWVJn3ՔPVYiQM=4&Dm$ O)$oW^O" i6TSE+'[7*0!|5]`]HR%Z57ra%8|#L5 !ZH]ԀX) q*0/!`2NDi8+ `aВ!!b%զMIƥ> 1C4Uŏ6lJ5՝M*wRf,Z\Jqq9s:(҅]ξKFW^+[EG{Sk >]|msӀަmo= fM24)⒲ 84ާ׏R`=.`5W{s*sUX9u$!\Dgrvޚvw@iGv8xJ[j.$䕋h+$yݪ.Inm1}4n$J7zڭ y""S*S96b~k8wWI &IvrwY?v-KφwfvW3ys5o]I;;7W3ϙx!qe(<%$l8LɹF\ + .ݜQ"*uqU\=jjD@0G-`BE'` ZxB Aנ@CMNk}ԩmeV#b*!,yWn7ifgJ)T' J_iæ ftu3!XO v HԄ7v|7՝T ٯ|_(6kF:5_fK_~~6)vxksjCY!_@ [>4:hP)݋)7C1:FwpaeLteL̟^R _꬇sc&dt3ф=[-Yys[14Sx`̛?9aޙLUb7kVEnŀg_Al6ɶt[4 ߗI}kY=#YE.i#vv;ئǥyRLj":5zjlpz!]w|)Z =,‘X\ҾZ:ލeLZaӥrJTK*F#Dit'Ÿodmt1XzDzݞJ80cAGBx1vF.„ F&>0P{ XЧh"{!DՊRAEmA%1)2#J-=X'|VLb?垝L(g9 H.Jֹ^I3׳q\1y 'BR\Q,#sjl`=\_zp#~|1?,/g0`):f~2`׹phޟ-o ?!P]<JƍϘ'5+VjB v\,R|k/&|̓95çVA¢\A $a@dg#*tJ{ ݾ+ znǚKus91b+eX),F4f v#D)͉/:d3 -1@Kֶ6/.̥_Com)izhH:̛Pyw6ȰRiai*avqw\n%&)q U{jǠ\uA&CσRϦKvk AEcZYT68ttl!+1Q!N)٠X߳d}M ,MD:|t-W%U$1;U4^Ï?c?qeac~__0[K9K2Bō5_'qc棧E+oMg52Dy/n͋";o}ӫ/7b5a%-PҎ_O'5a݆ qGHm} BкM2p!HQu*Xlg\~168 (;MK,R4rH#6RsjҺV5A⻇31YTa^sUm ݚADe4ENamS;NֳKe~[7-,W$dՖd7= x/n4ϑWmGF3Uui$g|Z c߸8?jQ⮌ŋߟ׺p!%-!Y$%p+>]0[.L<)~6M-bcUHjV+Ц\8"M'-3A'5sg@ p)^'ptϖ$fJ!sj#=ԗ>.pC3=k ҁ+BQ,`F]S ֭z;V٧j|v'_)@ft6 ʶH=IV|xJgo; $IXȟ~3m{5ᜌNٝޓu2٢><7hɻ֓˒Ԭ3 bLlN(%*vr@BMg#!"+RseQ*?~z>#,\z+:#g9w LoQWͶ_)EGq]qtQG'(U[Ot񼍲JP)%Zt)20"Zk$y;d/zw, _)ߖg~5~] bZmu4C*dC4AV)uNڳ0R*JkUUQ%8AyNI¨ FZ@CA~`^ԀǤ% 8z&(kTVZw}d>OBҳJ]>mJ 3y *-HaH4T4C˔<I7d$c37/Q"z}}J]H>-* ,$wDž>}s$>Ӈ(Ok7,G~ ~~7!nQ~Go~8a:[rJY5]q*Ͷ]fJykU Z*ۈBƍP#>/jDxԤQtGd[I 5DJ9 $3A"u29f1J܄Qcp1ݽOQALz^&e:8.:'֬Jnp6#r=o^Ӥ3=#{hgW|OW[8]'?N\r;ޭ;4BM֊uwѿ[8UlN]Ozֺ7d[sԉɄp ȵL9"ȍy2M1Ewi~ }n@_hlvmFTN3İńD#cL)__m>\`Xeg㗭Pr.E2~Rv*gX-R!&+Ix <+p&X<){^(ej U<=qCD BÜGncZ+Jda{#b"./[`0-86j4x;'PY`>)=SA~/2uxOz쀴v&Iwn_"*: L,d\ʞ}E>Vq"v.bgwz,qIj%)s^s8)NyÒ (+@avr>Y+ MRLV(1 jIU=l1NJS%續%e98]V tQFY2n Ĵ8'"LFLJ\n÷5q MbL7b E(&вsFMFTo i&os 9 r-ƈ4zc|2jw?xHv^䲟dj<ܜ%& f r-߻eVQC? 
zV5n5zʏR4 hvNՌ -t 0 2~+ Wm5 h٭h`-@]^U&Ъߝa(k]\IuYS4/ 0V{)ۊ@0\aFQX QI i/ʜ~4*VwX|Z;i A֪{Ѕro- 5ZOOwE (w^SSM%=8Iq䢔E1}x9AW53f E>$TLd(@9=zg1PؾGGu8lE}!w C)-(t;vwWmoFE쏨(O(=J# i R-?e˔>ةavItDj䄄?MVUt/tUiD)S CxAU &zҦ;VWI)Ҽoyӈ =]D/\9eG1q52mG/#}Io\D7ƗMXQ7lH[ hx,k&Hˈf&ODd_hz?d2%ȫ<@I:2N;g;O[| )my2 eO}l (N[7O>m<9;L+ d,ߚ^UHx)~ҧ{X4%$]lTK=~%j{9$QQɧ=Gu.E}C҅1DJECN !gH*mUgk•`+/!X D(m=}m\N"S5A\XM$zgm\RyHs M!i*86CN>ZJ:R ^2Ή#LQ8g]i]qKCO%2d/>9Ua0r$%cZz'0MecZR:bCDNmN@^!g#f:=G>?NߟǛUZWNДL8g"H*bqRq52p)nU,tiA,hؑ3 v'A/H69rЭz==IyCL27/b0b`b"CGnX}+nqz{KDdhǧG|e¥_n in=DAB}Go~8t$`?Wi1>#k3mR?spplBDh/>y3Ac R/lO}qC* I[NBi &*&wP? *ZvnutgVJ: +_QcœarPʂ CQuI|loL)GoltT~oL˫FgDo=.|@$;s?LKuX a,o͌ZM_[zS<@X'3~ @22Ǔt%V FWQ#8eU XJXh휫x2b*<;!6AO-cZpm̌U]b$*Uމ&E6JPjboC1 (k΄z×DFP BXQxտ!ʟy}Il?`RTQlNb<VrE,Ph4hJ鴭k,*][7+B,3%w~ ' j+VJj;A%uwR*ҴvJ;\RmMlV9."GRK (ɨ&ZMĀX*R-BpiJJI"'"3aeIK˘D;Çʨ̈́FUT I 9AyA+K%^Dܗ8{8>7Ƣ$()+=p|Ƣk6ZGjoc$JsI߽=n`%i%63]tsUST!}Q!RTS8 ^ٰ|0:mZdtj@$$KQBjn!&_^O7WļZw*^V)!5Z8B"BQJ/έ,lGj8{sU`3$䶃/n仙T;?i$djG_N6~CԤB+cw |LWf L-.-Ppj֙p(Fs.eFꤟ[r-Z`t47HE/E~ Hi%&+yZEcMpdUAtT/tiޞ [3rrBC p)K7:Æ*%TC %GaD<~\_aDjyެ.\#ׂa I 2eMC L2&3rSǵ*q/< ,UYoK|Xw76xE{>ɣ*{Q0{&~)|BͻOW9gdwoJ[gMny1?_}nai+ ;+b9 7R IkfՕ]> ??+VwT׶dO٠nܲ@ʈhIC^6):9ywvc4I'ݚb:M3r^;zvk[UhR9#}1S{}vU&+~*-b傫Kߌ׵@1:(ºACNi/z=RgCk:n(p]3PkrBua*_NiN}`V|xXO.]Q_#y;oƣfm9>Nn&grѪNOzlR*í@I!L"ʼn[DCϸgL %N[1N>wzwђF}Zv'BhXlP hۑg &( |w9?ڿS-&& ZZ6 pc4:L{r$?w!ij FHǔPq!J |y6&Tc3!*J1iʜAI'pҌI%"?Kl jԑDTz%ZwTju-dv̝S=K]oGdd_zp}gfL1zBRO_K?o6|&&03VHJ]Owov㏟r͖ʽ~ k݈q,{(?5ahDF1id R1y*xaqvjK|0/"=l<@D,or-A M/tS|xcaԥ2ş?L!I*Q@Wd֠Gor5-`tTpW$Bz$)0gGzFRh`?&dYfڦcV9:Vn|fyPD}E!?ճ䆡ɥFIb+˞ϛ6qtWnbGJ3&=#Xn~c߸k -gL( h&"%kIItɂ6tWQjr/QWy_"83j\9U`G˞Gcn6ߨSlGQqٍdB,.AqWK*@σϺٛv`;>\;Ļ\rCJ"xoP|-]wY:n1^/UKh_VRAa2ܗ#PNlV&xQ j1u10Jvd\ \vvt9*ǝTfh\~Pʼ!mJ[*(rzy%u֍ʋW eso:ޗ5Nx8# 2g )RJ<"VD4<%Ec5NJ@ųpZqDr(cFg 'u9T[2qmJKւbƒLm݋^chmi}ZfbPHN(N)d=T12?#$\ BE69I.ӂiS *LS?XSSZiɅaR9)` FLfq;B@?# y]`nkjiÑL1czB&1 Q8S?_L\<LLaڇ&هO7˄<Wyo.ɭ}gJ$b?^jXiW~s?IgO;SX~^bsF>G@hSGo8Gc+WL?n =cT"E^G>-m\gQRC_W ;;:"I/7SEo 9,OIM;GkNҴ&=_hHSeB\Xb̯DUg MFZvz&'θo<^7WjWU\51 
gB:oa#Tkϟ,k"Ai{c561a"X8 4_8֚7[q$~֓ʝgNųz:dAӻUx_(: 7 a;(wEbcs烉<:vϝ! R(r=;v0%`Bv2_9DƹB~7 {UU='e1/*:rWbRܼe:PљWeM>Щx#=e_>sve \`,B^k /]brr5Ur c2 Y1Cɝ4 RrcorRs[JR(1W~#{J]@}''F\ʶAb:Iӏ^Ky$Gw:uQ-]X&! , i)i:@jƜ69H+\`:gxO9GinA5S0 y(+޼2s ~,?&qGkA'ksL<ȒؔD9g8\wB{2 #BDʔ0*RBg,ņ"x\vYTcʯ%@g=XL)cd eY}f(E2P"jL֌o`ԌoY>X9-f?DJP @;H?>I7l$#/sIFr_f卨dF S\ NB/ m3n8AHkw5DH>Eo 9ټ%hhF߆Fҍ.̖ަfm${=HEsX+rHC,p  w3P,1 AƸv'm䷰+X1TZILF*qfzׂz>7÷;8J2,RAJ YCO)8u]:L]a.N]H)X+AΥ e(C"(K3d h:jQHob6Am-1kXL͘,Y%Bc'@~V1y8_sðXfX1bȯZa[xB9L)-)ĦVQG{-㔟B،[K̟yMmbxv,B KcxCyƌayqGo+j' V@.d# 햇0EGE|hHWvkt>ĺ/Pi{3O_f_(K2䬪~@0mO҅Fk ;(hܩ Z%+i9hN}?oy|xXO.];ᯑ<]7wٶ D>Nn&%H\yj>'B/T0U|ϻhvg}<lQ/?_ k>vN(U'zW*욭66 wq'TfTP5h<['.Onݗ?7Uɉ!{ 8dԒdibJ%]ݴ .%`w\x2;۶2a{6->э 2nD b<7eu}]5JTfb]5=kF:`h ,6(!DӲTPꍎ ׯ^հd˪ADI!a Zפ5k%t3)XǦjo 8i!,԰|/GW@2"PSug_8LcHo!> U8rFrGAOE;KG *wub^%w#Œzmns|;j9Xӫ}kjaإc)c)C iCԤ,vy=!'ǭ59wm>U$bd$.dm<;iBxțL|z]2#SJe]~o ;(G p0{tVJFu# %|lЁ s?X;H.LtӐP1,%C)ZWW̖ U!Wut 9k?! eHZseQXJոSWduvGYᰤPP biIPY.Ɔa@!c5m^MC̤քo?c03m hnj[qwQCDVFMqM\aR7QN^-Վ܈no&-fZ/CNh uz/oc;Do~[skr}gͯ|aOod9@PUurڥO:d͢E2$]4ƧH+XGMTZb1q2Bzg-LJzݢ+ۻ?SaB?,wbM?ǜ;}wmJ=eon_=8qwC囋W_٫ժWn굟jUfgB9VCw>ia7±L1(Avcޱ!4jmL̡,J`1JPS1!U41 4[w6)\:"+%1F SJAUL2@K`ISTEXu4Zo@AФ}~@/1f>e ̸Pղ5 aY \Uc˚*XQD8}iOylJ:)@ V{4 Ae ͥ}-)VZ B^xtc`" (s|HJ1؝R盭xcǷð%=m1CN\5<;2AZy ݑqf#yAC t:Gk]xs:#Da$Ӧ XVX~Vq^wʬ>H2-({@`l.4`usK'On3R] b022F39?9=jXQOVRa=!Vz*yR6%=qxdy_xC=t$_۷on2\Bu=C+=੉_ ;foº"1nQ!4n2q3cד~=|~mڽNQl( P)k/_~^)8P vnUET ,: P'b˧}lD,} b'};n> &OGj9)y)F"ќ'AL@ϲ漮/~u|p??^;)`QStdm\QF7z#z C2q33a24-#q;Q`aF3$&=k. 6X,aF vc^#rFW#k#}$0J]bLi* "XR D%悮&q;@>@+ Z16JC056@2\q*X2%P )15X yDb Ѽr 4l%. 
:YU QCQ%%.jm8Lɜ2 XU(H\n/olV7D "f iXI4VVI6zARif H70x}к#EHEbs~Կfu#HZ=:󯋝VTEnv7b "cVv֎KODn(!tyj2NAIk¯;m+mJĵ$I]J=&>{oVQSpHv o]ӽی$J=rzU:~{f(x!+_: NLUBhru3 8!Ind9ө8GP!a(<w N`mruyk{4vBp-vn]TȨwg]l)tWzog.y53\U// !T*"nyCV骴=N>b` z Zt Tv%}vNM*fU8dhB[+7G֞)GF˜ ,c?~|C`,Φwǔu!=Ϟ3, xK4xR9N y ]7W@g'8/A7`ƀ!B2'&a$\n3ظ6JT 5ŕԐTuU+ ֕)U*0@h-fIs?atn{F{$!VHhۋc&rBPCQ HUJe4P"#P9__OfwMG+Nb z=q7N[찪'),2'K*?Wɕ.|?5ZyDV Uk!,k4(2J]rFC,i h07S)@,R۳c64v0P#bÈڶRk` *(hEj"- "T]2N,lRj} I%* r7YJ;@ eD* EkRmU~XJY z&nA'aXFP7Kk,?}{Ey}l߯n䢼 $7]~w6_ڎo.짰hLxi纜/_׾߹[a}6[# oOn2` av~~;^J p~qH04q!(ˉpc>HR'p)CR `*ӡMÿv t7si n8"$u.7a ! Gtwr8Tx=X׶?D <={ {' 26~Enn}SM%L%OWBNȗ3b0Q,`6<#ְ>GLj-jc;i(&NǞ}OwoO ajcr}8v`tیDPfM'[d'̲8 Yd'[흹]F"1=$qǣsÆ)` E8hap>,MذSIB˹`Imݬ'8y^7`C <靅%Z ̺R.ѵb4}p>.l$ӷr*iuqms Ki+Z.AG˴ GAN[ 2l )3 ZN$|go8tȸS 3~Ч극D97ԅ,+HE!Q%(y2\!:j)j<䈒=j9_O/jzd`LKO rd8%w_z8+nĔl9yNjO !ᴒn1SBe_o*@vф L$d!+G ! /6rL p2v[TQ5nA )!ZPXP5}qy^{H y.Erm?g=:p'rj^L&A[|/x^w1z"`$9)|qwv~GOT9Oh8?oט>9+Gϧެ I0}^E\ E݉7'< If?T3+R[O6 ~r6d$Fa%ZX =͋~-ԱȟǷ[ѤA|iI^N@&xSƖ88[@ŪaӷHg&s*ɠ9^*K;ڔVv7^WsJY QՀpƶ dt] "H݇@ M(?m#a\$À?sڈF8"KIMv~I &$Ey{Dէ UJH$;'6tqGA>C 0;dLHT&sXQF  Gdž 8%, =&U EJXEpR-&s([#wX2dJrł&1*$X!BZB9/1q6A*$,^pEd $\ I] gJc7\RYA ?_^-usrގ˫`ƋЄ&Dp|Ԉm Rja >V\H "a#K2Š RRiate`:zqͅnul!Ůo ^RZV͔0&SG{BET{)ܥxXc\ujAn.K" Kvv"u)05ɍVJzb㶛.0*oOqKw3X/bE81z_*[n`!T?܄W#|ޕ֮"]CLJJs.0 Mx[%Ş8RQ`Hb~Y~޽Pn3:ZBluA68 bcffMnwy3_ЩD}8T`rV\\)J#po[Qޗ뛗h"[=vtƔB~nM5 "N᫦b}e]nk9OH;]عb" W%k¡H[;l:G2靼ohCo wfU@(n\5QH0ݺku;qu](VhۣiuK;wy@ij]~{sg=L^*Y`vrL~|1`F@󉗦yp@).m5p$ 9^xM&uഡb'XXBTR>2]\ Ѽ#).Û|js5rKwt»Y5 U}2Cw8% )=q2,. 
Ya[E0PXrM}`D`3Y1ׯ۲͆M?|2s[#f=ŒTv𛕓 oO‡OSAZuW~zGOY5aKx.I= Aѝ[!eOruFlqHIxT*G<)Mt7.y:)L#d?˞m̓cf'q/GrcN^({㚔`YRn蜏\+lr̻~vW3~d7s'n\A-A.M8*.*V6gkaÎ %-Z]b)em%{bEL“sڦ?D.M U(1K+U_ݭz}ʢ0~De>lG̑oFf3O\t}tm>M~0.2//W/IѾ{]Eqؠ~ "*uD>5"bо>ZR| (|} q?@/ڦ>4oW/wSR, n!< l(ByS( Z1VYfT N3d}Tβ[rCf&"䙇hJLiHAvKŠJ>NvЧmtPЪ٭y!S'nՎPΎ!Y^I%OY''bPI|F:tX0uV''okZ'7!;n0}Afɳ4^^|6,}㢒W7#-QÍF˸Uxٝ'RI˃pNaTFB W:GHL*켕:0LdSWD_+g0 @8.ɠ1L"$yVP$Dtgd-ljlW(&G^^*oۨ1*ǃ y]u!2Nܟ"+o˒Enַ6'-zꁿ/ >sۇo }ߐ7>*]0o 8ο0^^Dao(+(f28ݏPO03@E/]JTt \ fVJӷ^K$D`axȋɷUo㖛FQl7^Y1X5(P@X'Of~efmNЗ ?>RH6'/R2#x1mmū޸<`t:PZ?FyHfp 6 1-)w÷љ4!_0/BDJGlf`U.z1{qs22Î+@΢ ;RYT2#X?AA_m{7%FG# Dl3*5_Iaz]-e͍kel!dvU:u8Do/G7ē@{ cVʻ^Uc^UQZvW$!ȋeCW}2w0~iAeXl8 ~> y l8|?}ka} A^O=j Py{+\}0]"^,vW׃&u;vq=x7Mقw6{2L(ѺGNW]ֹמy4xusae&^NjDiC";Cㆌ[=T򓲜,N%*9vL#gg;V !HTA~BrWٔJEoԕc~A3uU^줒7v<|VXM x-F `^::F6>F5e(Ef ZVW'R9݄;HQ|r/z,d,fΛKyޒ;۫Ԣ۷q$R5N\6nkDL˼ "78# Q#Bǭ,`1gX!îu0Q *:N 3id ,YQ$P@#ld 9[CRnZDF¦F\#ݡ3h%1)I)͈eT/cFL`1\c2ae&6 FJa<!@(2o[ W| DFfU2s?^FzWyAnfvsT{G=ʻdZa jϏU_?VVxz :Y` p0],۩(BãrT ʿ_#0f4]=Xjak Wat/CAZ#cLaJy2:PD4:C̴#[l1d:^|:pAB}q2L bHf QߟCj7xxʲ03lωPZ2KFkHImBT.KrudHn 2 +p_?CapY{#ZsU-O 4yX/ U nO8:ٜ<잓ׂ@h<iz8&L3]HNKGqH+~ 9j`8 $6 d#l MQGGM6IqveޝŮX˸oj!.Et7"T.̤Jf)v(10 ȁOj#[]d . 
"][Df`^OG& {^+R@ Sb61L'Hh s/Ja}/_ky(yE90??JdmhWa,fظ_ɮGL8a^pm:[?Fe|+bT<)$ul7lkM}-]6GE ^ *R@B<a7D!AqJ>5]Ơ!ݑ-6jq"2rUXJY豂`',#.v"9j5=yv)s(X;ܑ-7*-8w[}Z]lZZ=q^<ڻHB  tRo{W֮5D0?Ng0'x^8ahQАi\332=(!_WfG0з % 4x Z-ĠB%J<ػN$Yyn1^,uxevB1!m]DzMspoSHvSkluI+ozu3M=8 ®u+[qZV!@:ocZ+ Q$%e*hF8˴N3#Ȉa+iV0n"*ec5\j)$= 3#T5tRsAڥcc,@F,FBJ䯯8e*P_EQv4WU;RⷿvG+mw=@vj˸TO.?-x/b\.0|]Ʒg=̌K>1\y :'o=El[}ܮ_iм(/$%"yxu{];+]o\vm|٧vu^%yX|lgkJ>{|Ac!_FO:w tbۨݮB!ͻŗ nmXWn16J'g:- 41dwBirdKO+$vj)%^b7v D\aԁņp)IR:\ȄZLDvt KLț^c97x7d,u #+ڽ/8@3<%C% gHՆF <\[Y/  ]wC) -T1pEـU73%b4_.~ښK~}}Jw'uW4}ŠcuwE;OMk)A/()0.'y[/sj#L8<M?q:ZK+G#XoK;sʷ l _3@D~KӮ\Wk0uO~ۢ(r0_~EėowjGk0QiЫswq#A&dٶ[-,AdXȷECѶ?h3%p?E((`XRVJ dwΞal){`a=煝ܸeWk9M?I;zzz-x |4wyfMJ4 ,Ȕy+0"-a*eK}~ʱߏ#$б ݏ0Pz_?O&f8RA!ėFbuRDA%&2[@TX%`jQ%.@Ze)#Cp~Oܗ'h@_h(/2H4Q")B(r4CT4̈V"*(8C 5;,lz*Pr5oO'[ 1`9)wۆ|ۆ= .opslϨknfLH)DSt?'v4A[,;ސΡל;-DɟH >V1BS5N_68kZwEsHG)&H^5bSL^JoIɝAM/q PV{|a0}xWX⡮,^u}]LXs6V+ܱgBR2yb@2@gD^SNH(@ }kG":3T!MakǸ@J,D}S @fSɑzXkqE-Z15vSKGkSG<^ vt`V\M/O\>v{ \/Cg %P8TЌʌA'JpEH R &f<{s9fzIpBT11, 㻿~= 'yzc}R~;5ASQ7 vwǒ!^l<3<y}0KFK8Oӏ@7?'|͠WO%jo\"0` gTUb勬+FQ*A*iޅ3*)MRV@CRRtZqR J+\ci/?)9Z^h>\)8_mŬ'&Vɨ pK7k8@Kވ &TT+I3:M?,9QJ~m}L-C Hd8~c[(6pH/NVNfX63YA24Le,%K` 8:GD;o"Q1Sc-"L"!,0&K7NB-U*_^< % 'ɭ7憝o>; YV- !aغaHP0l]'kj> 9QEEoH)1LXj J ɳ<C˥}qXfif~\<@,()?͋պMN`7>YZW_ μ@{ xN6?9}|#O8FB c#_K$2*41儼 ju >UٌO.Ne(ĴQj7hWz{jif܇jt@:2NZA8YB<@g鳶QM>|XRyגN3]ΨPF P:G&E~̯,cOXE˫0n4죨!s^u6$&wl7&cÌ[ N9Is JNm?n KEoO9RRt<{P*¸PRcR M^/\OIG! 4|QiM@B$6ŌA P R7e(, ,1F/u ee*6u0VΠu.w‡ h|ةSce[rfY GܺpT" f*{Rg0UC.!}W9?1v!1[98Pg04Y^NR[e{Rg'ٗl悱HVN?IICr jNʵ^KLFSTe,T@fQ! A h֩e'4 zDgnpAYt#Z[ohxdGƩE;(SJ#aљ2ӌ1fLۅ!24"F&O\B 3h"!Бx"C} -2^@$h!tw4wX{[+N/r]iv?{^?|u}*w)q(bu?ݗE˧Y, Wa&tgOs߇}XP %0\yy߮ӟG [[+y;B w纮 6qJS)ZWվUT^l?SOPvDŽvǂ-bZrAc{c\/a#$ !aBn5a Ow[8`QiVJ%7Own=  V[)0jNa1c4폯]VpW$_z_GrPBrcrreY. 
zm[U!p;xtaCv3ϱK/櫏!9&D.>07NGJ 9=yT"SȎ e7խ}ۑ^郱dZb :+3];vGwlFT0G`";?,1,G%Isw"3%Ŋ O޹ZbIqKz۴r1mĨ۬ijB sfH$'tn h.ӘSʍȄ.\UǃOd5Gz?"ʋ&;\Ob2 ?:F8QBtZJ1.bZw WSXD;m/\T$ılmsJXTK>ན6b1J>(RJ i]n!\|s" dxQ"0~yzT=PsͩZb5ĩsEN)ҽ*&I?6ykHJ񹹚W̾ZM ](`|x̕ bL7 .s<ѱбTz~|kc"=Qj2ˮs}{/Y/\LUpY$ NDZ (hhKHats0_%b-Rc'NBcng#25t9g+ ڙ>N[Ir ]6LCN2 4SGN<_^Ĕc4J խbf57,2`r{;חLxߕ vRnrբs8B%l8BMQphLȝ'|H}˨_w{:*t4 ,T!5.]=~4]Q'auAt6ߦ5dDgf:Q0'nXTiw,]#]> )kaO}:x\ޯ~;~J9?^[O2v~ߖU(5S;@!2RoGGdSUS&Wܹxsq6yW>`խh#< okD}TNUFd]qUݺ< C2ŠTNj:p,@[\v3hOs_pFze/$$vgmB6 OGY7 kl2+Ƙj-rZn`yp:.|x֖~.aB,#4u4E`UTKLV^59*GJMMTxQ|0󋚲HaQCgz8ҹ| Ֆ=wqpGQY~vG1 HL`e v਩J5ށ3:Va%UE"zhepl9ddO>RX!+g3 ;k쵂[vea6]WN$T(M /% shSWN^9EzW4U 3~si#EhI )b3eN&V ݾ:EAC_^{K\FF Ff5hHʄ !q"2;,%2Q4)X1єwlm1= S31Or97;h$e*G) O]_IBHlDF0F\j) T`Zw]|YФ22ʀ4T %0VSeuQ2P'*!Cuj0qcu'&Tڃlj$h+cvB f2][KH9@bJ"6סq5/6} onv>>D\W%_n>Y8ֆ}r3롽b$i$jHKǍa-A16f&&JdԉH)#$Ψ"ֈ4KI%7=SeiJVGa<{t7߯.\c3;bq"ĵUEd mV8}7İXkGRDĪX r"Q, KuՂF@UʚtlWxj/?Vva_w>}INg9]C1wOxN,/rX?N! y㟈oӏ/Qa2qq٘:_O;7r;w[~x\,着E J0_=}'.%^^LظUx‡ _| mΞx8d<૙;,9:w^p<|(T 9;x 5iL 0E 1>Z}?%píE%zH7mxjy 5aRV ,Ϩ4(RȠ˂vNd /JD@EK7r^>i//-ɾ\wv]mvR`qR%>Y64&z}ZG*%Zj<I92KSP 4I#k,.Fj64JiJ5| M,&6)aY&#8*:Aqdd$BD G  zڇ) AOxSECD2=@q[(>S'U[v=zڭrS4hNژR+6>Jp1;rsWӞ0 +A*Ka7oz;[sURh{gep?ٹfmIdnv^ݚۍU>/tOz돋[X!! tOEą<{ C4@N4.98IMxn KwmgPpaEJ$@tGD|g/6λd?9Fj Ubx, ku[ O%Uŷ]|s ^ͯsobEC>N$ZbBZ vZ= )m^)itTBN)B=%6Y<7x☻ WK ٕTHEB9I0]nSQwTbH)jbkU)g WyUZNJcӠFPl:v=l0z%smIfFǔ$U " yO{S C-E˩(cR5rtQq";\rW-bEMӿN_d/_U v3GFؠk{qh(a AA=`Q>z%`E^kK)}goK>ivӃڮe$8S#@uдy /@dE*e (GF{-~fv"C4=';2 h gj2j jre%\wmH~ A˼,>|&dK3v֖KN2_R֝Hh2Q]5$X9I) }CY ~7w;ҏT+2vM֟Zp|ʷ_cOQlv*7SIMTnʼ8~,LRZRZ[B0 PK岌ex,>)j!]f^U]8? Ăt0tPse1t \g[bQkpȈc~8+TjIW90{ppYBj|% 5U,$+1 ^ᴐd Cfcl8.ӟ)H8zLRd> G3? -hPO_.[r ˎ.hȩYPI5qTIAo=ȩIVИ;qPC$$ '$w*ؔ';_GOPb4#֕LRQIn䌖6(DC KCYݾV횒1vrՄ*}9 rՃ ^Gݱ,vej r'2cC"sK]2rN<6\f݃yb}/W3_IBQdM:9y72o1hًq.yb3z;rk-k Bgf=3;3鹖_spnf-eƨ(\`穝j;Dlm$UeHm6y4<Gqd<-7ҵX-!pA[Ч|XOk&HtS8vj+ub%8Ȗ;uZj{^ ͩx-\{'F| (_+1R8FQ`8F=5 : S؞ooTPQ5}J[7m-n7i"Cl,jWZ`ɸrJ]KO)wwIϩ=[)s9x|M@V3VV5L/^z6ȶQ-6tB(SƄˠ˕ Ե%ύOfi^"3}P? 
ݳr+J"(Fi`b{$BXVRͬFqS5DEZfL3PH-_Qɛ!nS†+*"3YCS}Y)zZٮOv(Y.4Qxk`iGat'.Qߖ[=9'j>fkZuT=5T0:>PCb:=^+ְ x,ϼ p4BivoɷvDpG9S ;=7|,G q$=uj:0$8Dd?0Luױ':k@瀶\9Lk>EYi;ҍJW.Z#zrW XvD&zZqn4}"{2?Ky2iUi7zX C"f:݉b!F q*gjHPC^A pm4b)Жp3()Jh4YTzW PE;xg( w"g#") )1L(ʑ0O7T tʁ&.YjuZDX 5&Xр^j>B" 2W.gg\t,w5Ɛ6%NȕvT0J$osܚׂWB4J[/&F[aN`yM\T:,zΎiFȀٛ&ysSFkX[Pr`5p=Zq D%fs 7J-[ВQ= (+Ԧ0o40c05Or9[rtq- \aTGٌΞy7F5ZM=i8ՁBTrgN:Gu)81=a ΄D) ȉl<_ bQճ:ez[^G Idmga8?/}5~5XNӘYm֗}^u*9UpdU+'HVF\:AvqA}X.!nއ7[!]sSU$:Fc;*]at/d f#NuMB?,t6LI_QBռT,\A$rB xz{0%Mq%ՄfB.3Tp,^ >͉-fdM5ZH ) ~7w;gҏDLezi2~8sۯ-Ǔ2DlvK7[IMnҼ8~, ʈ R8FB:<'ˤjP3pV $O>_?b~J鋮AebLj2}`r[V^vp|?x!*i;"9{I=BQ",9gڂmjO$XkVXl>:3xF9AkC%/,v)I4&n i򚠇Roq}=(-H'LTy;[˴S5ɥ'MA bN O"*<6VFIlCNFVPXJY Q c SaJaSL+[@/ lER/7_ul>E[馲j꛻Z6hC2Tp8t*tɳIgZ!ə>t:9z_f5G|x?}k2o`EoDEǻ86OIAAXQQM?"x2:߿DZEӿB߹7ɷ9V?Fd:",%˃%Ə$!2cvo0+"(A'2Lw;&αlv5brb>:FH2WMJfһŶ|haNm_m4oWe1 :F@Ů[-f9>1B.iwSLJuheUfs7OߺMDk'=9?nd9=;h%A=F#Yft 1Q/fVVBs}2:H֚G{m }SHTx\kjnRT*OQQp&iա<7Ι:G5MGBl&& E~I{~<j tI1׮I8{_e =dR [92I"o5G-N 0)8JJb_]F_i .Ѕֿ+2w8ottcnVZqq)j)j ^#Ѩ~I}]2uA`W&Vi<sg䱨7~{G2af-8;Hӣj[m<ATvy<0!)P~4T*ra҄5OLad$B%+ _}&M{7U߫ Zk'ȼ%"!ZcHH"AKKq6d 6}Q#a¦/ۉ@c+Nx;'(|5En7fbC<uXaKFL6tPZ92mrCޕFn$R`wRGztnA!Gl]VddUIKd,-۰%őIC3l-QUvqWwtsߌb^0\{C1;T3r/))Sjd`O/i ʄ Q֜P2LSmKE*ԊKU{A_*1T QMj!NJatT#F86R9OثK69d/PB{f-l$Jo?mzͨTbݖA|~zr}$r2O~8ЩtǛ΃a+`ܩBsiRi˹28uɛM.7ggLj1E(aXox,X![`b 9Y,Cs '^|R+sx7բ>mgޙ)\ j dyhov{6FPfx`[G+E ]osSz/ =3mg5Q`pW%W{IMxzٯ&ZIz8ׂy5(D'w3{y\[9>=[>)O4KIZcGƤu/6iy!MԈ7Kc̛y .>ilH7N~fle֎ ;Z)?#"G%Vb%5 Ckob^ĬTNlu ռǙ'sۧA*MմVҍrxvSޓ3:5@.FߎtGp){C-qQnBͨH-1RZ,P`\y$&8+V8̌V%s62AqKP)%{PGRﺌeyA5YVj7؆{}.^sI4*9P_"[qhn_UMBYtޡ)Z1Ol7rMpwgշ ʚOuBG*Y|^*x?y:$O.92k5c߶"b -}GvDx*`-{֭ hkb35_}JfQ^zvrsws,1W n!8?9{0wggiM nΖnxͅM̾Ӫңxg+܀O7_s$\CD75֟U0/6\D)jEy骤4o'|~Grk?Mmb1]rJ½UJ:q68i"/)ZnyVᑍce7(ۋ!f1 PD/ѭ{x2+DPbL/<L+/\[CKB)@%", G,J k0[(&yHw;Xc(?UTS B`g%JkE9LkJ6j(IAbWc6s6hj*OAY(E^K؈l0 JshMPGoLu%:^7ffh*`Ɓz aP&*t[fjKIk4᥇MMX#)J1W#hIv-~n0+8EV,NF gx4qןD_7N`L mY\u{C%%_wNvCttk@S >CLȾwW$ jEe.29`U\JJ4Y Oit &{ٯ(MT:($KW TVY ʘrEX(CrP3)/=6L݆i0w OzSfi $) 
HSe,S-Tp<.|x3yUT(ֻ[d9؅!1#Z,`?͖.XBB]1\+O(XJ;qX7qX7յSp"դ sJCT(T.gF LK@d}Qv澼L+_U: 1.K7TOaj_Vx:0x:0UO! :qh()$.aSG4%b092clɪ,}yk2}{CzO ?.4Ӻ _j^HD^XV11шs Ie@\`Nr/2Ys_%չ/ߐ(,ä́N\!Pk=i쵊Рh>c^aH@`gie Ƹ+ SqD(s,(v1v̔C ,xw`&c`;)(ƼtױK)LSc, I-v>x9S[`rC&SĎ/B:>|vG9fTj ^V"yrNƃ4[ F^3V*%!{q2H(N6-3k`M!Jb6ӫj`B%;sǤG.7|)77NHۙ+lqX*1l)월QLPOphx5,-zk7օ' CZ `^ lہѠO?S--LT+!Pe)(dϐ r! PA ymzup; YzM9wa˧'~qXZoV|ǝD%ߓV??=6~4軋8ADEޞ͗\p_M^=?B ;f6jivRf ܷ|$pm<)H<Zx)"!zG$#;>1'ꤵ$_$i[t8fo,ň \k@`|n*6+ w|Fa!ѦDٸ-?){YZ-r) YWH|IBQ)KTl ˺HrD'”=JM)fy@Rƕ 6p&FRr^FT:w$L\ ZlLs w"LI)uau#LJKrrHy-8[H  벓},Ӆj_p@.Jt(Z|U"ԢDW=pt%Oj@׀|p50m畆ﯯF;"uY@k#6lgL#В 'G?Fsjq{ wܽz]nE~xW[9Hcru:& ;(8|tqN6Y|?.c!ñՎژiK&=Ҧ@44Ub0#rqcjKsyGhǸ9nQΓRx4 [,Ldaf<<;r0璅ILK6dT(9sxzp3g h{.8G X>O n2LG/~?5ZAHvɓ i"httqzZ9Ul"Z9Fӳ&7H{]@Z\ԝ簶o M v`Bt 2c֘"51BLQB>-V{?fyCFe9ojlnOs5'e֪i}=a,p ػq5+jB & (T2rPL$kq`h `侧NQ>H/ow]+ XF݇U,48.w8д]ЀV> eVjNw Dd_Yb0[hƦv\!&8f-bѸRA HK#sl-0j^3I~QQi1,G~X l(?ShYVrZ8T 3pqoYƊ0RQෙNiϛ3o>rȮ1Ԝ\`<ڃ ZG'#P;-_8:+b2MbSe)IbϓticMv(.媱jE%M;t߁vb=p~u5\b%!vB+@G4`+- Cx/TT&@aJ OQ ]lc]F3דF J`uy,&&+3g+dSBh5_Gk{4{WɎ ob/UUcr{j'ݥ237HyFǀ6}>,܀;yB}9,?QG<0+,ر b4pC 5ToUo/-3v;xyaȎ;ZF~'QtUabMkcvP"w17߇Ue +,&lcO؁ G6bQv?zr!^\;u/ '7!!<< pu3]'0V@kvT@~SƀAG/^6C4(}axV~)ud3XCr }m P8}P p}Α)1 " 9*k&!]}RYPGkh`߄ٽN᱐N|kM_M rl&)G]ZBpׁ$-ʞ Bu9-'/LUWtґi`T}6ҟO9j7SCu-T˥-0*XZ4~K1_M1fSU'[y Xww~ZB9ocL c-|Mhd!>a)'Qa/SŞev{j;>YY0|`\LE^BIQWX| ߢy@k0Œq.s(M,+k/U8r†9|= ]r=g\{{!c|91{c B1韡g܍אּ*I;d;Ѹ%G^GEF |d0)M- G8fA&%pSԢ3 "\X3lC8uCCւ 4TRmY؁ rDmѧ e_} Z(v_eg܃"gcRWԶ GjJ Y1?~n嫴_ɿ~z.붌*iɟi?yi%!e(I" nZz A '7φςэ^;$p {H!H&>_\ =׵RA\x߲_ҏāwb2WߥI~-8Śb_9MpNS/ vp}P ;k%jZb-*r`= P{Po}~N5 o}vY`NR&]'z7u~MN5xl>dGlLl<%_xʯXݤEUSwPv׺h<X A'yPS6,ͷux) hg4/'-]yDСMX>1eeM (Gm=ȭw¼O?Z!؆9^PXY<|{sJ%yn{}vŃie9R(Z?M'OItyeqLI訔1:JNFWUBI.dGY]'euْ<1xcvTsp!R^Q& Ȏ}[vP*{, Pv(}M,C ]Upgm햐ԢώR[:Q&Qt,FlSIF(,9M/46_5% r5KH2VhUU\idA{ɿUV8$ZvS6ϖb dĦƁRw )!4\#D6l4`MVEu”epeIsT>e-T<.£v&(!elPKPU&aր^iSwW֓tA{ga|høw+F֕g?ޟdMjgcCQADڡ Uk[[^fAJ]P%?f167d!X~ޟ=Xu]1&<ǺU6ht*YO9%CAP j9ƺT}*ڲ%Vsr#AL-*@ ֺRCI)a2k)~{aN+(C`$ʖVт2Xيxv-لN=OZ2.6Ȋ8TcF)y9 
1"bg)Dbf#``ЇJDŽ6Dgs,7GEnK']HׯX/0ֱD``i*- _Og׌m*S`|d5Ob>=nms?_Dɒ黏?>uEI/py̷G^Og| :9Oe13O;~:6Z[zsy'іLB=8ֲy°ɪh3=5cbģ-D lcRtV7 r6eoGl8qZZ:iAbsAҒ; CMSko&DIgFyY>z9aU3 Ypk kH7N%/qQaHC7#*9sʯȐk-*w8ɪƾP ٴ^`KFW#0D :Śnдmn"O`80wX航*V h '0D+W(Q3j5x칕 {o3eٟO:^hE2dqJ)ɻ&_"m-RR•e>Ω:bt0uӷzN)KV6x6b'hU5`K8AV  m4Z[f;q CR Vq\7RU"$8ݰ[=yS?vyh\o.g ]DEϳ93Wk/8ir#+y?Y1nPFesYc0%3ze\A 56O͜YY~ǢqP |G .=Xϧnwo H䞸BZZˋA@2hkF,6̱<ښ3FpkL:@NBJ g{Q"mHm# 6#N£`ߢ~F,^/*+[3]ys3~?T3ץVg0@ddW_ԉH_'=t{@|:|zp_O'$dv(ߍMbN'S O'ѸtbHݝtBJF0>AġI]v^tqyB" _` k9;W: ReuzL/kKk;%'+"=|1/R.5Œ> b3ݜw2qi{Q `7>7kv\y{)L#QdLUWS)撫lK-!dTq mHQ (h_iH< \ 'BBnʌ]^ m@ΦT{-D^nqQNݿ6l\[źdܙ`g8;!z&{gPP?ΒT兊Tu9* ñ*w?7EK/=قJ ?|`Jhffsu+O%pT:KA m85DU\b?3zjՇ oOcy,_|sR&k/ $K}lQlJ|{0rZ̿" f Z=;I~t)sg<.\vβ~|R[ j%OlkAu!toOfv[8fSZ,ZN%Cʅ)jSꂈPRTuU+w}c΂}-# Bp? W8d%?D$MD m=#=C{#\uyo1۹~%=ݹ~gIЪhLW}EKogXGң$O)ch?>#TǙ}-Ps(3CQVz߼)}|~I% J b@"zBv2+@yЫIttk~"):yZ6%HWPvɪK)¦2wfàg }9Tx ẏhkz}6p7"Ezu'B.5#-t8 |wycX9,1 ы% 18 B+nj):#7l#0Op;/`&ϔA)9Ce] 7Bb)V0 |*{8 uj(ICyo[_*p9.vPiõ]o0+r6fhC~;Ipedݳ^19ZM^=h8Pvn79s|Ҫl8y=~n\vIhq]Moj+w1j3xɇ|7OJA(J)]%% d69ճ˭oB^Xf߁I*t]5/Y(3_ڮEo X7`ݸk v{(L-J.#TZ0`PU @f 60r⤰^4*Un>߁\f sA6IcC>!XǸx96hӢslA^Jb b@Ckqyf6Nk80B|HH}0!0G+ j_4U,9$5fe\UZJSV'0pܼʁ e?,!&C|58yvI_j98)\Bn;BPFIw- +MlXc" ###(i9auzӞ֒ ދڟ^fG- ݥ M%ԄGac-T$]:6 yyBF8V1CdxhFXKfZ8OQÕ#ׂѱ"e@țC: FsiUxB%'*Tx-N|!59xN}¿9v Hwy3ýf@g@/4-a giGTkJ@1670Ay#4DaVww˧W~1rUGh]P;{r+ڂ" ǥ W]?W"THxt(:6j Ã&.sL )FSB 8xe 0"JvBp+-id%r&ѵRHK Rj(uITS9FkZTS@ /4d@l`bjJ W\(M``*9qA=8bn@PwF4$@i%sZHaԢTPDQD j<;Љ9m.6%9)F1*Q4K)?n3 *4;57/ '`'|5H|w~] yr ˻G @9 &?Å6gJ?MvEw?g|sgV0tow.ןܽ3^JAA X _. .HB5^^pRAEzM΋/ׁ+ `W(7sb;vZN>YC7;mFM]MJ h)tmiY)_q_4}m pn*&+knc(ָqJv#r*73yA sJ lVS26LADܦ810ٌNMXJEB)VcXjye s_ܵ 2 iAlds$ ~Tx z_+ DJlO$Fv ZeYsc]IT+GU>+f ,P2vil skàc\98=! 
z-B'jA@1j_%`UWit\k x7[CPNf!/W aq$rfC /S Pt5uw˧W~1rkmΌTĔ et1Lqi+Cݐ만.B$uA]rȺ8^N7h+8.!N է=FԽMFĄ3}) @KPj%I ׮ Šb%) ,*k0ڥMlD>F ͺebOm?oj_t#$^SfC= ޚ/:^֫LT'kA &/@@lNvN9)HnI@1q.@%/4>7 lA PXRv&ڙ-L*&x4ԦL5Psm,\/\y}wTiߴXyl4CIk#dą#V.~ K1ԭ~b';Rs%B(1h?\{dcufR֙Mh[/ b֔M'us݂Yǣ^^qM$O)\OjYʬߟ1v|>Aw˫3o~db<ݨA{,Uқ6P)$!Ē?9$]ZOrQYx\&,䕛hM^Mbb:cxF$Q@B^ؔty.\o|ޕ sR+=mOH^ =O%rrv5JQAb!U`-u'[newl{&)g\+Xo2A. E+J͔P`h'^Xgrq7x Ɇ=mȷZn߿tk;BsfT^:iPb~YwMxymLr/bvWfݗOòO۷)Uob+,a]LU-4t%knH9N (#\>=Hm:>j_6ReM\唋:z.. T;a~y`h@ڗ`ڗ[aw}و%\\ÈL^eooww xgCk%ZwQ{ȕ:G}a+h2tg`̵2ZZMF2βS YIyAfsp@'jU'J;ut pz9:¢9k̅[rΤ:7qv|fmȺm# )oѩГOx݅ ( #+kJ2@ 0ɀ9HkMV3_wٛ"D O~X'J[exBQ'$M·|;o"  ᾮtC(K.ueaMݸH&J)? 0eڅI덬T*N4\`)tQQY(!kL0WZj ZɊ ] 5"sfY\Аbj C,X_;%zA@JFL_P.]׀ 9 5'1>wpk (ЭV=HV56>};"Kxwm'UHI"6)^!ڹ3bbP[eV.G.H9͵ʽ[B~T+*\,'pjs;F%2TptUrVu,:Ɉ@v[ʝ(Ava썢c#FotD jP 7paM͒cNڑSp؄.m Lus6n*] ټV ݏr{YW;V]X2BޝyD:]hǻWMH RNG@U t4YL5doa+W u"97s]sf1zRO7뺜:M)-S  p@Sz|ZNu{V@?>"`}SP%1cm@9Q*wLSAB-$-M |QmLJSJl֕FQRqTZb⢂YW8>3PٲZJb(.֪Ux]XRJeQ(Ln@Xq8@xFͨ)DbsʣC K Gp.&uz)b S&:AgJ=Ng1]'mp4lӃOȕMQEb;2}3[; dkD̖NGj87Z E#uERe挑A(c)6Xnʞ-NJii+C'E0BRДTرc+K5͸ȵO:G+& 'k+8JfxV(&Eɧf'K͹-Tc/R/X^ȋvk 6s8m?߆/e|ً]V$5޸O_;'vV.ɏY04mc~/rf̺T㩦u즎K© _q} tVkz6WUKLhM}wkѻbb:mx1YcZ}{4@Bq=٦$hKrdʓgE+Ժ`70)Z'8NSd&թD,%|ӝI51||T?9&[]q;[^%lV]YR1"327~Sq4+;mW1śtjVBg2r]X;;дI^];"0`W0?/H$6D(TGځl6fN\9) QѺOѤ UikheZgSwb)\NZ&z'=*O;bNLɋGx|2WA]B/Ng@&S}Gtę ;>:G&FhmF>_x?\ʍ/;;ߟx*6U}bk_F|y{Sg]lz+++o73[, 4VW`[HtHͨPPk*dLCa5ڔȹMOߟ.ukԆ>]lSNb7lSVv7Nn<>~ܵdJ w>.1]mrm}k zw?^xcևbuߋMymw6jaXO珁nS \ ]`x]IRrn`m +.RZ*I+q87- R)$@4`k'CF5ͺv1Q[*5i!3XLL"aw@ P J2 P/@t P{qڭi.pSQ xड़IR 7Xljb($//ZH12r$1uDZ$4iҒfΤλnv($I *K jj4C۔1VDTU)%۴$=-1]9 x$C E,;QKɓVysŅXoaEcq0_V'A@w1i[ٓ !r8f5n$lޒ=L# 3]FQ<ƺBT)N[HOL[]NQ*WrJW䯥8h4+8ǁ 4`}MYqMfZad9 f+ {|?8N/;ܟ$Ҩ[o_~ Axw1eNRH¯&(׊ W)$J!aƶqW T*N(4w6ObQ(56QNuwb%ML[v]?; *J(EWJ "WQ]GFL9e9# #_ӠKV1^*t%ᖰDւ8g!Ӷtn(")L4r/1!.4" R"9V %B8*ɐX8٥UW\ qanrc*sJ?xB/Af dԢ:3; tkK s%RKΥ͇ڞ~9]4aH\t]P i\3 &z3jҨ̌WP.1YoO]g 7Zd]^H_q?sH(CSNeD_:b^KS DŌ՗}%g1E(}ob(-zd&QT|b-C\FϲO$ިc_d⎍Z6 %lg"7FݝGI!S%m܌cX_'|&dSDgiZ}i8pBL</<9  9,7"չy*uQMh-Z4Ӛ;i+*M 
BjV`FHrz$j39.ao$2{ ]C)45bV5͠,i|Qdyn /<C9LMq㐵:vS~],4Kh|l(| Y>\߆+$ht1QAN&y#dRD͉w hJ}}BÏ흀Z` LջM wb &S>=8hcC^7Jae1}W:>{ a{Ph.{Ch}6sÔl810SLT)?^%<]vuglՄiQn"mP2ܜbj=+?#=۵ PeY՚JX Q T5ѼamE| 3%XxUZe %*Km<>~|m+aj"nL'!%Rշw=|K ̄SHtC+]n165kj,@uUU5kJ.R 1T  % OKRI3`F_h\0>[CpadԹ;v·!hG]G˨?s׿.w??EPUր;xmO |]mҘ. \=<~+@^./9sS9g.}V[*]'EaZ㨬wƹ⹯^yaخ˾^;.:{E],~{ƝqPn`(Fr(9])0}+`)(%rf.5^Tq:# ]J 𲢂?{W㸍1n'7ߋ 0&`qArP|1mwIrGn[~kS(BdGJSbSQ*CZݩ) pN`mXH՛P\ʒW byMTGf㡛*QM}qb\LⲺ:oA5[tqDn%#ԇq?5Ra1tUpݿa णRj+WFpaTm}cHc9fQV. l * -uI~Փe '&Ƀ%2귑g=>僸@]<]dbm#k%B?SR!w0V |Dϸ #ꫪ`H7:A.@QytwIWJHepL!\c&f;GfP@y9H]AwWb&SIj.X+o J}.r%GcR3DjKI{ab=s@FOcKnTޕLR4+v $W!x6gX(xbx1m@] '$dr_H ˭\\Dk̵Ta ^!<tͅ$^#u?Ö"kS# `~vbr6ח{^>'gT ff<>v臇t#F~x W`ynz?u _"ϊ1hƺp>2l(p'[ȣ͍ `DzRe.u®U%nljAA<[^o5603qL1nl}r^ݮղ*zFBϳjqCD_KR!=?krB227g3{y!5Xo?ZcRۻ5_}BěۙW Q}KVڲmT*|8;kLvdC,X!s=~ݻ-jT. l:wm| 07)5nm1Hqh2]0#isn G.<䝻>%TБRyϵf? Ưo,w;^@nG[~v釿>=Oazɨ 8ڐ\^ZG|GcCGW]`TuWi !J^d#CY T}觗XP\>/xSLge,(6}YPkc""=/tឧ!nc)G|vJ1a S1RQElDB:pŽs7orf~a Vd-l׉QCo=-EI]~W5ˎ}$b`Y{׍b<$KhbV16qbuQn~1r9D @zRyB]8%g u|4еlܻbca nY 'C:D5DR[+ND"b`*w e?/ٝW["+}zp:'_b,fX TPOh8{– A^DzuC%>OnVE7Y uF؂<5;"3A[C`2\9"3%-R`u'@].:KK*:KÑ/8岟b1UI@5?ugUWT˂omxM_8FOڸ(v\Δ6NֲD,!*ޥ뽊U {k^0am@"D;Yh ,an |胚. Y _Y2*M}t62T!/fhby µwjfB̤*{e<* c<]tԐU0$E̾zaRq, WK7}=ҿI5~{ڵWv!!UC'j0lb&C0#P;h$ލ!vrVxvb3)IY.ff!ML C- D.Ę3FHweYӧ۶}k'S1߳ʤ'ɢ ?Ӎ}iMRk[q@%8|N:"X惧G&K@dZ5[OlyTzdgc yͪsT7Ӣ .O綍w4Ͼ̋vߑ`cf+z ,i);{D!:E i)pYb}bCG89|I.(|NH3n&#~.&'Z>"݁Jy, PȂc%ؚL=~ Hq  =0AIPK4 ]o>5ݢ똄]oZK+R/j2 z?N]{̊vo*ӘDg7R@$Vޒ">0B@ެqtA\2wEfaF LJÈk8o#%33^"LV#MYOr!CZ2.<:9dw]:FzlDHu^WSws8, +GЗInXT]/{~͆ql=N7/pX&(H&;=&n9)m{BpCqVyI:;.vU;VK lU!gzt牙_6BQx¶-LC/r4Hm2FLB>USjǟ3t? 
s3]6Q)s؎@epl5kø#d')Io|S x%y8~2}<ӯoK 3o6bU8 Z-h!٪* EI3Lz:3ȵHFW/E!$Ot_a7@{?fbPm5TRa:C,*/׮Tc$C :OL2/.Z}sek9>l|z}4#jXJݷ `S|ew ](RHfJK4V{5hLXdFX'NZ]՞[KG%AAcLP}=M H[E4g9ʐ*KD{1%9r{ Qpe o(cFRa+G0)Kd) :9*^0>.]o#7W_b KIزW"[ݺٗY!Ab[f~bY0 T#aw1@Q-:qjkpkyey㺗nF'~%` ؂zm]v>)Y!5]vΪ(%ڔ҈DW, Ja Y?/a: *[i }8a0;bMD7#n7Qaȥ#9ex9V[O'3З )o}Qڙly%V .sEJiO> &'DPZKɧДwW2>+_u}DTKntu;|~ |~F%ǜSSƻKϹ6|FQO-@{4{8\% ן"(Cju^ ;nՐnߙ)LQ%Yl@ݬqإ$L KPR{7 E!̏6/CBBd *ݟskfd3I? ܠ,r7NϨ6A*΍NJrIZ؜!ޘrr*&Cn`+wu邼@zC 8 /bƗ<%H6yV.A6HGIkȶ*=Y~V~Ӊ>_O(zTtO v<L#Pm"DR)Ro R y2f39L!0Xyjpae+)aĭVRwDzc'3`զ`ao@&k;rќUb# 񜣜)a,vsrr QD AUӠ}XPqVx:D1INL0.Iv3uNf*Z!étr"ڬU8XQDÊD .q#cPdSn%h#B\<' @}plNOhg1O1%r)тv'e- 4#InuXqnEHKyLy3+b^2%bT`ESBԘ%IreejƠH] `::?Nr>MY=fnR 7MOތwD0ԏgCx |NRf\N{~2 k(>^G7oً1tGxT` OdlC+"lWTs~vZ}D7+q0ifzs4E1sTd Od S&>MKJ\+w9B,d=N1+OgL^<ɒړv LՃ(>FKY|ڑm&VTK;\E<CIՓޝJ rPk̚RIJOV z2g,^ݟPQRG%W=`%t ~<&95 *6mVYe˙!w++,8+s@>@x=RA(a;a*Q5HI&VbG? c.Q9#c$b 4Z#Tj*(j!cھRu.Ƴk29QaS[3qeT#6d8raSĮTTJ `pK sLەM` oIժԓ!xFJLZǿ,ޔ@j͏;3fkVe^$YUU~e2^@Y|*k U5,khT)i&P4!02b5VxTs;]xOxhJa*IVs4SGX$:V};ʊZKɌ92rcQl˃Aن10t2ָ BV 6MN", ƚv@ˆ |tH #!&HIY -5;9}J)_1ՃyAAAAY,`>G-2T.(pXB0Ed%Pi*$=@lr=5t%H,KuHZ/koG6Msz%)VGi(V&Kd} k o_u(*a;:tS9V"Cьa /eDVX.̄a -(⚄nM+Nq+)6ʝ#9q_b*b)by1Q9l.8HƱ[2*+@tqF\O %']&2Hwڲ mɘ|_g)Z:þJ*c#y"5W4T B>"2s5x"80Ĥ2TAԔ˥ kqB<$5ѹZk ß6N,vc&~[Y@bi!E^[?z)scHs “|VteE#} %'iIEMN42cKnx]vZN 9"{K>-\SRklᴣy eNFj| 9v\OFKUU,ko,ղ`+Dz^IGpr?#3CSۗ&8K]6])pW~kV<0lo~qg9F#ބ!qqNE-Uv*vh{w*@!ٟSQ_$;z2r*e0GR-[T+1:jo1fQ[[&VFZ?ѯR).܋Ƴ=9-/C"河x9F'xI gu9AWd}z#Bna-=_ŋ*ӫ),%&*Vvzū4->Q܉@K?} b g!U&R {g%0 EӸ.P^X bmCwK3 PekOy~MY5o<sd.M,NȂR,&|Wh%q4Vj0%:DA5|xF %L㧡.^ T~R߰Gg O qg2Xު"xb-+ Pb:#"~Ul-nzqpv3Z̞ɝm*?x1F11 3\ܯKK,u>h,ɿ/~%r M+h?FU,~mX7ŬYT,8֛|>h"a5sq( o>}P$>2ڻY7C$v5n@*Zlc%CBqO)lz;ɢ*"+>ydfTcw,8Mє Ue2I[N{>gwd7wnUHB#8;xlgxwGqgynqV<5pТ'gs )qj욥_*՝.:{خe[nFpN3GLcMtN&'m?n-A)R?/FuRϱvV;Y#H!"Ι1"`R+ bH(:M-7%ꝝy{1u$|XQ+ICU?~K޾yYkױ|#|.ZjwW!O,'J~X4OqVCUo;f2|٤~~ήG#x>629著o*}R~132%k esN&<GcpZ S̼$)as@qι"|#XjeU8q92vi4{t3}cl~G"tit ?|i@3@nݧz$&Dj&ܸp3,&ޭ75Fk*I:ĵ3Ȳl\7-XF46\fZ| (bD^\ o)EXM+yP1XOZ&IWh[l?l bn6 S?<Qь.MK.MAh> 
[binary data omitted: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not human-readable in this form]
T]VB4u :ϨeP^׺N N)6zsǺz ->urCk^:ӺNƜrS;{~oܡ"|Tp窔^zrt !~|6,ggy'J˲<O6d;'aȞ'cu_>K> @.eAjԜ%+Ἧ .$jr/$;(_pBE;ET$&xYd ~Q @iĉ%$,#QT0Ph\K`(I'j`{Vjf8%a4JED(P)DH$E *Ib8) ZgBLJ3JE@fʭpM8bof0kX76Q.#="H91(K Ŷs,>#^s]X*(K^zeQ?R(XzHꋲԆq ?r-@\I-GR!X*rKci.5n5^wRX*LRaVc[߿} '܎"?,EYR iw% 7wY3 ~urNOǬptj-WnՁJZUI+BTB XT#r(VBу KAQKjyH-Q + CU tUb*ixB TZ bf{D~. . <$sN,e7n]07J ؃UL3\a!.nW_ctRC!˒Q2hlN KI$Ce Jx>/'P9`:WNȈe(d⥢gџ!RΞ"ǩ]ugBaB^@4׊2БV^k):Bu] * 2U^ SW|Q} HЅv47Bmj8t/, I;}T>~όұ"6pFtS&1eaJu[.qYK$Uu!0:Xvp&8B !qT2Ih8 <1hG*I9q#є%q`hC\~J3oqo>| 檳&"r,&_M~󻒘6RPu剖̅T3m4I*)h@ 4J TQ&UgtQZq~J\jmkQ+yB=;ӳ1kx2w͌oa@C*qE*$Cg39D8x"C,YS!,7.˃8?\\=I׻FWrP~ۻV aB_?$it}7 kg8PV5\FZ(/OFr) h>Ua'3 Q~@gL?drA׾)se%܏p kTtNSw @u<9`ؾɹ|{9Vx] 늮}?0(ɓ)XLkOfB5RUR#K}QZͫ?<R,0`}=lAOc컙lk3){p0 !Nөf {<ۏN N5,[w[1PxHuQ˺ wNbxȪuk!/AzgٺiBaB1N3Xww nm0S))5I lPX;m gslD=8K}Qz}3n[qlK}Qxgt,Ò;(n,5hYjΈaU(KAW5Kc)cY,= Kci.)zӯ,=FrR^%K9ci.5WY ,a iDu:ͬ4((ɦ&2%4FxrO iUޤ7Ȉ@l0I〈۔J2C~8]x{}w,߹Y/R\U]3JJ-kcԄYQ>OoN/>®N}4~C‘m G~NEǨɂqR$HɹmS(=S[%}#ƌyJO`iQ'K/TwggJ3_N?ohɳ__wDσF >޲wmpxƨ!d[C5c`{:mo_Dp1*ӴN[5+skouqU&aqez.)PB͝B{W~1Z)QKfvy Dz޿ 0 |p\ .du`o+L ̉n8K&%[ 6 JpX4pyۢA<WGO֗ 7K5\$a~8nu%-Z2.=)v HV`&H2<2I.b$Q&b"N"Buh8c#$q}wVbG KZ Fم YӭYgWgr:g̞ΝsUj y*uI)ĸN;9A,*SpS;qkKNx){nDG6JKD?ś sCA"٨w{on(H`<5̓_>}y\wzRrC۩2V 53 Ouê֨58OUP0C֮HSx &萘/IV=@)?$էPI.$% LS︕t ֬3oW !zgDwT i5 50ٴOs1k0T{ޟ95YC(C<Bv[L5&(9-",cֺcS i|.c2j^) 2ˋپ5ƓxVgm~_eLmJ%tEc jLE{ 6ECE㗟}uY #֋WE, CgD5Qz{g޾U3A46fB@* (Cdb8x*fc.&2ZmSV6Z%f_r}U0DCgT}"zBEPYN0DSh?$IBxBW,m[f6jGyΦL&|,s)ٸO^оhɥHpǯjOP,t\jmQ%?U㡜#̡b5ݿRPU/&Ĉ񀤡X $R&#qvr v kil_#n_`'=^ᖌAsg;xMM`OXqq0O+S'9qqfzxK{3e @gz̉J=< Ø&}ZՂF3zV}jd6T)0LX2"d{R t۷uGkxP%wݪ!tXfчUl#LDZ$2`&!V)/:p~eB wl= wz:X{ۿ" c~$8h9דHN9?cGo٠^Gw-5yN"Iq 1L(W'$gHN>.`F-"D5%>}$F} L\>86nƘWx}F(?mcy(D~hTƞ5 MoZIBc |HbO,}WX5Cjĕl 8<Ķt>jm3wbTVCo*לUX??oA:V|gݢ}7 = d0ރEDX 6>/c76{cc76{.OFOg;Rq")bInXBxHY 8N1G#:+18rχ +y#A::suزC6'{}2Y ecfw&7h~=w?Ϗݮ%64 PQ%6߹`R;z1C]^MўŹC6]|h  cKS-BQ\"% Fǔc7֜QXЄǩ`1c{%x)hKDJY4ڥ/8vӅHfplk€!H,$ )S3)3 0;F4J`V h)1oP _޷6kȖ7K"?jVW~%·+)~m#LET~u"j_s-(('ujk RtM}sgf itKnbqzl8 7֍ 
uHsΏIDd9nuugͧܐExw~_w#WYma~l'ciӘPY;+~u+^\Hy6fbgJ#8$0;cs gwkqBbIjsU;IfAk3Kc67֐2%*`CnxMb`+2MaVHHeLJq,v,,}(].̃I,y%&ʕߜFC|kG'*nBЪgOK(oͩ>{+对Tg9X7k]Y h/^Qq{ O4b)9=]+FQ-?\m߭MӭXL/Q00%bVwM0`@ 1&B{z)F\&ڳ191븑-#lL*|I{C+oMh/H=O];Ńd4Y6@(diBi@0 XhhS R,1,7(\ }i}K7\.Ʈ>y!KmM}ydwF)HOɦ~G:2PT8w 9eoP e (wMaK|P TCLljc{J C&vA :NoPUwՠfM{,׬̴V=vqڋ`UPE^Dv`*T 7Uf<^|uȖ{ɒrGbB9]u(1{u򃷎۶ֶ4 `Vj6+.N=d}71HU'mK,N=rO9n}LU{v[ӎC$5-kO+5i-(sN;nsک2KuJEsk"9֠{<Պ`ݒ\g(օkp+(wlL}lAӭXLo DSkK= RP0 [V p 4ъ/bR+1)n9Vhjt{2k6d6ISπ^eήe;Îu12h.Uxð$YAOt$R 瘤Ip1VXGX8%짩I$Ÿ[T3rT[4N;o22Ej3=~ys~UF\kثAYӐpzg5iy)MMA 2p-V|S  1Y$zjy;IOc0$ @*Pj_o&xjַ\ƛ9͆Ho'ab$V9ܰ~jmA:yzXDcs|?z5<@ fx@0+I+Fl~ AU+H>V l}l8&Vj:jQ}MjOq2h NH>J^JO[J/09$ܡB)) PQU\QMJ@&DI ,E8 1V:1| w,˩x1>Vc~mŽ1sC ыlTݻC=L8{`vxMbMC#h@_3c ;F5OK^77Bktjiy0V38`⣐[} s.Tj @*J /DVM_R~_e1߸ ?MѠ#p7v кLILrP+An`BNBQOF,4(oJ9iNzW,cS$_Xˢ;~Y|_JʞQ'S gl;br ?dgXSdO5 5O|Ξ<wȄLY 1 囆7#{iN:X4%&̅?23x?BxW)*64c 3^ү68V<'zihRsP'J Pb{s|qvL!c>۩O_HdwA("BbC0;XIr* 큵@/'*pUW7mWɪ9!)lC.f6tګFn.gv;utsJ X4 Xnc, A2X!TMXƬ s@":gg,4›Nk}vzKb[rȐw;̅Td3(cƵT< ܧxk).H 'Y8tgrSڍij6Ν&ڇqqE$q K WDod>\^ya!89g9yFVF}β20"1+ZN83pE݋ 5L<Θs};_&c ТsiL Hv'{}coqM8{x][s7+,6C~qrlmNj+9* XH>߷1CÛ WT"L MI'!ᖥFZ,E]L Iμ)3$Kq_?os_?_,`kE;u_j ϧXb98,\o,ggvJ/og_?lCdlRЊe9- T^=?! mnOB8! e$ kXʽO|R`2/XG[\ղ/?^jTaTsgcBϾ-VWJR"*RI"GZ | 1 uc ̓3+JX{ub&<1;*=v[fFIh5CF*#KjAҠWnHT ?E?Gc " cSei$s^IiS9Yi3 \) gZQ]=˼I XS3띧4G LhF+X [ ||nF/IT)Qu:uOIu'J}!4x'\RA<@rn4ˉ˿.3_w ""A$)Oλ_^v!TLg "-N>_w3Z`|8A]kCVݹQ'$|V"xwwWq qcܣ< @0-ګ< aPcx0ʳqQ$H!A#OXՊ_n;뷢v?{95i `'ˉFu'jj]j/5?\">Eui| t=Qk S{HЧl=U*W7B(\ݾ=F,UsPfJ}f o J =A^x˲c\aĽٔZrl&Ms) Wn{S>Bo;ӏ~놭_Aвﻹr{!|]Ыy}HQJd}ߣig0x띏Uq7}Los-YѪMé\6K^"P V]G/ syڵ.QƗ5p2UY3W?`Py=Icg v/,ܠ(kMp)>1z7pf*o;ӾgZb|D5ll 3n{AfT}d1. <&R@3OѦYnb45%Ӭ%fA]\+mcLQ[GH/‘2E8nϑ?qvE֭qgpJc/)ƉȳF4g?uNHsd kE/*H^x;2M!ar Pئ'EE5qN DfNGUNY?WHU t!;MŸbKO^ƋTOPO6]!t|Һo٬N??)L7])dkP4^sx%O{Ws:yʇWADvKV\s ծݻ:\\xS\xQ sG_a="Kcyt1壋Qͷ+A C1(>zy]89Hܥ1|p; D"l<,{! к\з6Q@6UԤ.$L+REKKH<0U0-̕0 סD)ŢWX)G?ft!·ZEl)MQrfatMjѱ7y)ڀYG a W 9L9>L4L{ ҥ{kí+~ X[ϕ2Z~ !9f]&^tH?;&bU8Зk`8CgÐVb8$e_"hjv|ƕ%i0M" {x2밧.Ϩ~\K$). 
:G1QI|;ydg¿YyP2v;iIbE0PӮz8Mg?2 VHD^DKKi-Z{sb9"ܽ~54?jՀX?k'aU4s8vpHds;aTf_ gYBݖid泻dtw4ަ eTcC #䭌Hx#:1؁k%IuDaĔg!Ƌ3rNE1G]O3G\3Fh!rgvg ظ8Sd m5]6b` 8DqRSDR:*j/q.TPbΎ1,Q{x(!hK'= dqRpB ;BeFR.1G\;|*u7.[|ΙR)]lHTtr⣥f,V`Oc&:#1(kQg"䡤 PCvڿ |Of{n4 LO)Mw&JJK00IM#U\22k"z]h(MX V)h"H5F)ѧgQKmbg4ZHcO:T9M_u"%@ Yv e: \Ʊ$3ʭ\uY/<\[R+_qHK T#Tbϩ{SdeY*$!spxG*8̥.58?T+f:T&=l?vet[3#<~^tmևӷ|~acl?dB;o :ktʃ6Bap4|^a`]p)5ȷC>$r͟(YOFź^' ؆B ^w9=/h>ȊBa_u1xH<+-),PG/ó4]P0pe|{~_)&9ZVާG bT>Yb1! WѣuJ`&X|Ofx`V#= B)v54|$D$'> .r)[=' y( cld_Jc$ ZA$"KXNd*NR0'r)O)kH$DŽj+,-,(/%8|Mwa9`Ѱ'Uw\w>ՠ^-^}T(jj eg"LdsYɿ{M_+|t '3B|MlZcl +X vz: V? ݯ2B>|- f:ӅmJX)աJnڕ3)_{v!v`6{ܪr5hۄ($^D*5hKm4ҍ1;nvdqd}дqF5r&_|Pv#Ar%8} %JWyDdi*Z7̵ְZkV9{2YtݛW=6pӦ݋@few^GV%o9yhKu¼V%dw3y2lDfmB 6؟E~ 8JJhDG5Gt&g_Ep\ZMV;)jW{BمEs 8xWKjOJ`Bn;WtP3?@u`Q<<;8&Z q>HپMU'P1jueBٰ\9]s7WT~ܽ SWIeoRfQH4J6{ș!kFUa? @@UcZ jTN ԒKT s-}9r!`ml²$V$"1ܺ@YMULE Úٹvu16dV *c`u?ǺB10=%~] zjs}ݢg6D2y~Ip֦̤hJF7JIk&ȣT!ؚLPu։IUZ,AÁ.tؿzO|s4B&JVcJŞoUd#E0s(UKbXZWZ*-IʀQKy [UϏ2jѥ:[T[⭞KW+[ \z/gW.oa6:}ՋOb@د E]+?ބpw?_1䇇E,6)"Vv|ů  X9}GC 6$΢1 Xqbdz*&"dp1@5#>y; ˜^Na.ehVBwMQg'g >PNÀǮCπ92rJFghj;q]ghkW ̉2YX՟.wUwޙR汍zIWKsaG@ٺZ2ҭ5h%x&=9/w1Hݺ<Vzfx*ʰn Um-p4pXl ߹=%k zn]_x{@tgchLGxjH#}c=j_ƉV =Ϋ6 ]NՕ W-v d}1H)9ȎU++ !!߸)'Soi7J?vA|Gv;x]KnH7.d-mƭ6?ǬҜ=ߑU=^R^=H7.eÎ{`c4X9ڙb -U}w0r_Wa]M)Zl GOn+v= _0Y!N$A' PdGJ7 *T¿*B2lEU4܌c;놽Vͫz".zB=g߆3ɉ˧'gn2J(vJ贃+szn&޳_kQեmAwE;̌<$Oe-f9փvJTCtJz3ݬF[$E,GJn@.pZOJzVd^V~qbYY_E >xbGVlJ@%":]Z=5zxH1*iZ",;~pAiU]ڝ펺p01JQ #'J6!O$QVd7(hwbaPZWDAK ki2Ζvy7}Wfɧݚyn)#왼HkEX`ty}W/W=dMgu]~:M1c\_=y,t]}q0}~<(wo 43;kؾ]Չ _Wb&?wpR)n?ٴE_]_a1G`i!޿y`퓝=ǚ`TaAE7 lQzΑWX?\͗[Ơ)X xJ? |gl߳$$͔ 8Tևy)Z]o߶Y7C'&3d߽NŕǢVU d|oI` ٲYSd˳)ݧ&xXKZGX'SVXүz/^;vh׭d ј.#HL: U0Cs̳2| n}ᅿ.f].~o\L(֧*gݼu.c\dsP^_ETF{FN"fHXdX{Z֠ka^(8Ԓ6 Ŧ.UQ$v(k]/xL&92^;c Glw|NtJОw\&$Xm% &ZN:8byU@ 9x3jCqtjd#ip Ԯg5aJ,k%W %RVah<7:IXW5ؒ*NI3بCzW(>K l[^᳡gy`0]z\=q#O>zL ;XaW -5Y4W9{sTQeYKa*Awղ QԎРfvJ@ټ2?bZ%])*Q.Uxל̦p5! ^D^ j3-Bq]\9,Al$$鯀 *F.`aѺ&XRa6$! 
BC 6MoTK:j=cIuq+5& 0 %yH_#[%84U%gs@4b_WZ9!*4K*|TJ Jd2\ E&,Qq*ziԠuJ.!RW  ~]Yz_)Wr55YdգWM]Uc2`+]PF^uq|^uyZɆai"a+ bbsRaA"^ufA%3ګ> ]JAAnn-({SoM4JiQBx*I*C>o Ӯ 0XAj)9f\Q"{2dHqbZ(f{+izŎCHa,vݣ6Iݨג_W"Kpu͢읔BVƒi#鰩dKSiUkR[M) b&}^!xf:[FZ EPBХeULl荋@ƈzeһ>2 =.uaIi!c />3V"@pA9Vsv"H=2--my26IL=2 ~xJke"f,d A)nqMZ0>7#V2i5n]x{3h9n(Zl= 3 D*L2țE\dfygYr@5QƧUs 5`|ȭhE|toE1zk"vҾX;}o}ռe G뜚yS'm-NXLͿ|zrM 0*yۓR$.$c]KagݱYHigg`⽝ e"rV$u}KmRRcE(?fВyn2KԷq׭Qs*˅љ~Z9Ռ ƒ,ENyl& 9_fpĐc A4@1;?@=LUx[BGZj0:o(*'{IM\) 2N *kEc{ϜȴlO;o/oI cGXȧ}(tXvZ [qAan9z*|m8a|UXզrP4B?/ ",F!R^|Ty ⃉2(e}?0kZXIGLϤlĠ=x Qc8|n ˍJ4U=pYCsqg͔< rg{qF%4tr\p"S'ب6ၦRFcc<ԶMm f[Z9;UW.v5 UMzLtrzRDۨ))4^07J*I #Nb  %aBlMUb%gG5Z^Ja %8tIsvx/-F&:y?y<*>E'=R3В8K^d9 _@If4X5MlPfhѹ9 ]͛1aHfwfZ!Nݼaen^ F`VJ޼:. 5^ܵت &{@FYS3Q#$$ xw?BWzw^+i5D(-O/Palj[Q'F,g1 4?X ~P7q7qdVTw7HVƪgJƕҗd߾qL66ʾe_ $A[幢cw'Ӏ43u HP$呷"?4>(h *M,"Msi-FI\cͭ qal>xџd67Fl Eか3_ 1_˙Ӱm%¿sj /}:f//#6#$\/F|ḂrI)V0g"s\6:GրJ M4˰ Y5TTbA1?ѭyܛxlXnCƈJ]%)1=A<'#D(d"F Or&Mj1%,CzBE k%ŭ o,U]?4b]s*LkN WsՇL=:Ss#HG=\tje0yU*ukp5n2RWEqUM!̲" PREtG5Fw¦*Pz%_j=yK=<ݬL@V0+.:k>jĶnH&/۪WDbk%Ggצ",%~0)SMh({؝o\U>=}xb]gw*"π.~v#w¿UYGk=Yٟyi8.SǛꎰb߱MhڹX箶}Ox*AG>g[[U}u)nsN`v¼!oƮГ?FJ~(W; !C4 SntbywTnْNckRڭ p` {s[B[,B*ڭK/[ nM C6vA!JrG->],引мJsAy1Grxk3ylD$y0 Z ;?k) {z-U#4(.F#NZ=FP A\-1k=xD#&u@R +QG kEvP),WV"0@S/ٌe{TBxmN6^miil_DVW+0A Vy(58 t5u(&(pj0=SBOIXO' ͇ʹjM]s(AY5X*2D2DEDT{c+rԩh:A77O WީC{Q JW OH V.Sj]VW jIEWM BGo;)ޯժaF,BncQ%X;dZyLѯ#׻az3;r^]+H]/s?}K {wx1aU g7 x(WEY JLH3;Vfգ4Bc (bJJi3غ ߁f?Omz 8ܦP`AyP"ybL!XbK 58VY7Grs|#P@?^;mJ*={r8"!*ɂq#LU~}{G0]@rX̿M_òi[mčqFѾbd#7ojxc,M7S:)S~ZOݲh3Wc dþkJ(p9>T'uu ^=3O\C$w.븵(E"҅Pb&-ITboްX1$Ţ -kK L1݅PT`J(L(NC;P_է-M.V o_u50ogEPvHI(~Q[\#EϦk+R+UT=miٓݛjU[AJ )z.g/ng+OU:iõ]rQyzaL]`C]]vP"Ek3] [ŝNS'/(%X{$udNS8RbjpvT/KsLksKEz$SI(~ ΏxY+p#0ϸ"՗*JԔRH.hERs;ui /kđ|$F9e"%qJYY"d2"r1I1V`&|I5-PѲ6FlH,+bG 3y"ՄbTajg#Rd ϑAZ(R+!F4{ zgNޝObzxǯf67˫ś֟ l%|)@$ĩ5ĭ12`A[(3Ֆe  eRXj<+pR7nD3 zy1 @ruAG|&"Qn29`;IA[B&[E2AT"cJ4"3G""л0 xgwVpkmo;WуljkqᗷϹ$#)ƊJL<|>SOd3?uz D.~ >lXg3B;*|yp}$A\~8$si14 0nל~gyz,WXYP86PɐpybSJzifU;H)׈>*l$p>UsC-?<_o]ys0_˥ǃ' 
rRXf8'㕥8yX6=pTOT+\xO͞R0\ԭu R8P!b2Bj]X 3a  hGΆ37ڰ߄˵KK^< ,p V"gUܭf5Y[6}/Y+ktpY5?laԺQ:UՇ|p֥FU(DEOFLXZGc!J3 ٞu~t]fYwG4#Jfӹr" 8ͦ/1Pn62):37 G÷>ͪ# S~1jbRYĺCOSkoE,mG,Bz"ľJێ)TaZێx'mGyGߣqjX@'U[wviz->d5P5(L){s[B.F i;*Ћr3B6D Ĉ1Ygr-P˵'l$pDڇ FzRZc,'KUJr(fsF)a(eڡԽ^h֔3JO%\q.glQϺQIXjw(=DL5\yQJxJ w;*dwCTOTot:Q=qn Փ2J3JOAޓOzyOTkuFIQHPhPzZ3JOXu\^0zQgQJIJ)q(DS}4gPZ;a,~Z:QyJ9Sx!C) ߍv+g4J C)Ľ@)a(TkRPJW{gj?CLgQTJ)L='RVُ(-DuTOTkrKOA ~h?PEJ=Պ鳏(8 GjWwRPVRuiR浒f(e2j9"H67PȪ6*[I)ma'bIriF,!L!cRSyD.D<P(nPb{??ySط. ( 2ӻkscʼ3 ,r-d{dvd(ލ`?ᅊ.%~O-TVyb`闛k_HIlE`ܹXʕc"i vެQ")`41]h RʓPFafݯ' AGjWWRAsYv6 2XU=J 0`;!s\$4W\ڄ&6m,P? v?@DPl"*N^;HP{񝈋.Dc)EQEsbњ\ɑqVaaXk嬯eQmO +?x>Ggއy.P!9(h$t푓xsDhIWPFZt;CR&)iTNkq9E)QF1N D%鞖-Y k~m^#3Z$Ct"SxfH1:7Bp-5DsBieXrl iEQ6:ԓ |fCy+GIg),j6fggE22ݸ\`}H;DHI Lg$Fhnu 2KE vJ!exՑcd#QɹV]ѽUY to0"<$#Wc5J$4"2/ z*%ҩ\ܦ)\iK9& +\^qQEޖ\}2ڷuV_Lj GʕXˇ_><ꒀq=~>!7*?~^(;u\m|lXKwKq$O\_5U#fZ)PWfy=+]Sa Lv˛xخ9Ir98[wH 暻;9%o D &5 ;{b5Wk_p}*Xu0?͢u$4"d333#Kb \YYm}g!|wum) ?aۢ(b_^-pO[,)lͯPٯ%Zo.[y33yh琶:{)]t%5ݺ'U~ǃSʼn|z:}cfwF<075w?]_F2Fm߫`._k Ikҥ}K4#S._syV.-ln  /g.NOH@;WrqW$8d@Z5b[gzߞkuϩ~5}z8t hs\AL]ɺmxZn\d3>{/wUO]`y'nf5Mbͪ߅b RG+-@-λRU9EAhIL{P;4ck\=a J`^}^d&'5vkrpް"Q&I)=CA:v{ @٦jvVȦ*N(7y`,T@jFf3qج7XOfJ؛'UN1;PvR WRHe:[0ce=Ɏ(BR  #$kLZ2g34wCz1)9tW[IU9S XU Eg t;K#A$kIjfY惀ty^-^/tΩꖺN3^er# $/XK1NZNr0 c ObyyN9毮T"%fXb֒=M^1UƎ揯n+wTK ,')3$>H Yyc3Ķ- #vnUb}}dT¹*-&kMyүǡv:,ex# A U&OT@&+8,D٪σr]/-2DpM+ }6nBQFf D"'B|H^q R{BsKTT2_anTb1ҞZ_i'yޔ9kÖUUQVT%cGf.[&9U+Ryhη謏@e;e`$B`C d]8a(@mّ@ٌjs,eg8R0!I&5YQ@LeTr4mɋbixn3}m.@jkA7SCly%))ȬP`@is5MMPeٲ}fڋztL'}*d6W8zd}^wEtg$#HfŨ6{ZM%j6MG{&xi:9rNIcFfSj)Aعڽ1D:O^07#"SW3蜱C`Nls0E8=IT!1rʒm!z5X[2E*IhG,*u-マ ͦ~؎]F!HI5Tfuv6TZ==/wf\JTp&|*Yfd-;ؔgG2B"S7PpΠ{`pGr8}t>`I'6Q&S0 vH싼uN RoxX,,do# v6!M8"DXVod͋t&rҿ2lgj?f 7i V2!2d"#S%]f}x bpY εAAZ19[6s;p9Vy!sIMdP@HH`+axQ>09+ߡހ3lkj15$ZQ x(0rz!Y9Q"3"fd 駲;)#2=g##+qW)딌%bSmpAxHոldpңR`/ri2wZ;+]P@QZĚ`eG#{} ߮{Ja拇Y_w=[]kNZL,FQ.wem$Iz$a h4cA!O/(Jˢ,kmHꋈ"hY2{ƬT-Ӟ K.txbSxvVR *P0D$yQCke9@:8ãg2[>U( q;i3!+)޶QSo^ e ݥZӶj0#1;i-mP LQPq qV.40cjy0OH@+'{!jQE1cg/d@mȦ7}t5CԤ&Dr˓g0+)Q,*P[kIi%&]& 
OHDl(9qkFJ7a$i$@Y2ﭿzEeL:1x.9U㇊)?N'0J,f=CqP * 8avA7=8N/M2!W*Zsǖ)_?`8N!M|h"޼>;'c e#jQ>{)ʼnoCJ%dO5K>JS*i8A?}ԊPrVn|p^d I&ם2wwBAQ ":|ՙ^3 `\>w 5;σ̦q#bVh4/ \>`2~{<Oo֛b}i[&ǣ?>}+y,: E`Uœ;  ~=[7 ,|׊tj[^dN15]X?>@]F/_?&W :⑟6_6V>:ݪ߁vto6'w&;~tFV"<^> jI 7tt~aک/ L=FjѶz~tvtc%e R4jӽ%r7TY&/?<QWX?GjdG<4ǒ_r=|³JF $ ,B,EnV4&lJTGsGlEjK7i/pvkɮw;e /7MT 4`0fiS{iс7%i+QΔ`fkdwo^ǃM[wmK6P_yUi^u}|#Yh)50=X٘!~&&g@ȭSPB#pggǓE<ϖ}?qi%ᆣR.u<{v /Ng?gg/^hNʍMۧUHy9_ d;"8_b~}x^pdgp ߾mwV XS áux_jԹJKzA`$a[ߔ>R8pTZLR#gO%XQT7BSA-ϗng0/W}3-Rߙ">Yr\nt)Nq\篦FTzϷxl-XN^<u1LfoՅ4>dM3HqE)LិN Bou&-Vk!xuLؖXWKԄ]QDeLq& (=Dq8mΦ8%YFg]<,wkr64<, SsLi%b%>궰3i<~oҔ" 9.y)e%מc[MR03k >AOfϓUp~9 W | U9sŵu0f F@b2%Ǫ*Q"5'QGtVandr4B}Í@wW YT'ݎ/䤍֐/+C N<濑s nP'QJ:g|͌4qN%gtjExce!ڥq#^f?x'C wWhLL+91ZC@á | zlvvү|a@H"%W8+^~`kEaoj:8_^|^,Gkb!%J,r.0"%QW)XQXmaP}ya2$?)T0X21ɤ"StpZ/t-I#6 ZC[\POː Œt? )IyA/mpyI1\K_XnLIk Xt 3jg [0ۧH _!IF0AMv5vvu.9Ԫ!yU#BbͿxjQR% JM",gJ`9-`\.I{hLjowG6\vV$ALz՗d,?JZDazN%\wOc{CS ~W9'3J)HRk]ST{Ord , ,j2US[P+x]Wq[6pK24z20i%aHf聕[ Py.QJ{a8!7xVe͒{7!/^&N|u_ L&aQ.kLDhU!'.PJ4L8+Pc~ies0x`'f^ $0`qPqJsL)׍ɉ({w5̺BWB⯁BN|>ܿA`)-YXf!\p*ie;`0Bo7c~OyzښɭQ&CȆ2I<ĵH0/(="ۥMFD+F.nH8 KU-\_ A+=jn낛f`1&dFwrbm/. ZK̡9hPX|vMV SRue/M<0RZ!j21 XQ{H3ͳ٫ 4|6\b53# iD$FND<ؘVAxNάuEmDbp1ϙ!h J7RYo:o(L"LF? 6wR4weC.$YW eh+&ԟyUSڅPBrrU6[#q oؐK~4⹏uV! 
l&jʇU)&zg-a9AoV< Uf"N ~kqxp;Crd/ ?Qq1j֮JEvNp>cg@´/HVр0vmyi2T;_^2(!^Τzq[o-c<]5 TI._iA?~qB_#lX/XAp0q\/~\SLQN[=Rg2e D NUWUwW5נBtQ)X=ixl=D`9] )th~1wD !9**mBD%\s^~YSNJo#ђcMkӁ?Vz@i"s=z̤<]H~N{?ÿ Ghp.X0($QQ;Fo#= c:&GF&(,wB]SDg{ a6jDHlgmp82N\/SvecpqDFw[0p#fb>8idE%W>}( Q;Y0 QWdpYT 5kR5W` Hx=qK}Gn*z>b=[-,4%-b&<*Q?@2GF9x¸Ԙ+ǵG.U_5D D Y6K'zdtL \=r*r@w5 D!񨿬7~v0OwtpI6F[ Hj%kVJYtQ[s>rM ģt=%ɑO9ZjH]+!5ȁh"?[A5& apf=у̒ q@!ÇI9KdLESXk̴qޚ(UwDL+˟j+JR^޴F3~**'#-,K3=$@[dӅ7byc_A 6Q1 -)Py3W"bݾBQkΗj:2D$]ΈRt+EWTR\H`N Bf FZ@H[B/ v48cLN:T0RW8pw>zd2`=*R`C\εLCzV;̡8.NXLȨg `&[DCC"tőb#Ă0|ar?\ /q&䈤3sH| $rJ)WV+vuЬEZ`T)7Lh.oj*}d9]VN/-kvh_GX&:atWZat"SOFB"*sߟ Io<5Id`ǤcV JYt`& 1V;!j;ؤ"+ :G*ҡ꽹&)i*)l ْ̝jzWd>s iE2JJߑwnn@ @uԴ p2E݁J9Re`CqBS$*P5Fb&JٕdHPnX""`Csk 6B0kK j9P:Tufu) jS j ߟAv18pjPZ뮜GX1|f&|W*M5` v1L>M .׮J=PܹhK4E0[2g6g]s.ﳤ ߼{ \ԇ$*@Ǔ/UPV^-$gs i^ lqڌy ;ΌR7#@Ulj( 1N6#^Ky8UuH{soՋ,'=uI%9{W`ugp7`zC~:e3R[g:nwL%>Kyq/\F[6/ YhDDU~2*8M&6,2T@~ˡ&E34"S{ #K fbYrMdH`Όl:4RF{σ[B1mDkUhID)ҥV '-!8L P& =Zc{:0{֭(s )ujh׃hũ]]" Sl5B_K|۱ߦ]Od"it;2:"kF NE`2iMD#GFaDcm6jcL"ߓ2AFhF2UL*Ӂܱ/ACgµ}}#Ӕm{6,iXνns-.v`3ɞY~-< ښ@8k8*a7sO;mք@|m/ׄPㇰڔzJX5ߤל`F  UZA|Kǣyo3h\+yrCu$狽V-Ѵ!M'NƗR|'jC&tǩ&WɕK?$f)W_IrkCJ5Mid5cv~ZebzRcNW!fS A4NZd@s,>7EMwh-fZ22CS ԁ ޗՂhz쾬uҙ&nĎ1,C2[IJŕv"BbJN;CFK USZBu!HV{y*V.WdM4/qg dGcS*^~i5S$" l2ÒQ* %5ؽncʷjK>ȿK5Hϥ XJDIfI'ool ddzns/UW·_vk\v_r8cQ_hřV1 5n,lZ+D;`kZ`Ŷ{\Eylz _P_ɾ8mZϾV r).i'̼P{mͅKMeZCY`36=l K;ڣz.!./Lͩ]-mI$!&M;6{zZ\ݍVhiTs$>:lLuAO^]j}k(q%8 ݐnw|+t{$4Gi,C<]*Ɠ[G!P跎Zy{o׵@uͣ ւWuuڍ{`HuFg Fls2""mkp;_5̸!v4 iuX;ON9kG' 9qmdSJ|^n8wkA4lw[ޕtwǸּ 0׻`!'nMlJ8 rbc:Mۨ8w5?ڰ7эmJjiˌkyk͏1a%ΤK(s2ø@5 ъ"RH7 _p$8GK7I`G$LPD FqsEdhz=m`͆IEray8 cW4Byfy+)]H"c-:N ? 
Feb 19 21:28:18 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 19 21:28:18 crc restorecon[4754]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:18 crc restorecon[4754]:
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c20,c21 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c168,c522 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 
21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 
21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc 
restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c294,c884 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc 
restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:18 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:19 crc restorecon[4754]:
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 
19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:19 crc restorecon[4754]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc 
restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 
19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc 
restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 21:28:19 crc 
restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc 
restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc 
restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc 
restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 
21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc 
restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc 
restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 
crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 19 21:28:19 crc restorecon[4754]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: /var/lib/kubelet/plugins_registry not reset as customized 
by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:19 crc restorecon[4754]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 19 21:28:20 crc kubenswrapper[4771]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 21:28:20 crc kubenswrapper[4771]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 19 21:28:20 crc kubenswrapper[4771]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 21:28:20 crc kubenswrapper[4771]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 21:28:20 crc kubenswrapper[4771]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 19 21:28:20 crc kubenswrapper[4771]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.170986 4771 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176138 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176170 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176179 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176190 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176200 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176212 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176222 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176237 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176245 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176254 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176262 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176269 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176277 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176284 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176292 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176300 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176308 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176316 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176323 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176331 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176338 4771 
feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176346 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176353 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176361 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176368 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176376 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176384 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176391 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176399 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176407 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176414 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176422 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176430 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176438 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176446 4771 feature_gate.go:330] unrecognized feature 
gate: IngressControllerDynamicConfigurationManager Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176453 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176461 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176468 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176477 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176485 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176492 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176500 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176509 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176517 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176524 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176532 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176539 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176547 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176555 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 21:28:20 crc 
kubenswrapper[4771]: W0219 21:28:20.176562 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176569 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176577 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176585 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176593 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176600 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176608 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176615 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176624 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176632 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176639 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176647 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176657 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176668 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176678 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176687 4771 feature_gate.go:330] unrecognized feature gate: Example Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176698 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176706 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176715 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176723 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176731 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.176739 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177599 4771 flags.go:64] FLAG: --address="0.0.0.0" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177628 4771 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177643 4771 flags.go:64] FLAG: --anonymous-auth="true" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177655 4771 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177667 4771 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177676 4771 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177687 4771 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177698 4771 
flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177707 4771 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177716 4771 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177725 4771 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177734 4771 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177743 4771 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177753 4771 flags.go:64] FLAG: --cgroup-root="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177762 4771 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177771 4771 flags.go:64] FLAG: --client-ca-file="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177779 4771 flags.go:64] FLAG: --cloud-config="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177788 4771 flags.go:64] FLAG: --cloud-provider="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177799 4771 flags.go:64] FLAG: --cluster-dns="[]" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177809 4771 flags.go:64] FLAG: --cluster-domain="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177818 4771 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177828 4771 flags.go:64] FLAG: --config-dir="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177836 4771 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177845 4771 flags.go:64] FLAG: --container-log-max-files="5" Feb 19 21:28:20 crc kubenswrapper[4771]: 
I0219 21:28:20.177856 4771 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177865 4771 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177874 4771 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177883 4771 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177892 4771 flags.go:64] FLAG: --contention-profiling="false" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177901 4771 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177910 4771 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177921 4771 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177930 4771 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177941 4771 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177951 4771 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177960 4771 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177968 4771 flags.go:64] FLAG: --enable-load-reader="false" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177978 4771 flags.go:64] FLAG: --enable-server="true" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177986 4771 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.177998 4771 flags.go:64] FLAG: --event-burst="100" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178007 4771 flags.go:64] FLAG: --event-qps="50" Feb 19 
21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178044 4771 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178053 4771 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178062 4771 flags.go:64] FLAG: --eviction-hard=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178072 4771 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178082 4771 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178090 4771 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178099 4771 flags.go:64] FLAG: --eviction-soft=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178109 4771 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178117 4771 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178127 4771 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178135 4771 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178144 4771 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178153 4771 flags.go:64] FLAG: --fail-swap-on="true"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178162 4771 flags.go:64] FLAG: --feature-gates=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178173 4771 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178182 4771 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178191 4771 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178200 4771 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178209 4771 flags.go:64] FLAG: --healthz-port="10248"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178218 4771 flags.go:64] FLAG: --help="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178227 4771 flags.go:64] FLAG: --hostname-override=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178235 4771 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178244 4771 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178253 4771 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178262 4771 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178271 4771 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178280 4771 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178290 4771 flags.go:64] FLAG: --image-service-endpoint=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178298 4771 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178307 4771 flags.go:64] FLAG: --kube-api-burst="100"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178316 4771 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178326 4771 flags.go:64] FLAG: --kube-api-qps="50"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178334 4771 flags.go:64] FLAG: --kube-reserved=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178343 4771 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178352 4771 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178361 4771 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178370 4771 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178379 4771 flags.go:64] FLAG: --lock-file=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178387 4771 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178396 4771 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178405 4771 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178418 4771 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178426 4771 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178435 4771 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178444 4771 flags.go:64] FLAG: --logging-format="text"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178453 4771 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178462 4771 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178471 4771 flags.go:64] FLAG: --manifest-url=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178480 4771 flags.go:64] FLAG: --manifest-url-header=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178491 4771 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178500 4771 flags.go:64] FLAG: --max-open-files="1000000"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178511 4771 flags.go:64] FLAG: --max-pods="110"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178520 4771 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178528 4771 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178538 4771 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178546 4771 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178556 4771 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178565 4771 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178574 4771 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178593 4771 flags.go:64] FLAG: --node-status-max-images="50"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178602 4771 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178612 4771 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178622 4771 flags.go:64] FLAG: --pod-cidr=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178631 4771 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178644 4771 flags.go:64] FLAG: --pod-manifest-path=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178653 4771 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178663 4771 flags.go:64] FLAG: --pods-per-core="0"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178671 4771 flags.go:64] FLAG: --port="10250"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178680 4771 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178689 4771 flags.go:64] FLAG: --provider-id=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178698 4771 flags.go:64] FLAG: --qos-reserved=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178707 4771 flags.go:64] FLAG: --read-only-port="10255"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178716 4771 flags.go:64] FLAG: --register-node="true"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178725 4771 flags.go:64] FLAG: --register-schedulable="true"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178733 4771 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178747 4771 flags.go:64] FLAG: --registry-burst="10"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178756 4771 flags.go:64] FLAG: --registry-qps="5"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178765 4771 flags.go:64] FLAG: --reserved-cpus=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178773 4771 flags.go:64] FLAG: --reserved-memory=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178784 4771 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178792 4771 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178802 4771 flags.go:64] FLAG: --rotate-certificates="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178810 4771 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178819 4771 flags.go:64] FLAG: --runonce="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178827 4771 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178836 4771 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178845 4771 flags.go:64] FLAG: --seccomp-default="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178855 4771 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178863 4771 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178872 4771 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178881 4771 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178890 4771 flags.go:64] FLAG: --storage-driver-password="root"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178899 4771 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178907 4771 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178916 4771 flags.go:64] FLAG: --storage-driver-user="root"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178926 4771 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178935 4771 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178944 4771 flags.go:64] FLAG: --system-cgroups=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178954 4771 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178974 4771 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178983 4771 flags.go:64] FLAG: --tls-cert-file=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.178991 4771 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179002 4771 flags.go:64] FLAG: --tls-min-version=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179010 4771 flags.go:64] FLAG: --tls-private-key-file=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179045 4771 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179055 4771 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179063 4771 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179072 4771 flags.go:64] FLAG: --v="2"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179083 4771 flags.go:64] FLAG: --version="false"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179094 4771 flags.go:64] FLAG: --vmodule=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179104 4771 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179113 4771 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179328 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179338 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179346 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179355 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179364 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179372 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179386 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179394 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179403 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179411 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179419 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179426 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179434 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179442 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179449 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179460 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179470 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179479 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179488 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179496 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179506 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179515 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179525 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179534 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179543 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179551 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179560 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179569 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179577 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179585 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179593 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179602 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179610 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179618 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179625 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179633 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179640 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179649 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179659 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179667 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179674 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179682 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179690 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179698 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179705 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179713 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179720 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179728 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179736 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179744 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179751 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179759 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179766 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179774 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179782 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179789 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179797 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179804 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179812 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179821 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179828 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179836 4771 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179844 4771 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179852 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179860 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179867 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179875 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179885 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179895 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179906 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.179917 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.179940 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.194222 4771 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.194468 4771 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194557 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194566 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194570 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194582 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194594 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194602 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194607 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194612 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194618 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194624 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194629 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194635 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194639 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194643 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194648 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194653 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194660 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194665 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194669 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194674 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194679 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194683 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194687 4771 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194690 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194694 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194698 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194701 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194705 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194708 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194712 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194715 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194719 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194722 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194727 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194733 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194737 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194741 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194745 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194748 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194752 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194756 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194760 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194764 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194767 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194771 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194775 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194779 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194782 4771 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194786 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194789 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194792 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194796 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194799 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194803 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194806 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194810 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194813 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194817 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194820 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194824 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194827 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194831 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194834 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194838 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194841 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194845 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194848 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194852 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194855 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194860 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194864 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.194872 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194984 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194992 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.194996 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195000 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195003 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195007 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195010 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195030 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195034 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195039 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195042 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195046 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195051 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195055 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195066 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195074 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195079 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195083 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195087 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195092 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195097 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195103 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195109 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195114 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195119 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195124 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195128 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195132 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195138 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195142 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195146 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195149 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195153 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195157 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195161 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195165 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 21:28:20 crc
kubenswrapper[4771]: W0219 21:28:20.195169 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195172 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195176 4771 feature_gate.go:330] unrecognized feature gate: Example Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195179 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195183 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195186 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195190 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195193 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195197 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195202 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195207 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195211 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195215 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195219 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195223 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195227 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195233 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195237 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195241 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195245 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195249 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195253 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195257 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195261 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 21:28:20 crc 
kubenswrapper[4771]: W0219 21:28:20.195264 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195268 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195272 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195276 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195279 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195283 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195287 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195290 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195295 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195300 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.195304 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.195310 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.195495 4771 server.go:940] "Client rotation is on, will bootstrap in background" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.199877 4771 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.199951 4771 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.202003 4771 server.go:997] "Starting client certificate rotation" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.202075 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.202363 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-28 10:47:06.804492966 +0000 UTC Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.202487 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.234692 4771 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.237180 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.239740 4771 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.259804 4771 log.go:25] "Validated CRI v1 runtime API" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.298858 4771 log.go:25] "Validated CRI v1 image API" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.301052 4771 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.305901 4771 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-19-21-23-48-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.305949 4771 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.337999 4771 manager.go:217] Machine: {Timestamp:2026-02-19 21:28:20.335268239 +0000 UTC m=+0.606710779 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4d7bd1bd-3a80-4030-965a-4482ba18474d BootID:14ce4e64-a877-4aba-8285-cae8d7ba50e6 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 
Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f6:71:c1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f6:71:c1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:fa:f0:82 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:1a:97:7b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:de:81:8e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3b:9b:40 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:9b:0e:cc Speed:-1 Mtu:1496} {Name:eth10 MacAddress:56:6c:9f:a2:fa:ab Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:12:39:43:f0:27:ea Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.338551 
4771 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.338753 4771 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.339514 4771 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.339834 4771 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.339898 4771 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"no
defs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.340271 4771 topology_manager.go:138] "Creating topology manager with none policy" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.340290 4771 container_manager_linux.go:303] "Creating device plugin manager" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.340842 4771 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.341192 4771 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.341474 4771 state_mem.go:36] "Initialized new in-memory state store" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.341622 4771 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.345547 4771 kubelet.go:418] "Attempting to sync node with API server" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.345586 4771 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.345612 4771 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.345633 4771 kubelet.go:324] "Adding apiserver pod source" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.345652 4771 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.350491 4771 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.351710 4771 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.352255 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.352373 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.352400 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.352533 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 
21:28:20.354769 4771 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356644 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356694 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356714 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356770 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356801 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356821 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356839 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356867 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356889 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356913 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356938 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.356957 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.358202 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 19 21:28:20 crc 
kubenswrapper[4771]: I0219 21:28:20.359115 4771 server.go:1280] "Started kubelet" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.359237 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.360710 4771 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.360686 4771 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 19 21:28:20 crc systemd[1]: Started Kubernetes Kubelet. Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.362802 4771 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.363821 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.363887 4771 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.364194 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 11:40:31.469090079 +0000 UTC Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.366390 4771 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.366449 4771 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.366419 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.366592 4771 
desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.367781 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="200ms" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.369056 4771 server.go:460] "Adding debug handlers to kubelet server" Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.368911 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.248:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895c30fa28a8729 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:28:20.359071529 +0000 UTC m=+0.630514039,LastTimestamp:2026-02-19 21:28:20.359071529 +0000 UTC m=+0.630514039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.374279 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.374404 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.375543 4771 factory.go:55] Registering systemd factory Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.375565 4771 factory.go:221] Registration of the systemd container factory successfully Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.377253 4771 factory.go:153] Registering CRI-O factory Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.377277 4771 factory.go:221] Registration of the crio container factory successfully Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.377388 4771 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.377442 4771 factory.go:103] Registering Raw factory Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.377467 4771 manager.go:1196] Started watching for new ooms in manager Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.383141 4771 manager.go:319] Starting recovery of all containers Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.390813 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.391058 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 19 
21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.391194 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.391974 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.392161 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.392286 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.392422 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.392956 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.393156 4771 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.393356 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.393567 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.393709 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.393820 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.393979 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.394226 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.394378 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.394490 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.394611 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.394732 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.394857 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.394969 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.395113 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.395229 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.397166 4771 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.397352 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.397511 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.397623 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.397813 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.397937 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.398124 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.398266 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.398389 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.398523 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.398646 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.398827 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.398947 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.399104 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.399219 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.399329 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.399441 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.399676 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.399792 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.399913 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.400065 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.400192 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.400336 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.401387 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.401525 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.401640 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.401766 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.401877 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.401995 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.402172 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.402325 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.402444 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.402561 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.402684 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" 
seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.402808 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.402970 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.403119 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.403233 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.403366 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.403484 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.403620 4771 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.403767 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.403892 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.404015 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.404172 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.404289 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.404426 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.404538 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.404649 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.404798 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.404913 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.405056 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.405251 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.405386 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.405520 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.405643 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.405758 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.405880 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.406091 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" 
seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.406229 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.406365 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.406484 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.406603 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.406733 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.406872 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.407004 4771 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.407176 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.407328 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.407463 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.407602 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.407773 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.407924 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.408166 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.408311 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.408433 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.408570 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.408743 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.408894 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.409055 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.409207 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.409329 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.409458 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.409585 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.409750 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.409931 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.410123 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.410351 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.410525 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.410697 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.410912 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.411073 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.411356 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.411508 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.411640 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.411772 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.411887 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.412005 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.412191 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.412310 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.412425 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.412540 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.412654 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" 
seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.412823 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.412943 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.413105 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.413257 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.413426 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.413566 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 19 21:28:20 crc 
kubenswrapper[4771]: I0219 21:28:20.413719 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.413834 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.413950 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.414145 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.414274 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.414408 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.414537 4771 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.414653 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.414775 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.414958 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.415190 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.415343 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.415488 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.415605 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.415740 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.415867 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.416061 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.416227 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.416448 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.416619 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.416816 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.416981 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.417142 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.417259 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.417374 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.417491 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.417622 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.417740 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.417855 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.418010 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.418217 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.418363 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.418489 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.418609 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.418723 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.418838 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.418965 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.419119 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.419240 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.419355 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.419489 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.419613 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.419762 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" 
seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.419926 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.420116 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.420253 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.420385 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.420535 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.420706 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.420843 
4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.420968 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.421122 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.421272 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.421388 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.419519 4771 manager.go:324] Recovery completed Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.421469 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:20 crc 
kubenswrapper[4771]: I0219 21:28:20.421612 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.421688 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.421772 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.421897 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.421978 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422111 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422194 4771 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422301 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422409 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422496 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422574 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422668 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422746 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422829 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422907 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.422982 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.423135 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.423223 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.423313 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.423402 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.423479 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.423554 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.423631 4771 reconstruct.go:97] "Volume reconstruction finished"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.423787 4771 reconciler.go:26] "Reconciler: start to sync state"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.432846 4771 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.435799 4771 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.435877 4771 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.435943 4771 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.436068 4771 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 19 21:28:20 crc kubenswrapper[4771]: W0219 21:28:20.438318 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.438485 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.438846 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.440711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.440777 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.440803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.442068 4771 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.442089 4771 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.442120 4771 state_mem.go:36] "Initialized new in-memory state store"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.458728 4771 policy_none.go:49] "None policy: Start"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.462472 4771 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.462542 4771 state_mem.go:35] "Initializing new in-memory state store"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.466598 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.536953 4771 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.567388 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.567695 4771 manager.go:334] "Starting Device Plugin manager"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.567764 4771 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.567785 4771 server.go:79] "Starting device plugin registration server"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.568386 4771 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.568415 4771 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.568945 4771 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.569106 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="400ms"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.569135 4771 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.569173 4771 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.586362 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.669434 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.671013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.671099 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.671127 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.671167 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.671717 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.248:6443: connect: connection refused" node="crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.737707 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"]
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.738122 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.739934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.739980 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.739992 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.740215 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.740721 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.740950 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.741300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.741327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.741338 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.741448 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.742106 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.742141 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.742987 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.743035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.743048 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.743157 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.743585 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.743603 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.743771 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.743747 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.744167 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.743644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.744435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.744455 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.744314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.744689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.744699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.744789 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.745315 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.745485 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.745681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.745706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.745719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.745904 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.745931 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.745372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.746246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.746260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.747268 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.747301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.747311 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.747934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.747959 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.747968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.830923 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.830987 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831052 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831085 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831136 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831181 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831224 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831265 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831297 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831326 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831388 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831472 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831568 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.831636 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.872536 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.874154 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.874202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.874221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.874253 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.874925 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.248:6443: connect: connection refused" node="crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933279 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933351 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933399 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933510 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933551 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933580 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933633 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933700 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933751 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933837 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933898 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933927 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933957 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.933996 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934054 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934122 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934203 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934280 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934302 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934282 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934349 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: I0219 21:28:20.934367 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:28:20 crc kubenswrapper[4771]: E0219 21:28:20.970099 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="800ms"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.087108 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.100845 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.124784 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.140644 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.147342 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 19 21:28:21 crc kubenswrapper[4771]: W0219 21:28:21.161474 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1a5787682d8d5a1adc9a09ff932a394b0c44f174d62f82976360157f7d3b9052 WatchSource:0}: Error finding container 1a5787682d8d5a1adc9a09ff932a394b0c44f174d62f82976360157f7d3b9052: Status 404 returned error can't find the container with id 1a5787682d8d5a1adc9a09ff932a394b0c44f174d62f82976360157f7d3b9052
Feb 19 21:28:21 crc kubenswrapper[4771]: W0219 21:28:21.163042 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-cd00c1e65979636c0e36f7f9c32b700ce4b39d5cc9292f14b991b8432ba19547 WatchSource:0}: Error finding container cd00c1e65979636c0e36f7f9c32b700ce4b39d5cc9292f14b991b8432ba19547: Status 404 returned error can't find the container with id cd00c1e65979636c0e36f7f9c32b700ce4b39d5cc9292f14b991b8432ba19547
Feb 19 21:28:21 crc kubenswrapper[4771]: W0219 21:28:21.244084 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused
Feb 19 21:28:21 crc kubenswrapper[4771]: E0219 21:28:21.245128 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.275870 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.277513 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.277567 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.277585 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.277622 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 19 21:28:21 crc kubenswrapper[4771]: E0219 21:28:21.278149 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.248:6443: connect: connection refused" node="crc"
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.360749 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.365101 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 18:58:37.072049709 +0000 UTC
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.443869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fa44a8a1568407e3c0be432bbd09ea63f729708b138ee9b48684f698a56086a7"}
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.446261 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cd00c1e65979636c0e36f7f9c32b700ce4b39d5cc9292f14b991b8432ba19547"}
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.448047 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1a5787682d8d5a1adc9a09ff932a394b0c44f174d62f82976360157f7d3b9052"}
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.449257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"66d877ca7f43fa45bdc8a2eb4d31894b7d24166901ab009b1723b3ef072d40a1"}
Feb 19 21:28:21 crc kubenswrapper[4771]: I0219 21:28:21.453135 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28fc39c41a7d7c324a015579bc7db7358675e724ffccf9559d96bb0d5266bd1c"}
Feb 19 21:28:21 crc kubenswrapper[4771]: W0219 21:28:21.504776 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused
Feb 19 21:28:21 crc kubenswrapper[4771]: E0219 21:28:21.504854 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError"
Feb 19 21:28:21 crc kubenswrapper[4771]: E0219 21:28:21.770621 4771 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="1.6s" Feb 19 21:28:21 crc kubenswrapper[4771]: W0219 21:28:21.818509 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:21 crc kubenswrapper[4771]: E0219 21:28:21.818618 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:21 crc kubenswrapper[4771]: W0219 21:28:21.916011 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:21 crc kubenswrapper[4771]: E0219 21:28:21.916109 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.078268 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.079856 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.079900 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.079921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.079967 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:22 crc kubenswrapper[4771]: E0219 21:28:22.080652 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.248:6443: connect: connection refused" node="crc" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.360904 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.365947 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 22:05:46.903036575 +0000 UTC Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.421536 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 21:28:22 crc kubenswrapper[4771]: E0219 21:28:22.423114 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 
21:28:22.457560 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50" exitCode=0 Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.457610 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50"} Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.457727 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.459268 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.459307 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.459320 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.461279 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.464074 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.464116 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.464071 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3"} Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.464201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913"} Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.464248 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3"} Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.464131 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.466444 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a" exitCode=0 Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.466518 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a"} Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.466659 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.468533 4771 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="599add6bfaa32c904eda76c44adc1d5d97f3196343414bd469f0f360140d9ce8" exitCode=0 Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 
21:28:22.468617 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.468638 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"599add6bfaa32c904eda76c44adc1d5d97f3196343414bd469f0f360140d9ce8"} Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.468643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.468679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.468707 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.469669 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.469696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.469708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.470862 4771 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5f57f96a7bbc81f21d92ac99b4fb20e3714c4b4ec23920775f3d4d9db0130641" exitCode=0 Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.470919 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.470916 4771 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5f57f96a7bbc81f21d92ac99b4fb20e3714c4b4ec23920775f3d4d9db0130641"} Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.471793 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.471823 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:22 crc kubenswrapper[4771]: I0219 21:28:22.471834 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.361869 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.366635 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 19:37:19.877555726 +0000 UTC Feb 19 21:28:23 crc kubenswrapper[4771]: E0219 21:28:23.372089 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="3.2s" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.480788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4dae89a6e40a69b837cfbdf17a8e1dac3c978e97cd8cd9803ec649e1270d17d9"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 
21:28:23.480848 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"20d5ec4d701a2a2084dac42abe3e2b4f6a24c7dbd752028c1483d15bc1af3d90"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.480867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5ea174ca076ae5b91580e356974291cc84599cd60d717fa407c1aab5c0ab56e4"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.480992 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.482325 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.482368 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.482385 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.487867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3db61b31339e30d62a6ef829eb2c1b1ca90fc6b32f6a3a4b16cbf827baf0b5f6"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.487969 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.489295 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:23 crc kubenswrapper[4771]: 
I0219 21:28:23.489343 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.489366 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.509362 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.509428 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.509448 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.509468 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.515849 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.516008 4771 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.517570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.517614 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.517631 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.521616 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca" exitCode=0 Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.521664 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca"} Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.521791 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.522836 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.522870 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.522885 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.681367 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:23 crc 
kubenswrapper[4771]: I0219 21:28:23.682648 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.682707 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.682723 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.682758 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:23 crc kubenswrapper[4771]: E0219 21:28:23.683219 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.248:6443: connect: connection refused" node="crc" Feb 19 21:28:23 crc kubenswrapper[4771]: W0219 21:28:23.849177 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.248:6443: connect: connection refused Feb 19 21:28:23 crc kubenswrapper[4771]: E0219 21:28:23.849291 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.248:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:23 crc kubenswrapper[4771]: I0219 21:28:23.918871 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.367163 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 
UTC, rotation deadline is 2025-12-31 08:09:44.719793654 +0000 UTC Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.476388 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.528413 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c" exitCode=0 Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.528558 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c"} Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.528632 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.530139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.530204 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.530229 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.537440 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789"} Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.537597 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:24 crc 
kubenswrapper[4771]: I0219 21:28:24.537660 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.537712 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.537918 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.539676 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.539729 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.539755 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.540181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.540392 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.540570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.540948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.541002 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.541055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.541939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.542014 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.542060 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:24 crc kubenswrapper[4771]: I0219 21:28:24.797359 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.280633 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.367626 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 07:32:25.051026671 +0000 UTC Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.549846 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.549970 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.549923 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37"} Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.550051 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7"} Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.550068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83"} Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.550119 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551116 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551136 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551241 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551298 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551743 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 
21:28:25 crc kubenswrapper[4771]: I0219 21:28:25.551814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.368584 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 07:42:01.258890868 +0000 UTC Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.559968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc"} Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.560048 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1"} Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.560077 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.560134 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.561467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.561520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.561540 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.561739 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.561785 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.561803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.754305 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.883731 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.885680 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.885741 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.885759 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:26 crc kubenswrapper[4771]: I0219 21:28:26.885790 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:27 crc kubenswrapper[4771]: I0219 21:28:27.368852 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 16:11:13.751150587 +0000 UTC Feb 19 21:28:27 crc kubenswrapper[4771]: I0219 21:28:27.476721 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" start-of-body= Feb 19 21:28:27 crc kubenswrapper[4771]: I0219 21:28:27.476824 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 21:28:27 crc kubenswrapper[4771]: I0219 21:28:27.562718 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:27 crc kubenswrapper[4771]: I0219 21:28:27.563929 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:27 crc kubenswrapper[4771]: I0219 21:28:27.563996 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:27 crc kubenswrapper[4771]: I0219 21:28:27.564057 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.229178 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.229526 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.231638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.231718 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.231744 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.315501 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.315776 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.317396 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.317460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.317477 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:28 crc kubenswrapper[4771]: I0219 21:28:28.369385 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 19:49:54.998887595 +0000 UTC Feb 19 21:28:29 crc kubenswrapper[4771]: I0219 21:28:29.369912 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:42:55.960172945 +0000 UTC Feb 19 21:28:29 crc kubenswrapper[4771]: I0219 21:28:29.815164 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 19 21:28:29 crc kubenswrapper[4771]: I0219 21:28:29.815476 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:29 crc kubenswrapper[4771]: I0219 21:28:29.817453 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:29 crc kubenswrapper[4771]: I0219 21:28:29.817520 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:29 crc kubenswrapper[4771]: I0219 21:28:29.817536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:30 crc kubenswrapper[4771]: I0219 21:28:30.370694 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 04:41:06.512759847 +0000 UTC Feb 19 21:28:30 crc kubenswrapper[4771]: E0219 21:28:30.586484 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.296677 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.296945 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.298623 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.298671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.298688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.370862 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 22:32:47.969499967 +0000 UTC Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.476856 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:31 crc 
kubenswrapper[4771]: I0219 21:28:31.477100 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.479721 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.479821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:31 crc kubenswrapper[4771]: I0219 21:28:31.479851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.158316 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.158545 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.161132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.161196 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.161219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.166878 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.371912 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 17:24:44.034808776 +0000 UTC 
Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.578110 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.579739 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.579810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.579829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:32 crc kubenswrapper[4771]: I0219 21:28:32.586236 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:33 crc kubenswrapper[4771]: I0219 21:28:33.248246 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 19 21:28:33 crc kubenswrapper[4771]: I0219 21:28:33.248379 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 19 21:28:33 crc kubenswrapper[4771]: I0219 21:28:33.373083 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 06:50:54.943833977 +0000 UTC Feb 19 21:28:33 crc kubenswrapper[4771]: I0219 21:28:33.580258 4771 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 19 21:28:33 crc kubenswrapper[4771]: I0219 21:28:33.581705 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:33 crc kubenswrapper[4771]: I0219 21:28:33.581755 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:33 crc kubenswrapper[4771]: I0219 21:28:33.581774 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:33 crc kubenswrapper[4771]: W0219 21:28:33.979049 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 19 21:28:33 crc kubenswrapper[4771]: I0219 21:28:33.979185 4771 trace.go:236] Trace[395983723]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:23.977) (total time: 10001ms): Feb 19 21:28:33 crc kubenswrapper[4771]: Trace[395983723]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:28:33.979) Feb 19 21:28:33 crc kubenswrapper[4771]: Trace[395983723]: [10.001881812s] [10.001881812s] END Feb 19 21:28:33 crc kubenswrapper[4771]: E0219 21:28:33.979222 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 19 21:28:34 crc kubenswrapper[4771]: I0219 21:28:34.361006 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 19 21:28:34 crc kubenswrapper[4771]: I0219 21:28:34.373424 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 13:33:01.15991526 +0000 UTC Feb 19 21:28:34 crc kubenswrapper[4771]: W0219 21:28:34.509902 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 19 21:28:34 crc kubenswrapper[4771]: I0219 21:28:34.510056 4771 trace.go:236] Trace[1936460802]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:24.508) (total time: 10001ms): Feb 19 21:28:34 crc kubenswrapper[4771]: Trace[1936460802]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:28:34.509) Feb 19 21:28:34 crc kubenswrapper[4771]: Trace[1936460802]: [10.001871049s] [10.001871049s] END Feb 19 21:28:34 crc kubenswrapper[4771]: E0219 21:28:34.510096 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 19 21:28:34 crc kubenswrapper[4771]: W0219 21:28:34.621397 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 19 21:28:34 crc kubenswrapper[4771]: I0219 
21:28:34.621534 4771 trace.go:236] Trace[1054431207]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:24.619) (total time: 10001ms): Feb 19 21:28:34 crc kubenswrapper[4771]: Trace[1054431207]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:28:34.621) Feb 19 21:28:34 crc kubenswrapper[4771]: Trace[1054431207]: [10.001712353s] [10.001712353s] END Feb 19 21:28:34 crc kubenswrapper[4771]: E0219 21:28:34.621571 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 19 21:28:34 crc kubenswrapper[4771]: I0219 21:28:34.797451 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 21:28:34 crc kubenswrapper[4771]: I0219 21:28:34.797537 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 21:28:34 crc kubenswrapper[4771]: I0219 21:28:34.948245 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 19 21:28:34 crc kubenswrapper[4771]: I0219 21:28:34.948339 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 19 21:28:35 crc kubenswrapper[4771]: I0219 21:28:35.374005 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 21:34:29.362621283 +0000 UTC Feb 19 21:28:36 crc kubenswrapper[4771]: I0219 21:28:36.375105 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 23:50:05.694600417 +0000 UTC Feb 19 21:28:37 crc kubenswrapper[4771]: I0219 21:28:37.375846 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 02:33:33.949697976 +0000 UTC Feb 19 21:28:37 crc kubenswrapper[4771]: I0219 21:28:37.477502 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 21:28:37 crc kubenswrapper[4771]: I0219 21:28:37.477589 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 21:28:38 crc kubenswrapper[4771]: I0219 21:28:38.320529 4771 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:38 crc kubenswrapper[4771]: I0219 21:28:38.376863 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 19:18:04.735562738 +0000 UTC Feb 19 21:28:38 crc kubenswrapper[4771]: I0219 21:28:38.482246 4771 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.358009 4771 apiserver.go:52] "Watching apiserver" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.366475 4771 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.366898 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.367416 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.367631 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.367870 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.367986 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.368186 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:39 crc kubenswrapper[4771]: E0219 21:28:39.368565 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:39 crc kubenswrapper[4771]: E0219 21:28:39.368564 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.369068 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:39 crc kubenswrapper[4771]: E0219 21:28:39.369171 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.374150 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.374175 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.374917 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.375477 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.375513 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.375645 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.377186 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 01:10:39.080700254 +0000 UTC Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.377592 
4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.378350 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.380112 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.437294 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.454751 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.467534 4771 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.473976 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.494093 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.508891 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.518213 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.535048 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.803357 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.810386 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.819821 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.823095 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.830380 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.842914 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.853076 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.870621 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.883728 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.901600 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374
855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.929273 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: E0219 21:28:39.933587 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.936348 4771 trace.go:236] Trace[1042351418]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:28.423) (total time: 11512ms): Feb 19 21:28:39 crc kubenswrapper[4771]: Trace[1042351418]: ---"Objects listed" error: 11512ms (21:28:39.936) Feb 19 21:28:39 crc kubenswrapper[4771]: Trace[1042351418]: [11.512558125s] [11.512558125s] END Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.936378 4771 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.941788 4771 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.946007 4771 reflector.go:368] Caches populated for 
*v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.947584 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.951152 4771 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.951472 4771 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.952684 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.952721 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.952737 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.952763 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.952778 4771 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.965523 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.973417 4771 csr.go:261] certificate signing request csr-n9pjp is approved, waiting to be issued Feb 19 21:28:39 crc kubenswrapper[4771]: E0219 21:28:39.973647 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a
-4482ba18474d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.978250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.978299 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.978312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.978336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.978348 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.980798 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: E0219 21:28:39.991413 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a
-4482ba18474d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.992759 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.994047 4771 csr.go:257] certificate signing request csr-n9pjp is issued Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.994842 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.994889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.994902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.994927 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4771]: I0219 21:28:39.994940 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.006778 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.007357 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha25
6:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":51
0526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abf
dc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a-4482ba18474d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.017149 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.019519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.019566 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.019578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.019605 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.019618 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.028432 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237
ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\
\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a839
4b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":
487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a-4482ba18474d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.030417 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.032261 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.032289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.032298 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.032317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.032327 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.040410 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha25
6:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":51
0526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abf
dc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a-4482ba18474d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.040535 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.040481 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042322 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042333 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042342 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042380 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042406 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042351 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042430 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042491 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"[container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042546 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042611 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042635 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042677 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042673 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042712 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042719 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042737 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042759 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042781 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042800 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042820 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042864 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042888 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042899 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042907 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.042985 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043002 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043035 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043055 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043077 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043097 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043115 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043133 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043149 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043157 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043165 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043201 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043218 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043235 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043252 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043268 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043283 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043301 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043318 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043336 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043356 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043373 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043392 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043429 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043449 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043440 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043466 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043484 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043500 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043517 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043533 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043551 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043551 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043601 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043622 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043638 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043640 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043656 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043673 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043709 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043773 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043791 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043807 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043811 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043825 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043842 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043839 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043859 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043943 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043960 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043985 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.043996 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044010 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044067 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044097 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044122 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044147 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044171 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044192 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044214 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044241 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044267 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044288 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044311 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044331 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044366 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044388 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044420 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044442 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044465 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044486 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044512 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044554 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044575 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044623 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044650 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045626 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045653 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045678 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045734 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045762 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045785 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045808 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045831 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID:
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045855 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045916 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045944 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045968 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046036 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046059 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046092 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046120 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046146 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046172 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046196 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046223 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046272 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046296 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046330 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046351 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046375 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046400 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046430 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046453 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 
21:28:40.044618 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044837 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044850 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.044846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045042 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045204 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045630 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.045850 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046066 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046090 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046249 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046296 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.046478 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:40.546456436 +0000 UTC m=+20.817898896 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.052967 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.052986 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053063 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053097 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053155 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053185 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053211 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053238 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053266 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053294 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053451 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053718 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053758 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053784 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053808 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053832 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053855 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053877 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053902 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053926 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053954 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.053980 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 
19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054063 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054093 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054117 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054140 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054308 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054346 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054378 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054404 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054429 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054454 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 21:28:40 crc 
kubenswrapper[4771]: I0219 21:28:40.054482 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054507 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054533 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054555 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054581 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054605 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054603 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054628 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054655 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054683 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054707 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 21:28:40 crc 
kubenswrapper[4771]: I0219 21:28:40.054734 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054758 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054785 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054798 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054845 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.054851 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046918 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.047006 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.047141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.047186 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.047323 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.047483 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.047912 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.048028 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.055114 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.055428 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056175 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.048137 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.048152 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.048258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.048365 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.048432 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.049012 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.049197 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.049388 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.049480 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.049789 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.049828 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.050100 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.050195 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.050283 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.050718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.051696 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.051847 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.052002 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.052132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.052283 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.052718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056515 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056250 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056767 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056794 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 21:28:40 
crc kubenswrapper[4771]: I0219 21:28:40.056813 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056831 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056851 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056871 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056888 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056923 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056942 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056960 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.056980 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057000 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057072 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057091 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057107 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057127 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057146 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057165 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057200 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057216 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057234 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057251 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 21:28:40 crc 
kubenswrapper[4771]: I0219 21:28:40.057268 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057284 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057300 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057347 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057394 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057414 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057423 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057509 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057534 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057567 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057599 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057618 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057636 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057645 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057652 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.057700 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057713 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057745 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.057962 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.058283 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.058412 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.058499 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.058680 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.058727 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.059075 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.059491 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.059567 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.059722 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.059930 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.059888 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.058713 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.060383 4771 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.060863 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.060892 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.061363 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.061452 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.061618 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.061745 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.061808 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.061869 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.062441 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.062742 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.062843 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:40.56281499 +0000 UTC m=+20.834257450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.063360 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.063848 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.065319 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.065679 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.065899 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.065964 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.066283 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.067306 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.067336 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.067403 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.067720 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.067737 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.067872 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.067965 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.068063 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.068419 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.068433 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.068438 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.068484 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.068500 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.069338 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.069538 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.070009 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.070287 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.076624 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.076209 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077136 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077162 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077175 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077186 4771 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077195 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077205 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077215 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077224 4771 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077235 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077246 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077256 4771 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077264 
4771 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077273 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077285 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077294 4771 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077303 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077312 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077321 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077331 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.046653 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.073410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.073787 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.074364 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.074182 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.074549 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.075351 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.078218 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.075446 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.075893 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.076077 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.076965 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077516 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077576 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077899 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.077934 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.078292 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.078121 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.078067 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.079893 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.079917 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.079932 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.080668 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.080896 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.080917 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.080930 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.081129 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.081157 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.081170 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.081641 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:40.58160372 +0000 UTC m=+20.853046210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.081714 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:40.581702533 +0000 UTC m=+20.853145013 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083003 4771 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083038 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083050 4771 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083068 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083079 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083090 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083100 4771 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083110 4771 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083119 4771 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083132 4771 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083143 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083153 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083162 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083172 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083182 4771 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083191 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083199 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") 
on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083210 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083221 4771 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083236 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083246 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083245 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083258 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083270 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083281 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083291 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083302 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083312 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.083430 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: 
"6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.084090 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.085178 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.086217 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.086439 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.088864 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.089061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.090314 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.090340 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.090358 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.090371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.090408 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:40.590393449 +0000 UTC m=+20.861835919 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.090513 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:2
3Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.090751 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.091717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.091862 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.092687 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.092813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.096178 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.096350 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.096565 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.096600 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.097051 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.105737 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.105920 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.109557 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.109608 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.109713 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.109940 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.109953 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.110097 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.110171 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.110306 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.110445 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.110537 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.110754 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.110966 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.111002 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.110961 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.111170 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.111318 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.111335 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.111385 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.111833 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.111872 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.116073 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.117180 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.117208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.117229 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.117308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.117370 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.117457 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.117510 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.117658 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.118135 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.118325 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.121556 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.122005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.122265 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.123039 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.123099 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.123150 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.123316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.123479 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.124779 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.124842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.125103 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.125790 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.127795 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.129499 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.129680 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.129754 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.132070 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.136418 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.145753 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.145775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.145784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.145802 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.145810 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.146245 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.147049 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184382 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184422 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184498 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184510 4771 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184519 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184527 4771 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184535 4771 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184543 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184551 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184560 4771 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184568 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184576 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath 
\"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184584 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184592 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184601 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184608 4771 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184617 4771 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184625 4771 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184632 4771 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc 
kubenswrapper[4771]: I0219 21:28:40.184639 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184647 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184655 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184663 4771 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184671 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184679 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184687 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184696 4771 reconciler_common.go:293] "Volume 
detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184704 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184712 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184719 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184727 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184735 4771 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184744 4771 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184752 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184760 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184769 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184777 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184786 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184795 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184803 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184812 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184822 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184830 4771 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184838 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184857 4771 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184866 4771 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184873 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184881 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 
21:28:40.184889 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184897 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184907 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184915 4771 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184923 4771 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184931 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184938 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184946 4771 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184953 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184961 4771 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184969 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184977 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184985 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.184992 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185000 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185008 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185032 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185039 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185047 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185055 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185065 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185074 4771 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185084 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185093 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185101 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185110 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185118 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185126 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185134 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185141 4771 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185150 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185158 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185166 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185173 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185181 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185189 4771 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185197 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185205 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185213 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185221 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185230 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185238 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185246 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185255 4771 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185263 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185271 4771 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185278 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185286 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185294 4771 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185302 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185310 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185319 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185327 4771 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185335 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185342 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185351 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185359 4771 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185368 4771 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on 
node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185375 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185384 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185392 4771 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185400 4771 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185408 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185417 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185425 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185433 4771 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185444 4771 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185452 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185461 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185469 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185477 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185486 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185494 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185502 4771 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185511 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185519 4771 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185527 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185535 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185543 4771 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185551 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" 
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185560 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185568 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185576 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185584 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185592 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185601 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185609 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185617 4771 reconciler_common.go:293] "Volume detached for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185625 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185633 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185641 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185650 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185657 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185665 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185775 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.185811 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.206475 4771 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.206785 4771 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.206835 4771 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.207048 4771 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.207092 4771 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended 
with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.207116 4771 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.207136 4771 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.207157 4771 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.207178 4771 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.207199 4771 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.207287 4771 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of 
*v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.207244 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.248:42650->38.102.83.248:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1895c30fd2886b62 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:28:21.164239714 +0000 UTC m=+1.435682174,LastTimestamp:2026-02-19 21:28:21.164239714 +0000 UTC m=+1.435682174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.247462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.247497 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.247506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.247523 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.247531 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.295720 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:40 crc kubenswrapper[4771]: W0219 21:28:40.306002 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-9a489f6f3a3e726ef89b31addcc5a37b73fa04cb54a3d73839755a258f57bde9 WatchSource:0}: Error finding container 9a489f6f3a3e726ef89b31addcc5a37b73fa04cb54a3d73839755a258f57bde9: Status 404 returned error can't find the container with id 9a489f6f3a3e726ef89b31addcc5a37b73fa04cb54a3d73839755a258f57bde9 Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.307594 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.309136 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:28:40 crc kubenswrapper[4771]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 19 21:28:40 crc kubenswrapper[4771]: set -o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 19 21:28:40 crc kubenswrapper[4771]: source /etc/kubernetes/apiserver-url.env Feb 19 21:28:40 crc kubenswrapper[4771]: else Feb 19 21:28:40 crc kubenswrapper[4771]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 19 21:28:40 crc kubenswrapper[4771]: exit 1 Feb 19 21:28:40 crc kubenswrapper[4771]: fi Feb 19 21:28:40 crc kubenswrapper[4771]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 19 21:28:40 crc kubenswrapper[4771]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 19 21:28:40 crc kubenswrapper[4771]: > logger="UnhandledError" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.310310 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.315433 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.319080 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Star
tupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.320307 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.328211 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:28:40 crc kubenswrapper[4771]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 19 21:28:40 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then Feb 19 21:28:40 crc kubenswrapper[4771]: set -o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: source "/env/_master" Feb 19 21:28:40 crc kubenswrapper[4771]: set +o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: fi Feb 19 21:28:40 crc kubenswrapper[4771]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Feb 19 21:28:40 crc kubenswrapper[4771]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 19 21:28:40 crc kubenswrapper[4771]: ho_enable="--enable-hybrid-overlay" Feb 19 21:28:40 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 19 21:28:40 crc kubenswrapper[4771]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 19 21:28:40 crc kubenswrapper[4771]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 19 21:28:40 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 19 21:28:40 crc kubenswrapper[4771]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 19 21:28:40 crc kubenswrapper[4771]: --webhook-host=127.0.0.1 \ Feb 19 21:28:40 crc kubenswrapper[4771]: --webhook-port=9743 \ Feb 19 21:28:40 crc kubenswrapper[4771]: ${ho_enable} \ Feb 19 21:28:40 crc kubenswrapper[4771]: --enable-interconnect \ Feb 19 21:28:40 crc kubenswrapper[4771]: --disable-approver \ Feb 19 21:28:40 crc kubenswrapper[4771]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 19 21:28:40 crc kubenswrapper[4771]: --wait-for-kubernetes-api=200s \ Feb 19 21:28:40 crc kubenswrapper[4771]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 19 21:28:40 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}" Feb 19 21:28:40 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 19 21:28:40 crc kubenswrapper[4771]: > logger="UnhandledError" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.330078 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:28:40 crc kubenswrapper[4771]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 19 21:28:40 crc 
kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then Feb 19 21:28:40 crc kubenswrapper[4771]: set -o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: source "/env/_master" Feb 19 21:28:40 crc kubenswrapper[4771]: set +o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: fi Feb 19 21:28:40 crc kubenswrapper[4771]: Feb 19 21:28:40 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 19 21:28:40 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 19 21:28:40 crc kubenswrapper[4771]: --disable-webhook \ Feb 19 21:28:40 crc kubenswrapper[4771]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 19 21:28:40 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}" Feb 19 21:28:40 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 19 21:28:40 crc kubenswrapper[4771]: > logger="UnhandledError" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.331204 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.349879 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.349905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.349913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.349932 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.349941 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.378290 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 08:47:27.795377903 +0000 UTC Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.436858 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.437042 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.442745 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.443962 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.446259 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.447509 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.449417 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.450431 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.451623 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.452275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.452393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.452488 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.452606 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.452681 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.453478 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.454747 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.455721 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.456636 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.457813 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.460136 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.461229 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.462268 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.464200 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.465304 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.466618 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.467434 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.468602 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.469921 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.470202 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.471093 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.472245 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.474303 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.475385 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.475983 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.476812 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.477858 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.478703 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.479560 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.480223 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.480871 4771 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.481004 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.482770 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.483454 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.483981 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.485195 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.489923 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.491088 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.492701 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.493783 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.495443 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.496067 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.496848 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.498318 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.499590 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.499919 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.500556 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" 
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.501808 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.502624 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.504331 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.504986 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.506396 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.507073 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.507748 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.509012 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" 
Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.509633 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.512897 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserv
er-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53d
fa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.528336 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.539218 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.555374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.555427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.555442 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.555458 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.555469 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.587891 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.587959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.587985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.588048 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588122 4771 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588134 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:41.588104657 +0000 UTC m=+21.859547137 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588184 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:41.588167169 +0000 UTC m=+21.859609649 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588215 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588310 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:41.588283242 +0000 UTC m=+21.859725742 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588385 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588431 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588452 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.588546 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:41.588520878 +0000 UTC m=+21.859963378 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.602815 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b3cfd533db92831fbec45168c5d198f8e85a1dab611fd0d44d119db4293bf4d3"} Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.606440 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.607147 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9a489f6f3a3e726ef89b31addcc5a37b73fa04cb54a3d73839755a258f57bde9"} Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.607550 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.608304 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0cee6e16a0d284a2ec0dda349e35e0d4b5ae25670f636fa6934566dab5dc0d28"} Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.609216 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:28:40 crc kubenswrapper[4771]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 19 21:28:40 crc kubenswrapper[4771]: set -o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 19 21:28:40 crc kubenswrapper[4771]: source /etc/kubernetes/apiserver-url.env Feb 19 21:28:40 crc kubenswrapper[4771]: else Feb 19 21:28:40 crc kubenswrapper[4771]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 19 21:28:40 crc kubenswrapper[4771]: exit 1 Feb 19 21:28:40 crc kubenswrapper[4771]: fi Feb 19 
21:28:40 crc kubenswrapper[4771]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 19 21:28:40 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueF
rom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 19 21:28:40 crc kubenswrapper[4771]: > logger="UnhandledError" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.609864 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:28:40 crc kubenswrapper[4771]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 19 21:28:40 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then Feb 19 21:28:40 crc kubenswrapper[4771]: set -o allexport 
Feb 19 21:28:40 crc kubenswrapper[4771]: source "/env/_master" Feb 19 21:28:40 crc kubenswrapper[4771]: set +o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: fi Feb 19 21:28:40 crc kubenswrapper[4771]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Feb 19 21:28:40 crc kubenswrapper[4771]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 19 21:28:40 crc kubenswrapper[4771]: ho_enable="--enable-hybrid-overlay" Feb 19 21:28:40 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 19 21:28:40 crc kubenswrapper[4771]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 19 21:28:40 crc kubenswrapper[4771]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 19 21:28:40 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 19 21:28:40 crc kubenswrapper[4771]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 19 21:28:40 crc kubenswrapper[4771]: --webhook-host=127.0.0.1 \ Feb 19 21:28:40 crc kubenswrapper[4771]: --webhook-port=9743 \ Feb 19 21:28:40 crc kubenswrapper[4771]: ${ho_enable} \ Feb 19 21:28:40 crc kubenswrapper[4771]: --enable-interconnect \ Feb 19 21:28:40 crc kubenswrapper[4771]: --disable-approver \ Feb 19 21:28:40 crc kubenswrapper[4771]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 19 21:28:40 crc kubenswrapper[4771]: --wait-for-kubernetes-api=200s \ Feb 19 21:28:40 crc kubenswrapper[4771]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 19 21:28:40 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}" Feb 19 21:28:40 crc kubenswrapper[4771]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 19 21:28:40 crc 
kubenswrapper[4771]: > logger="UnhandledError" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.610286 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.613246 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:28:40 crc kubenswrapper[4771]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 19 21:28:40 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then Feb 19 21:28:40 crc kubenswrapper[4771]: set -o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: source "/env/_master" Feb 19 21:28:40 crc kubenswrapper[4771]: set +o allexport Feb 19 21:28:40 crc kubenswrapper[4771]: fi Feb 19 21:28:40 crc kubenswrapper[4771]: Feb 19 21:28:40 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 19 21:28:40 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 19 21:28:40 crc kubenswrapper[4771]: --disable-webhook \ Feb 19 21:28:40 crc kubenswrapper[4771]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 19 21:28:40 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}" Feb 19 21:28:40 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 19 21:28:40 crc kubenswrapper[4771]: > logger="UnhandledError" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.614400 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.626921 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\
\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f78
14a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28
:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.646762 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.657769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.657805 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.657817 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.657836 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.657850 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.665135 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.688502 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.688889 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.688915 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.688928 4771 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:40 crc kubenswrapper[4771]: E0219 21:28:40.688973 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:41.688955466 +0000 UTC m=+21.960397926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.690044 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.711259 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.728874 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.748750 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.751209 4771 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.760485 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.760528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.760545 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.760569 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.760585 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.764996 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.793886 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.830228 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:2
3Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.850282 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.861334 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.863013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.863139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.863160 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.863186 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.863205 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.872597 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.884836 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.966552 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.966603 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.966619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.966643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.966660 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.995281 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-19 21:23:39 +0000 UTC, rotation deadline is 2026-11-10 11:33:32.748969138 +0000 UTC Feb 19 21:28:40 crc kubenswrapper[4771]: I0219 21:28:40.995373 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6326h4m51.75360098s for next certificate rotation Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.070474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.070537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.070555 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.070581 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.070601 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.076718 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.087689 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.137550 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.170830 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.175330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.175602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.175629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.175664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.175691 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.234595 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-h8rw6"] Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.235417 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h8rw6" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.238375 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.239490 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.241567 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.257230 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.277005 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.284053 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 
crc kubenswrapper[4771]: I0219 21:28:41.284121 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.284143 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.284177 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.284206 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.286412 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.293685 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02ee2b40-7d95-4e9d-936e-1feb3161e6c8-hosts-file\") pod \"node-resolver-h8rw6\" (UID: \"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\") " pod="openshift-dns/node-resolver-h8rw6" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.293763 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrw58\" (UniqueName: 
\"kubernetes.io/projected/02ee2b40-7d95-4e9d-936e-1feb3161e6c8-kube-api-access-mrw58\") pod \"node-resolver-h8rw6\" (UID: \"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\") " pod="openshift-dns/node-resolver-h8rw6" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.305318 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.321993 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.338177 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.339100 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.350347 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.351207 4771 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.356374 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.361197 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.365488 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.378776 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 22:11:27.881216698 +0000 UTC Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.382191 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:2
3Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.389461 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.389521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.389538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.389562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.389579 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.394848 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrw58\" (UniqueName: \"kubernetes.io/projected/02ee2b40-7d95-4e9d-936e-1feb3161e6c8-kube-api-access-mrw58\") pod \"node-resolver-h8rw6\" (UID: \"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\") " pod="openshift-dns/node-resolver-h8rw6" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.395010 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02ee2b40-7d95-4e9d-936e-1feb3161e6c8-hosts-file\") pod \"node-resolver-h8rw6\" (UID: \"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\") " pod="openshift-dns/node-resolver-h8rw6" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.395167 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02ee2b40-7d95-4e9d-936e-1feb3161e6c8-hosts-file\") pod \"node-resolver-h8rw6\" (UID: \"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\") " pod="openshift-dns/node-resolver-h8rw6" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.400898 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.419296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrw58\" 
(UniqueName: \"kubernetes.io/projected/02ee2b40-7d95-4e9d-936e-1feb3161e6c8-kube-api-access-mrw58\") pod \"node-resolver-h8rw6\" (UID: \"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\") " pod="openshift-dns/node-resolver-h8rw6" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.420568 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.422229 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.436901 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.436923 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.437472 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.437569 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.441875 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.454783 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.487417 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.492486 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.492528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.492553 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.492579 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.492597 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.503849 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.529370 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.554200 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h8rw6" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.568604 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 21:28:41 crc kubenswrapper[4771]: W0219 21:28:41.570388 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ee2b40_7d95_4e9d_936e_1feb3161e6c8.slice/crio-0b63b94c6d63f9904a07b89303543771c5fd72f3658418f6ea8115984b7aa9ac WatchSource:0}: Error finding container 0b63b94c6d63f9904a07b89303543771c5fd72f3658418f6ea8115984b7aa9ac: Status 404 returned error can't find the container with id 0b63b94c6d63f9904a07b89303543771c5fd72f3658418f6ea8115984b7aa9ac Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.572377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syn
cer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://
53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.588474 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.595839 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.595899 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.595920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.595942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.595957 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.597413 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.597528 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.597586 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.597638 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.597745 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.597779 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.597779 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.597857 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:43.5978288 +0000 UTC m=+23.869271300 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.597862 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.597887 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.597888 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:43.597874362 +0000 UTC m=+23.869316862 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.597960 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:43.597938713 +0000 UTC m=+23.869381233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.598164 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:43.598124008 +0000 UTC m=+23.869566478 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.611564 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h8rw6" event={"ID":"02ee2b40-7d95-4e9d-936e-1feb3161e6c8","Type":"ContainerStarted","Data":"0b63b94c6d63f9904a07b89303543771c5fd72f3658418f6ea8115984b7aa9ac"} Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.639541 4771 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.698132 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.698277 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.698296 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.698308 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for 
pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:41 crc kubenswrapper[4771]: E0219 21:28:41.698365 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:43.698345881 +0000 UTC m=+23.969788351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.703178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.703231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.703245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.703265 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.703278 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.716958 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.805069 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.805424 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.805434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.805448 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.805458 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.907646 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.907704 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.907723 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.907749 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4771]: I0219 21:28:41.907769 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.010789 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.010840 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.010856 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.010877 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.010897 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.114527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.114904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.115083 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.115184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.115262 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.218404 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.218906 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.219059 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.219212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.219335 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.323149 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.323644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.323786 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.323956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.324165 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.379977 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 20:09:31.003820693 +0000 UTC Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.427806 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.427859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.427879 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.427902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.427920 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.437257 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:42 crc kubenswrapper[4771]: E0219 21:28:42.437544 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.531536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.532195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.532434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.532654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.532855 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.616983 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h8rw6" event={"ID":"02ee2b40-7d95-4e9d-936e-1feb3161e6c8","Type":"ContainerStarted","Data":"b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.631232 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.636153 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.636689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.636731 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.636761 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.636779 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.652298 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.662563 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.678653 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be
48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.695108 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.710628 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.723922 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.735745 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.739820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.740051 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.740190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.740328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.740465 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.753930 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5jtrr"] Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.754357 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.755519 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-q64kb"] Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.756095 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.756215 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.756342 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.756455 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.757336 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.757871 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.758106 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.759057 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.759266 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.760054 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.761517 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.779651 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.790924 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.803230 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.810635 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-cnibin\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.810897 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-socket-dir-parent\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-netns\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811122 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-k8s-cni-cncf-io\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811238 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-kubelet\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811361 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-cni-multus\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811510 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db19bfc5-46dd-47ae-9608-aafec9e35f9e-proxy-tls\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811575 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db19bfc5-46dd-47ae-9608-aafec9e35f9e-mcd-auth-proxy-config\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811669 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-system-cni-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-os-release\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811736 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-cni-bin\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811825 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-cni-binary-copy\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.811943 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-hostroot\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.812064 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/db19bfc5-46dd-47ae-9608-aafec9e35f9e-rootfs\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.812189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-conf-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.812285 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8kvw\" (UniqueName: \"kubernetes.io/projected/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-kube-api-access-t8kvw\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.812382 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-daemon-config\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.812475 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-multus-certs\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.812572 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqvbr\" (UniqueName: \"kubernetes.io/projected/db19bfc5-46dd-47ae-9608-aafec9e35f9e-kube-api-access-gqvbr\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.812679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-cni-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.812776 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-etc-kubernetes\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.828208 4771 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a
918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af8
0469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.843375 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.843416 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.843427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.843445 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.843455 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.845350 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.860707 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.892407 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-conf-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914131 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8kvw\" (UniqueName: \"kubernetes.io/projected/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-kube-api-access-t8kvw\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914167 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-multus-certs\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqvbr\" (UniqueName: \"kubernetes.io/projected/db19bfc5-46dd-47ae-9608-aafec9e35f9e-kube-api-access-gqvbr\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-daemon-config\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " 
pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914237 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-cni-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914236 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-conf-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-etc-kubernetes\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914287 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-multus-certs\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914305 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-etc-kubernetes\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914429 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-socket-dir-parent\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914454 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-netns\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-cnibin\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914508 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-cni-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-kubelet\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-socket-dir-parent\") pod \"multus-5jtrr\" (UID: 
\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914558 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-kubelet\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914524 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-netns\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914550 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-k8s-cni-cncf-io\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914585 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-cnibin\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914639 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-cni-multus\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914670 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db19bfc5-46dd-47ae-9608-aafec9e35f9e-proxy-tls\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914695 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db19bfc5-46dd-47ae-9608-aafec9e35f9e-mcd-auth-proxy-config\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914722 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-os-release\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914746 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-cni-bin\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914782 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-system-cni-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914802 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-cni-bin\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914805 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-hostroot\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914809 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-os-release\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914748 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-var-lib-cni-multus\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914876 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-hostroot\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-host-run-k8s-cni-cncf-io\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " 
pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914902 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-multus-daemon-config\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914926 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-system-cni-dir\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914932 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/db19bfc5-46dd-47ae-9608-aafec9e35f9e-rootfs\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/db19bfc5-46dd-47ae-9608-aafec9e35f9e-rootfs\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.914996 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-cni-binary-copy\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.915434 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-cni-binary-copy\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.915689 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db19bfc5-46dd-47ae-9608-aafec9e35f9e-mcd-auth-proxy-config\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.922779 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.923899 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db19bfc5-46dd-47ae-9608-aafec9e35f9e-proxy-tls\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.936067 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.939453 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8kvw\" (UniqueName: \"kubernetes.io/projected/c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1-kube-api-access-t8kvw\") pod \"multus-5jtrr\" (UID: \"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\") " pod="openshift-multus/multus-5jtrr" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.944456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqvbr\" (UniqueName: \"kubernetes.io/projected/db19bfc5-46dd-47ae-9608-aafec9e35f9e-kube-api-access-gqvbr\") pod \"machine-config-daemon-q64kb\" (UID: \"db19bfc5-46dd-47ae-9608-aafec9e35f9e\") " pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.945382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.945404 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.945414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.945427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.945436 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.957354 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.979177 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:42 crc kubenswrapper[4771]: I0219 21:28:42.988982 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.048216 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.048288 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.048311 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.048337 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.048356 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.072430 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5jtrr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.080328 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:28:43 crc kubenswrapper[4771]: W0219 21:28:43.091794 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc28b5b75_d85c_4b96_98d5_2fc60ef5f8e1.slice/crio-95a42b7449cd1a07f35f79f713dee47fa672be54d512f570e6f435548f887dc8 WatchSource:0}: Error finding container 95a42b7449cd1a07f35f79f713dee47fa672be54d512f570e6f435548f887dc8: Status 404 returned error can't find the container with id 95a42b7449cd1a07f35f79f713dee47fa672be54d512f570e6f435548f887dc8 Feb 19 21:28:43 crc kubenswrapper[4771]: W0219 21:28:43.093268 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb19bfc5_46dd_47ae_9608_aafec9e35f9e.slice/crio-c74b808f4a1e199f408a46374438b3f5f65a876a7734ad24e83c2c6174266ac3 WatchSource:0}: Error finding container c74b808f4a1e199f408a46374438b3f5f65a876a7734ad24e83c2c6174266ac3: Status 404 returned error can't find the container with id c74b808f4a1e199f408a46374438b3f5f65a876a7734ad24e83c2c6174266ac3 Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.129404 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-vsjdr"] Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.130148 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.132423 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hl4r5"] Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.132900 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.133119 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.133300 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.135769 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.135980 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.136210 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.136395 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.136403 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.136502 4771 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.136591 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.148170 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.151227 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.151254 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.151262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.151276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.151307 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.157781 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.170879 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.180594 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.208075 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217462 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-systemd-units\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217531 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s99bh\" (UniqueName: \"kubernetes.io/projected/6b73af94-3912-40fb-95ea-7720e6279f52-kube-api-access-s99bh\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: 
\"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217548 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-var-lib-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217563 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-slash\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-log-socket\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217592 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217606 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-netd\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-config\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217633 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-env-overrides\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-kubelet\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217663 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg8wz\" (UniqueName: \"kubernetes.io/projected/1ae8f212-03b1-4a09-8a89-22a30241d9a8-kube-api-access-hg8wz\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217694 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-bin\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217709 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-script-lib\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217723 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b73af94-3912-40fb-95ea-7720e6279f52-cni-binary-copy\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217737 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-netns\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217750 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-systemd\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217764 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-etc-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217778 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovn-node-metrics-cert\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217791 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-system-cni-dir\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-ovn\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 
21:28:43.217820 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-node-log\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217834 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b73af94-3912-40fb-95ea-7720e6279f52-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217907 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217924 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-cnibin\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.217938 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-os-release\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: 
\"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.218185 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.228415 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.240113 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.250639 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.254507 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.254559 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.254575 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.254592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.254603 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.264905 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.276058 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.288464 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.310010 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318738 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-netd\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318779 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-config\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318813 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-env-overrides\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318834 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-kubelet\") pod \"ovnkube-node-hl4r5\" (UID: 
\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg8wz\" (UniqueName: \"kubernetes.io/projected/1ae8f212-03b1-4a09-8a89-22a30241d9a8-kube-api-access-hg8wz\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318872 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-bin\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-script-lib\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318935 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-netns\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318954 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-systemd\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318973 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-etc-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.318991 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovn-node-metrics-cert\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319009 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-system-cni-dir\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b73af94-3912-40fb-95ea-7720e6279f52-cni-binary-copy\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " 
pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-node-log\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b73af94-3912-40fb-95ea-7720e6279f52-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-ovn\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319132 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-etc-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc 
kubenswrapper[4771]: I0219 21:28:43.319129 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-ovn\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319199 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-netns\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319003 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-netd\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319061 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-bin\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319280 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-system-cni-dir\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319331 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319369 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-cnibin\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319401 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-os-release\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319684 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-systemd\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319746 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319782 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-cnibin\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.319838 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-os-release\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.320885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-env-overrides\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.320996 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6b73af94-3912-40fb-95ea-7720e6279f52-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321309 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-config\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321333 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321419 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-kubelet\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-node-log\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321470 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b73af94-3912-40fb-95ea-7720e6279f52-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321482 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-systemd-units\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321522 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-systemd-units\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s99bh\" (UniqueName: \"kubernetes.io/projected/6b73af94-3912-40fb-95ea-7720e6279f52-kube-api-access-s99bh\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321869 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-script-lib\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.321912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-var-lib-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.322010 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-slash\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.322061 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-var-lib-openvswitch\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.322071 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-log-socket\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.322106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.322158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-slash\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.322255 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b73af94-3912-40fb-95ea-7720e6279f52-cni-binary-copy\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.322300 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.322313 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-log-socket\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.328946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovn-node-metrics-cert\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.330605 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.341807 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.346494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg8wz\" (UniqueName: \"kubernetes.io/projected/1ae8f212-03b1-4a09-8a89-22a30241d9a8-kube-api-access-hg8wz\") pod \"ovnkube-node-hl4r5\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.353328 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.354883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s99bh\" (UniqueName: \"kubernetes.io/projected/6b73af94-3912-40fb-95ea-7720e6279f52-kube-api-access-s99bh\") pod \"multus-additional-cni-plugins-vsjdr\" (UID: \"6b73af94-3912-40fb-95ea-7720e6279f52\") " pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.356241 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.356263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.356270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.356353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.356366 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.365571 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.374522 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.381028 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:28:17.974399115 +0000 UTC Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.386922 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.397995 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.412375 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.432036 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.436960 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.437008 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.437123 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.437230 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.444696 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126
bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.454258 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.458579 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.458613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.458626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.458640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.458651 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.472079 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.534616 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" Feb 19 21:28:43 crc kubenswrapper[4771]: W0219 21:28:43.548188 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b73af94_3912_40fb_95ea_7720e6279f52.slice/crio-aa0dea850d3daef5b2d407589202043c5f0878b21d64fc8834cc24d6c9a1150c WatchSource:0}: Error finding container aa0dea850d3daef5b2d407589202043c5f0878b21d64fc8834cc24d6c9a1150c: Status 404 returned error can't find the container with id aa0dea850d3daef5b2d407589202043c5f0878b21d64fc8834cc24d6c9a1150c Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.560867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.560903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.560914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.560929 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.560940 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.564418 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:43 crc kubenswrapper[4771]: W0219 21:28:43.579700 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ae8f212_03b1_4a09_8a89_22a30241d9a8.slice/crio-9c98926fe726dcf9f9a723b0f10f3e59f8e520a5b1c56ebe240a8b893cfe6eae WatchSource:0}: Error finding container 9c98926fe726dcf9f9a723b0f10f3e59f8e520a5b1c56ebe240a8b893cfe6eae: Status 404 returned error can't find the container with id 9c98926fe726dcf9f9a723b0f10f3e59f8e520a5b1c56ebe240a8b893cfe6eae Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.623287 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" event={"ID":"6b73af94-3912-40fb-95ea-7720e6279f52","Type":"ContainerStarted","Data":"aa0dea850d3daef5b2d407589202043c5f0878b21d64fc8834cc24d6c9a1150c"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.625094 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.625211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"9c98926fe726dcf9f9a723b0f10f3e59f8e520a5b1c56ebe240a8b893cfe6eae"} Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625254 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:28:47.625219673 +0000 UTC m=+27.896662193 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.625314 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.625372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.625428 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625660 4771 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625663 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625706 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:47.625694516 +0000 UTC m=+27.897136986 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625778 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625794 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625795 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:47.625771678 +0000 UTC m=+27.897214188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625807 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.625840 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:47.625829919 +0000 UTC m=+27.897272389 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.628217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.628275 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.628301 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"c74b808f4a1e199f408a46374438b3f5f65a876a7734ad24e83c2c6174266ac3"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.631050 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jtrr" event={"ID":"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1","Type":"ContainerStarted","Data":"44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.631079 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jtrr" 
event={"ID":"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1","Type":"ContainerStarted","Data":"95a42b7449cd1a07f35f79f713dee47fa672be54d512f570e6f435548f887dc8"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.638347 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name
\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.660451 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.664093 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.664164 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.664176 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.664220 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.664233 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.675369 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.685913 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.696416 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.704464 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.717351 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.726148 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.726921 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.726964 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.726982 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:43 crc kubenswrapper[4771]: E0219 21:28:43.727070 4771 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:47.727048739 +0000 UTC m=+27.998491219 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.728734 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.738851 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.756733 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.767078 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.767138 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.767161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.767187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.767207 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.773449 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-ope
rator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\
\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.790087 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.806259 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.818292 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.850485 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"nam
e\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052
f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.862902 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.872650 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.872699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc 
kubenswrapper[4771]: I0219 21:28:43.872714 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.872734 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.872749 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.876529 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.887779 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.897316 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.913439 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.928089 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.944597 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.969436 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.975592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.975651 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.975664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.975682 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.975694 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4771]: I0219 21:28:43.987754 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-ope
rator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\
\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.000694 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.035386 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.078219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.078263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.078274 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.078290 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.078303 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.180640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.180674 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.180685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.180703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.180715 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.283592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.283642 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.283660 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.283679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.283693 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.381220 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 03:38:23.428212378 +0000 UTC Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.387481 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.387525 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.387537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.387557 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.387570 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.436339 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:44 crc kubenswrapper[4771]: E0219 21:28:44.436473 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.481662 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.485223 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.491213 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.492553 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.492584 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.492749 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.492775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.492791 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.494148 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.519217 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.539280 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.573289 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.587317 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-r
esources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.595098 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.595135 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.595147 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.595164 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.595175 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.596796 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.604356 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.611687 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.618532 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.633386 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.634710 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b73af94-3912-40fb-95ea-7720e6279f52" containerID="42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b" exitCode=0 Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.634776 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" event={"ID":"6b73af94-3912-40fb-95ea-7720e6279f52","Type":"ContainerDied","Data":"42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.637333 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0" exitCode=0 Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.638432 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.646417 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.670922 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.685589 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.701104 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.701157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.701174 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.701192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.701206 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.707224 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.719177 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.731448 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.739925 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.767299 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.804746 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.804792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.804809 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.804834 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.804853 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.806471 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.851807 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.887630 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.908509 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.908562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.908582 4771 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.908606 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.908623 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.927423 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:44 crc kubenswrapper[4771]: I0219 21:28:44.971824 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.011199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.011265 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.011284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.011311 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.011329 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.016938 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.052759 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de
2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.086827 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.113677 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.114070 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.114095 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.114120 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.114139 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.133325 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.215821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.215855 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.215866 4771 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.215881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.215892 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.318605 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.318650 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.318664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.318683 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.318698 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.381382 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 12:36:50.747641723 +0000 UTC Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.420864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.421279 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.421309 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.421331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.421345 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.436715 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.436724 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:45 crc kubenswrapper[4771]: E0219 21:28:45.436867 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:45 crc kubenswrapper[4771]: E0219 21:28:45.436928 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.523234 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.523284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.523296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.523312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.523324 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.626088 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.626139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.626155 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.626175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.626190 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.652316 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.652365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.652378 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.652389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.652400 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.652412 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b"} Feb 19 21:28:45 crc kubenswrapper[4771]: 
I0219 21:28:45.654411 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b73af94-3912-40fb-95ea-7720e6279f52" containerID="15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e" exitCode=0 Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.654450 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" event={"ID":"6b73af94-3912-40fb-95ea-7720e6279f52","Type":"ContainerDied","Data":"15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.675454 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9h9qn"] Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.676084 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.677647 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.677821 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.678358 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.678384 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.683507 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.692703 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.709755 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.722383 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6
c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.728802 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.728912 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.728935 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.728964 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.728985 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.743000 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.751192 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-host\") pod \"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.751320 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-serviceca\") pod \"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.751487 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khdhp\" (UniqueName: \"kubernetes.io/projected/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-kube-api-access-khdhp\") pod \"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " 
pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.752626 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.768426 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.782083 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.796328 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.809766 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.827332 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.831672 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.831719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.831739 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.831765 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.831784 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.838618 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.849258 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.852009 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khdhp\" (UniqueName: \"kubernetes.io/projected/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-kube-api-access-khdhp\") pod \"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.852106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-host\") pod \"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.852140 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-serviceca\") pod 
\"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.852256 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-host\") pod \"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.853807 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-serviceca\") pod \"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.859129 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.867685 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.877193 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.879501 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdhp\" (UniqueName: \"kubernetes.io/projected/ccd8e7a3-6efc-407a-bd75-a762a5b72bf1-kube-api-access-khdhp\") pod \"node-ca-9h9qn\" (UID: \"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\") " pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.905854 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.934783 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.934830 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.934844 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.934863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.934874 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.943831 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:45 crc kubenswrapper[4771]: I0219 21:28:45.990207 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.001719 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9h9qn" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.033671 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.037201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.037232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.037245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.037262 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.037274 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: W0219 21:28:46.046669 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccd8e7a3_6efc_407a_bd75_a762a5b72bf1.slice/crio-35850f3944f591891aa8bdbb80e283e7a7b5a54cd1bbe084dd038859d9ae542a WatchSource:0}: Error finding container 35850f3944f591891aa8bdbb80e283e7a7b5a54cd1bbe084dd038859d9ae542a: Status 404 returned error can't find the container with id 35850f3944f591891aa8bdbb80e283e7a7b5a54cd1bbe084dd038859d9ae542a Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.074984 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.113439 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.140732 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.140762 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.140773 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.140788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.140799 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.154989 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.186946 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.226190 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc 
kubenswrapper[4771]: I0219 21:28:46.246592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.246671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.246683 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.246702 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.246730 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.270980 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.308853 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.348178 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.350954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.350994 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.351006 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.351041 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.351057 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.381797 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 07:25:01.06614207 +0000 UTC Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.385776 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.436618 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:46 crc kubenswrapper[4771]: E0219 21:28:46.436773 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.453691 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.453735 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.453746 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.453764 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.453777 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.557377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.557704 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.557722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.557745 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.557764 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.660211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.660269 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.660292 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.660322 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.660346 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.660658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9h9qn" event={"ID":"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1","Type":"ContainerStarted","Data":"be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.660729 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9h9qn" event={"ID":"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1","Type":"ContainerStarted","Data":"35850f3944f591891aa8bdbb80e283e7a7b5a54cd1bbe084dd038859d9ae542a"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.664658 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b73af94-3912-40fb-95ea-7720e6279f52" containerID="bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1" exitCode=0 Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.664729 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" event={"ID":"6b73af94-3912-40fb-95ea-7720e6279f52","Type":"ContainerDied","Data":"bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.677539 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.688901 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.710821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.722744 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.737792 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.752422 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.764358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.764406 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.764425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.764448 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.764466 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.767337 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.796908 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.812538 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de
2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.826667 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.848784 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.867058 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.867115 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.867132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.867157 
4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.867175 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.877775 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9
be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.908546 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.948730 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.970488 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc 
kubenswrapper[4771]: I0219 21:28:46.970538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.970557 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.970582 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.970600 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4771]: I0219 21:28:46.988164 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.031802 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.070056 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.073286 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.073337 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.073354 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.073379 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.073398 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.110320 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.161270 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.177390 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.177519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.177546 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.177575 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.177598 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.193939 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.232526 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.271678 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.280874 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.280930 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.280948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.280974 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.280992 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.324195 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.353688 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.381952 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 20:25:29.06894993 +0000 UTC Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.382803 4771 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.382881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.382900 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.382921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.383310 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.386877 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.428508 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.436589 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.436640 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.436711 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.436804 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.470673 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.485819 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.485875 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.485892 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.485915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.485970 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.507296 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.550692 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.588477 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.589977 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.590053 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.590079 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.590107 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.590126 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.667203 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.667370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.667410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.667445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.667591 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not 
registered Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.667659 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:55.667638343 +0000 UTC m=+35.939080843 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.667775 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.667852 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:55.667827418 +0000 UTC m=+35.939269928 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.667965 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:28:55.667946851 +0000 UTC m=+35.939389361 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.668138 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.668169 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.668191 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.668249 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:55.668229019 +0000 UTC m=+35.939671529 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.676307 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b73af94-3912-40fb-95ea-7720e6279f52" containerID="95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654" exitCode=0 Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.676412 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" event={"ID":"6b73af94-3912-40fb-95ea-7720e6279f52","Type":"ContainerDied","Data":"95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.687501 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.697461 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.697507 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.697523 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.697546 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.697564 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.702631 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.714366 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.752374 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.768507 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.768555 4771 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.768575 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:47 crc kubenswrapper[4771]: E0219 21:28:47.768637 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:55.768616716 +0000 UTC m=+36.040059226 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.768673 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.788804 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.800582 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.800654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.800747 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.800837 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.800925 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.829891 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: 
I0219 21:28:47.882123 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.903316 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.903365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.903380 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.903397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.903409 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.906984 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.950597 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:47 crc kubenswrapper[4771]: I0219 21:28:47.986001 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e
96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.006595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.006652 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.006672 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.006706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.006732 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.027919 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.068125 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.110154 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.110223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.110245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.110277 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.110299 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.121212 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.159490 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.192510 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.218485 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.218845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.218864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.218890 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.218908 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.320815 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.320859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.320871 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.320889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.320901 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.382925 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 22:30:03.375353639 +0000 UTC Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.422973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.423035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.423047 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.423072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.423096 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.437166 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:48 crc kubenswrapper[4771]: E0219 21:28:48.437311 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.525916 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.525968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.525980 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.525999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.526011 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.629362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.629428 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.629444 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.629470 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.629490 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.701528 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.705883 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b73af94-3912-40fb-95ea-7720e6279f52" containerID="91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398" exitCode=0 Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.705932 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" event={"ID":"6b73af94-3912-40fb-95ea-7720e6279f52","Type":"ContainerDied","Data":"91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.725643 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.740170 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.741911 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.741959 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.741976 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.742002 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.742046 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.757943 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.768941 
4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.783317 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f
9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.797137 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.813221 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.830696 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.845505 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.845579 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.845597 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.845621 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.845638 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.848376 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.864795 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.883431 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.918362 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.936451 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.948806 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.948867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.948885 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.948913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.948929 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.950719 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:48 crc kubenswrapper[4771]: I0219 21:28:48.968119 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.051913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.051999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.052098 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.052147 4771 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.052169 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.154785 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.154838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.154847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.154861 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.154872 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.257747 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.257825 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.257842 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.257869 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.257887 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.361583 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.361687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.361705 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.361727 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.361755 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.384089 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 16:34:37.554376979 +0000 UTC Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.436722 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.436726 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:49 crc kubenswrapper[4771]: E0219 21:28:49.436914 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:49 crc kubenswrapper[4771]: E0219 21:28:49.437099 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.464782 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.464852 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.464875 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.464906 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.464929 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.568626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.568692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.568710 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.568737 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.568753 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.677191 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.677259 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.677278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.677303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.677322 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.715267 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b73af94-3912-40fb-95ea-7720e6279f52" containerID="9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d" exitCode=0 Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.715322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" event={"ID":"6b73af94-3912-40fb-95ea-7720e6279f52","Type":"ContainerDied","Data":"9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.736048 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.750236 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.769500 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.780841 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.785052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.785121 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.785146 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.785178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.785205 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.798905 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.817179 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.832447 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.855128 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.868436 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de
2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.880844 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.887639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.887665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.887673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.887686 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.887694 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.898943 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916fa
add74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.926141 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.943000 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.955004 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.965112 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6
c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.991346 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.991417 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.991440 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.991467 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4771]: I0219 21:28:49.991490 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.093896 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.093954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.093972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.093998 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.094042 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.151199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.151266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.151291 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.151323 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.151346 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: E0219 21:28:50.168161 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a-4482ba18474d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.173792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.173876 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.173894 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.173917 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.173934 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: E0219 21:28:50.190182 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a-4482ba18474d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.194809 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.194882 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.194906 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.194936 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.194960 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: E0219 21:28:50.210901 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a-4482ba18474d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.215438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.215513 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.215536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.215565 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.215590 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: E0219 21:28:50.234067 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a-4482ba18474d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.243193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.243973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.244054 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.244084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.244103 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: E0219 21:28:50.263182 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"14ce4e64-a877-4aba-8285-cae8d7ba50e6\\\",\\\"systemUUID\\\":\\\"4d7bd1bd-3a80-4030-965a-4482ba18474d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: E0219 21:28:50.263404 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.265258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.265350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.265368 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.265393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.265415 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.368123 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.368188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.368205 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.368234 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.368257 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.384868 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 17:02:17.210462681 +0000 UTC Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.436647 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:50 crc kubenswrapper[4771]: E0219 21:28:50.436836 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.454202 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.465783 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.478350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.478425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.478491 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.478562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.478660 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.482538 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.496149 
4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.511577 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f
9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.526288 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.542355 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.568996 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.580905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.580959 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.580976 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.581006 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.581135 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.587168 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.603064 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.623410 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916fa
add74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.653623 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.669661 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.682999 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.684367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc 
kubenswrapper[4771]: I0219 21:28:50.684440 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.684460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.684519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.684541 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.696819 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.723338 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" event={"ID":"6b73af94-3912-40fb-95ea-7720e6279f52","Type":"ContainerStarted","Data":"b3fa884d496055d82d31b563ef9797c69dbfe229f070d3f39433e01f2393d82e"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.741052 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:2
3Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.743356 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerStarted","Data":"934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.744248 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.744352 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.759377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.780083 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3fa884d496055d82d31b563ef9797c69dbfe229f070d3f39433e01f2393d82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a39
60ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.780877 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.781903 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.786798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.786836 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.786851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.786869 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.786884 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.810600 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.827178 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.840893 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.851901 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.867378 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.878318 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.889941 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.889997 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.890047 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.890075 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.890092 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.894516 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.905493 
4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.920722 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f
9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.934524 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.949492 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.970698 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.974104 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.992632 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.992696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.992714 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.992737 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.992757 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4771]: I0219 21:28:50.993675 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.009237 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.029001 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3fa884d496055d82d31b563ef9797c69dbfe229f070d3f39433e01f2393d82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a39
60ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.057848 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c
8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.074603 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.088710 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.095813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.095869 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.095888 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.095919 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.095944 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.102260 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: 
I0219 21:28:51.117241 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.128211 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.143403 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.155464 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.171131 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.187528 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.204230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.204308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.204334 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.204369 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.204392 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.212398 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.259526 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.306850 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.306898 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.306915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.306935 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.306948 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.385403 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 19:39:33.05186958 +0000 UTC Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.409568 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.409618 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.409636 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.409661 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.409679 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.436950 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:51 crc kubenswrapper[4771]: E0219 21:28:51.437122 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.437317 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:51 crc kubenswrapper[4771]: E0219 21:28:51.437519 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.512507 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.512592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.512611 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.512636 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.512653 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.615956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.616073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.616111 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.616136 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.616154 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.719420 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.719493 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.719513 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.719538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.719555 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.823295 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.823375 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.823425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.823453 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.823473 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.928010 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.928093 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.928114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.928143 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.928160 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4771]: I0219 21:28:51.979295 4771 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.030986 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.031123 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.031145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.031178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.031201 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.134888 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.134963 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.134980 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.135006 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.135060 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.238758 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.238819 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.238838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.238864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.238886 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.342532 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.342607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.342631 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.342657 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.342677 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.385601 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 21:01:12.097918267 +0000 UTC Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.437130 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:52 crc kubenswrapper[4771]: E0219 21:28:52.437319 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.446479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.446563 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.446583 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.446658 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.446679 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.551969 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.552055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.552073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.552097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.552114 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.655137 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.655193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.655212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.655239 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.655262 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.752640 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2e4406c66aa1344160b6f7cb5e2eb2df69660795597fa2e4c6fc4c3fe6e7f357"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.758423 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.758471 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.758490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.758514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.758531 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.783966 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.800410 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e4406c66aa1344160b6f7cb5e2eb2df69660795597fa2e4c6fc4c3fe6e7f357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.814881 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.829430 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.846209 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.858621 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.861562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.861742 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.861770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.861796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.861814 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.876148 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.887575 
4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.902729 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f
9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.918694 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.933734 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.960653 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.965619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.965667 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.965685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.965712 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.965730 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.978737 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:52 crc kubenswrapper[4771]: I0219 21:28:52.993407 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.011929 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3fa884d496055d82d31b563ef9797c69dbfe229f070d3f39433e01f2393d82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a39
60ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.068220 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.068286 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.068308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.068332 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.068349 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.171007 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.171100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.171140 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.171173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.171198 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.273974 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.274066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.274084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.274111 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.274127 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.377322 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.377377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.377395 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.377418 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.377436 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.385757 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 01:34:37.297477733 +0000 UTC Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.436790 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.436811 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:53 crc kubenswrapper[4771]: E0219 21:28:53.436963 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:53 crc kubenswrapper[4771]: E0219 21:28:53.437182 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.480851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.480906 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.480924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.480947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.480964 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.584506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.584594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.584612 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.584637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.584655 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.687397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.687472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.687494 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.687521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.687538 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.790708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.790807 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.790826 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.790854 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.790879 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.893552 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.893620 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.893639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.893664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.893682 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.996779 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.996843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.996859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.996904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4771]: I0219 21:28:53.996922 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.100193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.100256 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.100271 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.100296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.100313 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.203072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.203132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.203149 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.203220 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.203238 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.306675 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.306721 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.306744 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.306770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.306787 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.386920 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:34:13.127705569 +0000 UTC Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.409353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.409444 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.409470 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.409502 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.409530 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.436869 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:54 crc kubenswrapper[4771]: E0219 21:28:54.437118 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.514497 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.514559 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.514571 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.514586 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.514956 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.545096 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85"] Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.545516 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.549291 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.549543 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.562677 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.575401 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.603684 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.617289 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e4406c66aa1344160b6f7cb5e2eb2df69660795597fa2e4c6fc4c3fe6e7f357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
2-19T21:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.619403 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.619430 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.619440 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.619456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.619467 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.633033 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.647104 
4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.647607 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40896374-746a-48b3-883b-2ff1c35b269f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.647649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40896374-746a-48b3-883b-2ff1c35b269f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.647670 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp7dp\" (UniqueName: \"kubernetes.io/projected/40896374-746a-48b3-883b-2ff1c35b269f-kube-api-access-gp7dp\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 
21:28:54.647882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40896374-746a-48b3-883b-2ff1c35b269f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.661912 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.671386 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.700425 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.717135 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.722602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.722636 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.722647 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 
crc kubenswrapper[4771]: I0219 21:28:54.722665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.722676 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.731388 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.745994 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.748902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40896374-746a-48b3-883b-2ff1c35b269f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.749000 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40896374-746a-48b3-883b-2ff1c35b269f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.749046 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40896374-746a-48b3-883b-2ff1c35b269f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.749068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp7dp\" (UniqueName: \"kubernetes.io/projected/40896374-746a-48b3-883b-2ff1c35b269f-kube-api-access-gp7dp\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.749769 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40896374-746a-48b3-883b-2ff1c35b269f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.749923 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40896374-746a-48b3-883b-2ff1c35b269f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.755136 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40896374-746a-48b3-883b-2ff1c35b269f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp7dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp7dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gdd85\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.755380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40896374-746a-48b3-883b-2ff1c35b269f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.767701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"05ee8ae1c4e8ed15d26a4053afcfe2aca666004cb700c5992224fe97b9277dd8"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.767886 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5825194a5b732c857f11f1662778291262c95c70c005f2c8e6e2f844ef6d0f3a"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.768699 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp7dp\" (UniqueName: \"kubernetes.io/projected/40896374-746a-48b3-883b-2ff1c35b269f-kube-api-access-gp7dp\") pod \"ovnkube-control-plane-749d76644c-gdd85\" (UID: \"40896374-746a-48b3-883b-2ff1c35b269f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.773639 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:2
3Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.785223 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.802842 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3fa884d496055d82d31b563ef9797c69dbfe229f070d3f39433e01f2393d82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a39
60ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.816961 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b18f1394-2b12-45ce-b8c0-42941b327bb1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf66fe6b0f34e221db4e8117c2351379ad2fababe2d5345845679af18e42913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7797b20637646ff15b720a9ef48377271245417de515985766a228ac7e45d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a734d993714916547bd1f165095e9e09d734640b656acd87c6613d91b495e79a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac1
17eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.824967 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.825182 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.825309 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.825397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.825480 4771 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.830944 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.848710 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05ee8ae1c4e8ed15d26a4053afcfe2aca666004cb700c5992224fe97b9277dd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5825194a5b732c857f11f1662778291262c95c70c005f2c8e6e2f844ef6d0f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.860630 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.870542 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae8f212-03b1-4a09-8a89-22a30241d9a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg8wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hl4r5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.887950 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ed4c7c-f32c-4a55-b662-93e27798acc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.905331 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.920869 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b73af94-3912-40fb-95ea-7720e6279f52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3fa884d496055d82d31b563ef9797c69dbfe229f070d3f39433e01f2393d82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42549b0a433ca2169e2dbc143832deaccc0a5108a0927d72b24d3ea72e03e61b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15823f51973b414e5f304d8595f69f836e2e3aae4f564b7084ad4fbbba75168e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd05aced665c9011c290cea6884bd1eb916faadd74a40c3fcb1ae684f9bc17c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a39
60ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95a3960ed36ebe4e663fbd423bdb188702248bd55df1922c87ba7964c916c654\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91860a022cfd28363cd54ef366f47516ba1db8147021b70350de1b82cd54e398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f9cae64891bb2b5fd46dcd316bf5b3134b2de615587ebd7e503a9a96326804d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s99bh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vsjdr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.929930 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.930982 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.930008 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40896374-746a-48b3-883b-2ff1c35b269f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp7dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gp7dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gdd85\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.931482 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.931548 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.933554 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.953530 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b85c531-0692-4483-aff6-56645d78cf68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017be955a2e04fa13ac538fd86bf26758e80594daf6b5bb1b07b5c4173abc1b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec9249f744e284ce2f3132a919047e278a918976f709d7577a0e7f3996eaf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0458282db2a95bc46d83a47120006f7b748b7b5f27986514bb1f69d08c89fac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bf986eafac9a78461d1da29a8e5d35b449a72af45e432f89fb6c3f32030bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c6721b43b0c8cfeca82061c01413bc787c997e681dac771d9da3ee1cf9dd83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1e26b96af80469d7626d71b6d9d3400d7cdd27264ea127756d16bad0a68c49a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08243139cf2fa3a710fc06f8457b2cd71ccc5cabbcbc53da3f4a5adb024229ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbf42f90a90aee550fb33ba479010b664ae0df5e6c8213adc32a7db052f9290c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.972839 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e4406c66aa1344160b6f7cb5e2eb2df69660795597fa2e4c6fc4c3fe6e7f357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
2-19T21:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.983167 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:54 crc kubenswrapper[4771]: I0219 21:28:54.994650 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db19bfc5-46dd-47ae-9608-aafec9e35f9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9b692ade485f0cc2a132fee1146a954a2ffd1114814703e589759f7f4ca03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476
a6e201bdb87904113a629a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqvbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-q64kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.008341 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.021605 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h8rw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02ee2b40-7d95-4e9d-936e-1feb3161e6c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b34e2cd1754a44acfac8660d9c5e6891cf6f236bfd0b5078b8879217cf079198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mrw58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h8rw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.036706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.036755 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.036771 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.036789 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.036802 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.058196 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jtrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t8kvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jtrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.094210 
4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h9qn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccd8e7a3-6efc-407a-bd75-a762a5b72bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be82edb57882795a8eea1404b300a1fb024cc874394a43902416354e0f14b558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khdhp\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h9qn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.138718 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.139044 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.139066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.139080 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.139112 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.242299 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.242344 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.242356 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.242372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.242384 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.319727 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-k86sj"] Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.320483 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.320584 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k86sj" podUID="c9e29256-0243-48a0-b33d-138f7031c12a" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.345874 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.345932 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.345950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.345975 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.345996 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.355077 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9h9qn" podStartSLOduration=13.355051119 podStartE2EDuration="13.355051119s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.354850193 +0000 UTC m=+35.626292763" watchObservedRunningTime="2026-02-19 21:28:55.355051119 +0000 UTC m=+35.626493629" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.355428 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5jtrr" podStartSLOduration=13.355416589 podStartE2EDuration="13.355416589s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.340546675 +0000 UTC m=+35.611989215" watchObservedRunningTime="2026-02-19 21:28:55.355416589 +0000 UTC m=+35.626859099" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.385120 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h8rw6" podStartSLOduration=15.385092555 podStartE2EDuration="15.385092555s" podCreationTimestamp="2026-02-19 21:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.384815658 +0000 UTC m=+35.656258138" watchObservedRunningTime="2026-02-19 21:28:55.385092555 +0000 UTC m=+35.656535065" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.387954 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 04:58:14.27359065 +0000 UTC Feb 19 21:28:55 crc 
kubenswrapper[4771]: I0219 21:28:55.435115 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podStartSLOduration=13.435091483 podStartE2EDuration="13.435091483s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.418834511 +0000 UTC m=+35.690277011" watchObservedRunningTime="2026-02-19 21:28:55.435091483 +0000 UTC m=+35.706534003" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.435640 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=11.435632618 podStartE2EDuration="11.435632618s" podCreationTimestamp="2026-02-19 21:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.434982169 +0000 UTC m=+35.706424669" watchObservedRunningTime="2026-02-19 21:28:55.435632618 +0000 UTC m=+35.707075128" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.436143 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.436284 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.436375 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.436613 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.449382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.449453 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.449481 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.449511 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.449528 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.456200 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.456278 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxw9t\" (UniqueName: \"kubernetes.io/projected/c9e29256-0243-48a0-b33d-138f7031c12a-kube-api-access-qxw9t\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.539845 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.539826507 podStartE2EDuration="16.539826507s" podCreationTimestamp="2026-02-19 21:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.517549212 +0000 UTC m=+35.788991752" watchObservedRunningTime="2026-02-19 21:28:55.539826507 +0000 UTC m=+35.811268987" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.552231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.552269 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.552282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc 
kubenswrapper[4771]: I0219 21:28:55.552301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.552315 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.557088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.557197 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxw9t\" (UniqueName: \"kubernetes.io/projected/c9e29256-0243-48a0-b33d-138f7031c12a-kube-api-access-qxw9t\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.557270 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.557323 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs podName:c9e29256-0243-48a0-b33d-138f7031c12a nodeName:}" failed. 
No retries permitted until 2026-02-19 21:28:56.057306782 +0000 UTC m=+36.328749272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs") pod "network-metrics-daemon-k86sj" (UID: "c9e29256-0243-48a0-b33d-138f7031c12a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.573844 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vsjdr" podStartSLOduration=13.573826781 podStartE2EDuration="13.573826781s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.560421907 +0000 UTC m=+35.831864407" watchObservedRunningTime="2026-02-19 21:28:55.573826781 +0000 UTC m=+35.845269261" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.582086 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxw9t\" (UniqueName: \"kubernetes.io/projected/c9e29256-0243-48a0-b33d-138f7031c12a-kube-api-access-qxw9t\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.599751 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podStartSLOduration=13.599731894 podStartE2EDuration="13.599731894s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.58812846 +0000 UTC m=+35.859570980" watchObservedRunningTime="2026-02-19 21:28:55.599731894 +0000 UTC m=+35.871174374" Feb 19 21:28:55 
crc kubenswrapper[4771]: I0219 21:28:55.642216 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=14.642194467 podStartE2EDuration="14.642194467s" podCreationTimestamp="2026-02-19 21:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.640560573 +0000 UTC m=+35.912003073" watchObservedRunningTime="2026-02-19 21:28:55.642194467 +0000 UTC m=+35.913636977" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.655217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.655270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.655284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.655301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.655312 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.758598 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.758788 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.758863 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:11.758826756 +0000 UTC m=+52.030269256 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.758901 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.758931 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.758974 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:11.758952559 +0000 UTC m=+52.030395059 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.759004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.759143 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.759167 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.759187 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.759198 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.759260 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:11.759244937 +0000 UTC m=+52.030687437 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.759288 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:11.759275897 +0000 UTC m=+52.030718407 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.760993 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.761073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.761092 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.761118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.761135 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.774623 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" event={"ID":"40896374-746a-48b3-883b-2ff1c35b269f","Type":"ContainerStarted","Data":"6c5116f8cfc2e83fa8743d120e3f379658a7e85fe0ed8ff749088ba3df8eb304"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.774686 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" event={"ID":"40896374-746a-48b3-883b-2ff1c35b269f","Type":"ContainerStarted","Data":"b6e7be54623272880ba9a421d6d18346ea4d0d5dbb6cfc3f5dd7f9fb49d6b7dd"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.774708 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" event={"ID":"40896374-746a-48b3-883b-2ff1c35b269f","Type":"ContainerStarted","Data":"a9468618f0184e7bdb31f55f389ffa00299b65d95ac19109b2e0e20e278f7061"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.860644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.861137 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.861253 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 
21:28:55.861346 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:55 crc kubenswrapper[4771]: E0219 21:28:55.861454 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:11.861436502 +0000 UTC m=+52.132878972 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.863480 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.863530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.863548 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.863574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.863592 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.966733 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.967166 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.967229 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.967301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4771]: I0219 21:28:55.967375 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.063262 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:56 crc kubenswrapper[4771]: E0219 21:28:56.063445 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:56 crc kubenswrapper[4771]: E0219 21:28:56.063514 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs podName:c9e29256-0243-48a0-b33d-138f7031c12a nodeName:}" failed. No retries permitted until 2026-02-19 21:28:57.06349051 +0000 UTC m=+37.334933020 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs") pod "network-metrics-daemon-k86sj" (UID: "c9e29256-0243-48a0-b33d-138f7031c12a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.070208 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.070258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.070276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.070299 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.070316 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.173255 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.173308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.173321 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.173339 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.173351 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.276290 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.276582 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.276755 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.276934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.277134 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.380348 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.380412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.380431 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.380458 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.380476 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.388704 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 05:53:52.992976982 +0000 UTC Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.436331 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:56 crc kubenswrapper[4771]: E0219 21:28:56.436522 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.483424 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.483781 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.483943 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.484134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.484312 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.587498 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.587562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.587580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.587605 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.587622 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.699541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.699600 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.699617 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.699642 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.699659 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.720500 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gdd85" podStartSLOduration=14.720476124 podStartE2EDuration="14.720476124s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:28:55.79691203 +0000 UTC m=+36.068354530" watchObservedRunningTime="2026-02-19 21:28:56.720476124 +0000 UTC m=+36.991918624" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.721913 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k86sj"] Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.722098 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:56 crc kubenswrapper[4771]: E0219 21:28:56.722240 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k86sj" podUID="c9e29256-0243-48a0-b33d-138f7031c12a" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.801319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.801356 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.801365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.801380 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.801389 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.904318 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.904383 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.904403 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.904427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4771]: I0219 21:28:56.904443 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.007610 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.007657 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.007674 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.007699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.007717 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.075545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:57 crc kubenswrapper[4771]: E0219 21:28:57.075850 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:57 crc kubenswrapper[4771]: E0219 21:28:57.076064 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs podName:c9e29256-0243-48a0-b33d-138f7031c12a nodeName:}" failed. No retries permitted until 2026-02-19 21:28:59.075977469 +0000 UTC m=+39.347420009 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs") pod "network-metrics-daemon-k86sj" (UID: "c9e29256-0243-48a0-b33d-138f7031c12a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.110729 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.110780 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.110796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.110821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.110838 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.213346 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.213426 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.213451 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.213481 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.213505 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.317175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.317250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.317276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.317303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.317319 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.389371 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 01:10:11.204250593 +0000 UTC Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.419962 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.420188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.420199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.420216 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.420226 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.436477 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.436560 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:57 crc kubenswrapper[4771]: E0219 21:28:57.436575 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:57 crc kubenswrapper[4771]: E0219 21:28:57.436720 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.522826 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.522866 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.522877 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.522894 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.522905 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.625626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.625683 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.625703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.625734 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.625757 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.729328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.729377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.729388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.729407 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.729419 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.783481 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2e1aa1a426d374bde7eaad7481a99bfe8d23259fbd2d34ad13240115024e6977"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.831369 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.831431 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.831468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.831495 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.831513 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.934760 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.934823 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.934839 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.934864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4771]: I0219 21:28:57.934882 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.038169 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.038225 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.038242 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.038274 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.038297 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.141257 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.141320 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.141341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.141365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.141382 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.244636 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.244690 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.244708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.244733 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.244750 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.349728 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.349775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.349792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.349816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.349834 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.390276 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 00:00:52.771564069 +0000 UTC Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.436748 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:58 crc kubenswrapper[4771]: E0219 21:28:58.436927 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.437377 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:58 crc kubenswrapper[4771]: E0219 21:28:58.437583 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k86sj" podUID="c9e29256-0243-48a0-b33d-138f7031c12a" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.452183 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.452222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.452239 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.452263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.452281 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.555402 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.555457 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.555475 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.555498 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.555515 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.658227 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.658287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.658306 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.658334 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.658352 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.760892 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.760946 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.760963 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.760988 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.761005 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.863838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.863895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.863913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.863941 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.863959 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.966672 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.966736 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.966750 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.966767 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4771]: I0219 21:28:58.966779 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.069418 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.069489 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.069515 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.069546 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.069569 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.097418 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:28:59 crc kubenswrapper[4771]: E0219 21:28:59.097704 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:59 crc kubenswrapper[4771]: E0219 21:28:59.097829 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs podName:c9e29256-0243-48a0-b33d-138f7031c12a nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.0977967 +0000 UTC m=+43.369239210 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs") pod "network-metrics-daemon-k86sj" (UID: "c9e29256-0243-48a0-b33d-138f7031c12a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.172399 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.172454 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.172472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.172496 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.172514 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.275867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.275934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.275957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.275986 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.276008 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.378336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.378409 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.378429 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.378459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.378480 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.390713 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 04:07:11.539922783 +0000 UTC
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.436333 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.436355 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:28:59 crc kubenswrapper[4771]: E0219 21:28:59.436510 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:28:59 crc kubenswrapper[4771]: E0219 21:28:59.436863 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.481996 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.482055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.482064 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.482077 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.482086 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.586178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.586263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.586282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.586335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.586535 4771 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.648590 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xkkw5"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.649360 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.651760 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jwglk"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.652965 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.653726 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.654344 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.655200 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.656298 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.659434 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.660103 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.663304 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.663645 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.663851 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.663866 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.664082 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.664173 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.664425 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.666742 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.666933 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.667476 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.667691 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.667875 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.668081 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.668280 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.666835 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.669241 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.675102 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qpfhv"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.669544 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.669750 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.669797 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.669854 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.679655 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8n2h"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.669904 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.671258 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.679975 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.671295 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.671337 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.680315 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.680485 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.680738 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.681139 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.681601 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.682915 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.689146 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.692924 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.711742 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.712208 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.713604 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.713622 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.715425 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.715608 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.715782 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.716508 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.716511 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.716711 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.716711 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-h7w47"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.717061 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b552v\" (UniqueName: \"kubernetes.io/projected/e971e93e-3654-4871-8809-acf956852f8a-kube-api-access-b552v\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.717103 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.717158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e971e93e-3654-4871-8809-acf956852f8a-serving-cert\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.717209 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-client-ca\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.717265 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-config\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.717619 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-h7w47"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.717739 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5tqd2"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.718262 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5tqd2"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.718688 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.719618 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-lggcc"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.720263 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lggcc"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.722256 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.722426 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.722612 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.722769 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.722952 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723152 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.722630 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723548 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723566 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723650 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723672 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723676 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.722973 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723039 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723793 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723838 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.722682 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723900 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723797 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.723992 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.724653 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.727146 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.727649 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.727969 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.728251 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.728325 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.728434 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.728537 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.728634 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.728772 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.728863 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.729196 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.731455 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.731705 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.732101 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.732408 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.732523 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.732641 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.734742 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.735477 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-km8wt"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.735781 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.736053 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.736108 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.736504 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.737204 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.737307 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.737442 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.737693 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.737805 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.737903 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.737912 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.739243 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.740095 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.743708 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.743781 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.743729 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.743934 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.744814 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b4lf5"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.745833 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h7gx8"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.747526 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.753214 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.757370 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.757721 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.758997 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.759985 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.765051 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.765588 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.765657 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.766191 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.768009 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.768281 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.768618 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.769345 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.769446 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.769716 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.769916 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.770462 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.770705 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.771743 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7b8hx"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.771917 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.772639 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.772946 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.773615 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.772728 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.775850 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.777738 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r4rmw"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.778404 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r4rmw"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.779041 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.782791 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.785425 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.785455 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.785596 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.786714 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk"
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.787028 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p"]
Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.787821 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.789399 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.790691 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.792003 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.797953 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.798587 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.799687 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.800564 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8ls29"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.802077 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.802645 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gdxph"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.803581 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.804513 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-tjnz7"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.805988 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.809543 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.810737 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.810947 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.813421 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.814069 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.815815 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fp5bm"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.817948 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-config\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.817986 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.818006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b552v\" (UniqueName: \"kubernetes.io/projected/e971e93e-3654-4871-8809-acf956852f8a-kube-api-access-b552v\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.818054 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e971e93e-3654-4871-8809-acf956852f8a-serving-cert\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.818079 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-client-ca\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.818782 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-client-ca\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.819650 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-config\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.820525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.823154 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xkkw5"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.823179 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 
21:28:59.823191 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jb2kp"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.823487 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.823829 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.824094 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.824627 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fp5bm" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.826154 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.828814 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e971e93e-3654-4871-8809-acf956852f8a-serving-cert\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.830034 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.830354 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.830374 4771 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.830444 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.830642 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.831183 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8n2h"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.832060 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.832173 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jwglk"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.832909 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbd5d"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.833410 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.833744 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.839971 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lggcc"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.839995 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-h7w47"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.840004 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5699f"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.840474 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.840492 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.840550 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.840716 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.841992 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.843055 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b4lf5"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.845065 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.845862 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.846861 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8jph"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.848154 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.848233 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.849139 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.850124 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qpfhv"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.851107 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.851295 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-km8wt"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.852268 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.853053 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8ls29"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.853988 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.854918 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h7gx8"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.855844 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5tqd2"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.856756 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fp5bm"] Feb 19 21:28:59 crc 
kubenswrapper[4771]: I0219 21:28:59.857702 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.858824 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8jph"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.859815 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.860896 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.861902 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.862851 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.874106 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gdxph"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.874467 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.875926 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7b8hx"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.876768 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.877832 4771 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5699f"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.878680 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.879660 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbd5d"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.880688 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.882008 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nl82r"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.882566 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nl82r" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.883399 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nl82r"] Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.892363 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.911612 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.931379 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 21:28:59 crc kubenswrapper[4771]: I0219 21:28:59.971670 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 21:28:59 crc kubenswrapper[4771]: 
I0219 21:28:59.991708 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.011113 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.032142 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.051929 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.072246 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.091574 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.113395 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.132636 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.152510 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.172141 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.192198 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.212793 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.232993 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.252612 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.292835 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.312399 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.332900 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.352177 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.372171 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.392055 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.392300 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:10:47.712087956 +0000 UTC Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.392387 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.412822 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.432313 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.437249 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.437385 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.462641 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.492812 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.512360 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.531991 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.551737 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 
21:29:00.571813 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.592130 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.605537 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.612205 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.632483 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.652320 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.672235 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.691691 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.719676 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.734665 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.751522 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.782837 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.790307 4771 request.go:700] Waited for 1.003270398s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-dockercfg-k9rxt&limit=500&resourceVersion=0
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.791676 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.811916 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.832451 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.852074 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.872846 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.892754 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.912392 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.932779 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.953536 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.972624 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 19 21:29:00 crc kubenswrapper[4771]: I0219 21:29:00.992619 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.012862 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.034012 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.053481 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.073124 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.092788 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.112581 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.132798 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.153383 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.172401 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.192931 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.211952 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.232697 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.272385 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.279675 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b552v\" (UniqueName: \"kubernetes.io/projected/e971e93e-3654-4871-8809-acf956852f8a-kube-api-access-b552v\") pod \"controller-manager-879f6c89f-xkkw5\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.294332 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.313074 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.332556 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.351976 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.372483 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.393402 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.412815 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.432661 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.437041 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.437097 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.454577 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.472258 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.483543 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.492094 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.512889 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.533363 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.552417 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.574227 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.591607 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.630676 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.632111 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.652352 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.672115 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.692453 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.713222 4771 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.721767 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xkkw5"]
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.733268 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.752448 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.771210 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.792592 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.796222 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" event={"ID":"e971e93e-3654-4871-8809-acf956852f8a","Type":"ContainerStarted","Data":"72544deba9571bcb93e9972457786c4f3653b14e677de6ffb47657c935c16a01"}
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.810637 4771 request.go:700] Waited for 1.927765968s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.812137 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.832567 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.913521 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.933050 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943283 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-service-ca\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943342 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a808f7e3-3664-4131-9968-5f98c4f8480f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943385 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/96dfe4d4-62ec-4dca-824e-a7770af9db35-machine-approver-tls\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943417 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7c9f264f-27e9-419d-8163-3d16ce6c5eda-images\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943454 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943523 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gndc8\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-kube-api-access-gndc8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943553 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-audit\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943603 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afc30e40-775d-4371-b176-2320928155bb-audit-dir\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943635 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tffhc\" (UniqueName: \"kubernetes.io/projected/cb5d417b-c982-4c90-a56c-73eb5672adce-kube-api-access-tffhc\") pod \"cluster-samples-operator-665b6dd947-qvwzh\" (UID: \"cb5d417b-c982-4c90-a56c-73eb5672adce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943672 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slmm5\" (UniqueName: \"kubernetes.io/projected/e7550f22-e38d-40af-a730-6e5f8ef3fef3-kube-api-access-slmm5\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943722 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-config\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943759 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a808f7e3-3664-4131-9968-5f98c4f8480f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-serving-cert\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c82393e3-5edd-4eac-b692-04f21c0e4c10-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943890 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff4fff0-509c-411a-b055-595fc81f61c3-serving-cert\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943919 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7s6m\" (UniqueName: \"kubernetes.io/projected/a808f7e3-3664-4131-9968-5f98c4f8480f-kube-api-access-x7s6m\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943964 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-trusted-ca-bundle\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.943994 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5af55e6d-0e47-4ee9-a654-2fe525618804-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944052 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93a7a0c-c589-42dc-bc58-9e0ccca50250-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhhjv\" (UniqueName: \"kubernetes.io/projected/aff4fff0-509c-411a-b055-595fc81f61c3-kube-api-access-jhhjv\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944109 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-config\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944155 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94hs5\" (UniqueName: \"kubernetes.io/projected/5c750565-2876-44fb-a7b7-4afd72a47b96-kube-api-access-94hs5\") pod \"downloads-7954f5f757-lggcc\" (UID: \"5c750565-2876-44fb-a7b7-4afd72a47b96\") " pod="openshift-console/downloads-7954f5f757-lggcc"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-etcd-client\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944511 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-image-import-ca\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944566 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7550f22-e38d-40af-a730-6e5f8ef3fef3-serving-cert\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944615 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fde96967-28c6-414f-af23-6333d9636d0d-serving-cert\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944664 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96dfe4d4-62ec-4dca-824e-a7770af9db35-config\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944712 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff4fff0-509c-411a-b055-595fc81f61c3-config\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944852 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-encryption-config\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944897 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-etcd-service-ca\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.944948 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zsh8\" (UniqueName: \"kubernetes.io/projected/2dd71911-b029-4e67-882c-eb6ad1983c00-kube-api-access-2zsh8\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945013 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5af55e6d-0e47-4ee9-a654-2fe525618804-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945112 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7txcp\" (UniqueName: \"kubernetes.io/projected/6fa7e479-a1ec-4aca-8172-2ce281048f4a-kube-api-access-7txcp\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945231 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-tls\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945284 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945355 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945415 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c82393e3-5edd-4eac-b692-04f21c0e4c10-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945484 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ts57\" (UniqueName: \"kubernetes.io/projected/96dfe4d4-62ec-4dca-824e-a7770af9db35-kube-api-access-6ts57\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945521 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5af55e6d-0e47-4ee9-a654-2fe525618804-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945550 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5af55e6d-0e47-4ee9-a654-2fe525618804-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f93a7a0c-c589-42dc-bc58-9e0ccca50250-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945879 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-trusted-ca\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.945941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5af55e6d-0e47-4ee9-a654-2fe525618804-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts"
Feb 19 21:29:01 crc kubenswrapper[4771]: E0219 21:29:01.945977 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.445956734 +0000 UTC m=+42.717399244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmnfk\" (UniqueName: \"kubernetes.io/projected/67d4896e-f092-419b-971d-aac9dbaed6d0-kube-api-access-zmnfk\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946098 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946135 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a808f7e3-3664-4131-9968-5f98c4f8480f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946172 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-serving-cert\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: \"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946239 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6pnx\" (UniqueName: \"kubernetes.io/projected/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-kube-api-access-t6pnx\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: \"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946268 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-encryption-config\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946297 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-etcd-client\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946379 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-oauth-serving-cert\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946428 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c9f264f-27e9-419d-8163-3d16ce6c5eda-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946461 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"
Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946490 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fde96967-28c6-414f-af23-6333d9636d0d-etcd-client\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") "
pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946520 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946551 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946587 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afc30e40-775d-4371-b176-2320928155bb-node-pullsecrets\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946914 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.946977 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-dir\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947004 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96dfe4d4-62ec-4dca-824e-a7770af9db35-auth-proxy-config\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947050 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-policies\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947093 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-certificates\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947116 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: \"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947162 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-client-ca\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947183 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-audit-policies\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947255 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-config\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947281 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-service-ca-bundle\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947337 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46t6\" (UniqueName: \"kubernetes.io/projected/c82393e3-5edd-4eac-b692-04f21c0e4c10-kube-api-access-j46t6\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947510 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-etcd-serving-ca\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947540 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxjzb\" (UniqueName: \"kubernetes.io/projected/f93a7a0c-c589-42dc-bc58-9e0ccca50250-kube-api-access-xxjzb\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947565 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr4t9\" (UniqueName: \"kubernetes.io/projected/7c9f264f-27e9-419d-8163-3d16ce6c5eda-kube-api-access-mr4t9\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947620 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-config\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947664 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407f05a8-4462-4951-8d34-d0d3f0e5a604-serving-cert\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947696 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947764 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-bound-sa-token\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947803 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb5d417b-c982-4c90-a56c-73eb5672adce-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qvwzh\" (UID: \"cb5d417b-c982-4c90-a56c-73eb5672adce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947832 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fa7e479-a1ec-4aca-8172-2ce281048f4a-audit-dir\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947865 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-serving-cert\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947902 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-console-config\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9f264f-27e9-419d-8163-3d16ce6c5eda-config\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947963 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-etcd-ca\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.947995 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4sj4\" (UniqueName: \"kubernetes.io/projected/fde96967-28c6-414f-af23-6333d9636d0d-kube-api-access-v4sj4\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 
21:29:01.948072 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.948105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sck8t\" (UniqueName: \"kubernetes.io/projected/afc30e40-775d-4371-b176-2320928155bb-kube-api-access-sck8t\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.948139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8jsg\" (UniqueName: \"kubernetes.io/projected/407f05a8-4462-4951-8d34-d0d3f0e5a604-kube-api-access-l8jsg\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.948169 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-serving-cert\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.948201 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-oauth-config\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.948235 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aff4fff0-509c-411a-b055-595fc81f61c3-trusted-ca\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.948277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.952243 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.972404 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 21:29:01 crc kubenswrapper[4771]: I0219 21:29:01.992348 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.012742 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.048785 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.049070 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.549003123 +0000 UTC m=+42.820445623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049137 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/410fd892-777e-4c9d-bed5-69f1a797501f-metrics-tls\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049194 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-default-certificate\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 
21:29:02.049242 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a0153e8a-327b-4622-895d-02017e5b03d7-node-bootstrap-token\") pod \"machine-config-server-tjnz7\" (UID: \"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049331 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-serving-cert\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049377 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-console-config\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049409 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e17ac1b5-90d3-4f7e-a806-0285862f3c05-srv-cert\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049446 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9f264f-27e9-419d-8163-3d16ce6c5eda-config\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" 
Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-etcd-ca\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049511 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twjlw\" (UniqueName: \"kubernetes.io/projected/308e6697-8ef5-46ec-b57e-6c86e2cfe4c3-kube-api-access-twjlw\") pod \"package-server-manager-789f6589d5-4pp7p\" (UID: \"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049543 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sck8t\" (UniqueName: \"kubernetes.io/projected/afc30e40-775d-4371-b176-2320928155bb-kube-api-access-sck8t\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049578 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-oauth-config\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049610 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6dh9\" (UniqueName: \"kubernetes.io/projected/74547cb9-9778-4565-a709-7d29768bf23f-kube-api-access-c6dh9\") pod \"cni-sysctl-allowlist-ds-jb2kp\" 
(UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049643 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-proxy-tls\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049676 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7c9f264f-27e9-419d-8163-3d16ce6c5eda-images\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049705 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049736 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-config\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049767 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-srv-cert\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: \"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049801 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049835 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049870 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tffhc\" (UniqueName: \"kubernetes.io/projected/cb5d417b-c982-4c90-a56c-73eb5672adce-kube-api-access-tffhc\") pod \"cluster-samples-operator-665b6dd947-qvwzh\" (UID: \"cb5d417b-c982-4c90-a56c-73eb5672adce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049901 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f986201-78c2-4e70-bad2-86a6f1fd68b1-service-ca-bundle\") pod 
\"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049933 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d632d3b3-abbf-4aa5-9317-64588dcf1fdd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8ls29\" (UID: \"d632d3b3-abbf-4aa5-9317-64588dcf1fdd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.049969 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a808f7e3-3664-4131-9968-5f98c4f8480f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050001 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-plugins-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050074 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: \"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050118 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-serving-cert\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff4fff0-509c-411a-b055-595fc81f61c3-serving-cert\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050236 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74547cb9-9778-4565-a709-7d29768bf23f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050267 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-trusted-ca-bundle\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050299 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5af55e6d-0e47-4ee9-a654-2fe525618804-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc 
kubenswrapper[4771]: I0219 21:29:02.050331 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93a7a0c-c589-42dc-bc58-9e0ccca50250-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050364 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-config\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050398 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050433 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-etcd-client\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050466 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-image-import-ca\") pod 
\"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050499 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k222\" (UniqueName: \"kubernetes.io/projected/3dd706f5-ac29-450c-9f17-cec521732d11-kube-api-access-7k222\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050533 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-config\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050567 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ptx2\" (UniqueName: \"kubernetes.io/projected/1c07c12f-1fa9-421e-b0da-889714e045d7-kube-api-access-2ptx2\") pod \"migrator-59844c95c7-vrkjw\" (UID: \"1c07c12f-1fa9-421e-b0da-889714e045d7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050687 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzrw9\" (UniqueName: \"kubernetes.io/projected/e17ac1b5-90d3-4f7e-a806-0285862f3c05-kube-api-access-pzrw9\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc 
kubenswrapper[4771]: I0219 21:29:02.050728 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050764 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-encryption-config\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050798 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/410fd892-777e-4c9d-bed5-69f1a797501f-trusted-ca\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050831 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/74547cb9-9778-4565-a709-7d29768bf23f-ready\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkhnl\" (UniqueName: \"kubernetes.io/projected/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-kube-api-access-mkhnl\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " 
pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050894 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050929 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5af55e6d-0e47-4ee9-a654-2fe525618804-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5af55e6d-0e47-4ee9-a654-2fe525618804-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.050996 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051053 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/308e6697-8ef5-46ec-b57e-6c86e2cfe4c3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4pp7p\" (UID: \"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051091 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f93a7a0c-c589-42dc-bc58-9e0ccca50250-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051126 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9e598f5-22b1-4345-ae26-ea5e48640c84-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-trusted-ca\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051224 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a808f7e3-3664-4131-9968-5f98c4f8480f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6pnx\" (UniqueName: \"kubernetes.io/projected/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-kube-api-access-t6pnx\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: \"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051288 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/44818bf7-6be3-4f7f-97c2-0920e255bbba-tmpfs\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-metrics-tls\") pod \"dns-default-fp5bm\" (UID: \"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051346 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-etcd-client\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051381 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051413 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afc30e40-775d-4371-b176-2320928155bb-node-pullsecrets\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051478 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gt5p\" (UniqueName: \"kubernetes.io/projected/b799beef-7d4d-4ea2-8c0c-4b98c52de22b-kube-api-access-4gt5p\") pod \"ingress-canary-nl82r\" (UID: \"b799beef-7d4d-4ea2-8c0c-4b98c52de22b\") " pod="openshift-ingress-canary/ingress-canary-nl82r" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051511 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-dir\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051542 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a0153e8a-327b-4622-895d-02017e5b03d7-certs\") pod \"machine-config-server-tjnz7\" (UID: \"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051575 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-config\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051609 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46t6\" (UniqueName: \"kubernetes.io/projected/c82393e3-5edd-4eac-b692-04f21c0e4c10-kube-api-access-j46t6\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051643 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-client-ca\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051675 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsjxw\" (UniqueName: \"kubernetes.io/projected/df159dd8-1cb1-4aa6-90c5-75168e2029fb-kube-api-access-dsjxw\") pod \"dns-operator-744455d44c-7b8hx\" (UID: \"df159dd8-1cb1-4aa6-90c5-75168e2029fb\") " pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051705 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df3295-b598-4767-b2c1-6fe0a9dbaf37-config-volume\") pod \"collect-profiles-29525595-bvfmm\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051739 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051771 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-metrics-certs\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051831 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051865 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-config-volume\") pod \"dns-default-fp5bm\" (UID: \"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051896 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgjb\" (UniqueName: \"kubernetes.io/projected/a0153e8a-327b-4622-895d-02017e5b03d7-kube-api-access-gdgjb\") pod \"machine-config-server-tjnz7\" (UID: \"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051936 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407f05a8-4462-4951-8d34-d0d3f0e5a604-serving-cert\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.051968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" 
Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052002 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-csi-data-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052065 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb5d417b-c982-4c90-a56c-73eb5672adce-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qvwzh\" (UID: \"cb5d417b-c982-4c90-a56c-73eb5672adce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052095 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fa7e479-a1ec-4aca-8172-2ce281048f4a-audit-dir\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052129 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-stats-auth\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-profile-collector-cert\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: 
\"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052190 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-images\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052224 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sj4\" (UniqueName: \"kubernetes.io/projected/fde96967-28c6-414f-af23-6333d9636d0d-kube-api-access-v4sj4\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052260 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq9g4\" (UniqueName: \"kubernetes.io/projected/d901f22a-3e48-4f3f-8516-b1176abe76e0-kube-api-access-gq9g4\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052293 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rplkf\" (UniqueName: \"kubernetes.io/projected/0c7897d6-b26c-4f91-ab7f-64da32791052-kube-api-access-rplkf\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052329 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df3295-b598-4767-b2c1-6fe0a9dbaf37-secret-volume\") pod \"collect-profiles-29525595-bvfmm\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-console-config\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052395 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8jsg\" (UniqueName: \"kubernetes.io/projected/407f05a8-4462-4951-8d34-d0d3f0e5a604-kube-api-access-l8jsg\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-serving-cert\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc 
kubenswrapper[4771]: I0219 21:29:02.052546 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b799beef-7d4d-4ea2-8c0c-4b98c52de22b-cert\") pod \"ingress-canary-nl82r\" (UID: \"b799beef-7d4d-4ea2-8c0c-4b98c52de22b\") " pod="openshift-ingress-canary/ingress-canary-nl82r" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052545 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-image-import-ca\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052605 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aff4fff0-509c-411a-b055-595fc81f61c3-trusted-ca\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052613 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9f264f-27e9-419d-8163-3d16ce6c5eda-config\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052660 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052717 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44818bf7-6be3-4f7f-97c2-0920e255bbba-webhook-cert\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052766 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjfsv\" (UniqueName: \"kubernetes.io/projected/06df3295-b598-4767-b2c1-6fe0a9dbaf37-kube-api-access-gjfsv\") pod \"collect-profiles-29525595-bvfmm\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.052953 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccjb2\" (UniqueName: \"kubernetes.io/projected/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-kube-api-access-ccjb2\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.053013 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-service-ca\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.053199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a808f7e3-3664-4131-9968-5f98c4f8480f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.053313 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/96dfe4d4-62ec-4dca-824e-a7770af9db35-machine-approver-tls\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.053423 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44818bf7-6be3-4f7f-97c2-0920e255bbba-apiservice-cert\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.053534 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: \"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.053646 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/df159dd8-1cb1-4aa6-90c5-75168e2029fb-metrics-tls\") pod \"dns-operator-744455d44c-7b8hx\" (UID: \"df159dd8-1cb1-4aa6-90c5-75168e2029fb\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.054067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-trusted-ca-bundle\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.054250 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-dir\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.054361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.054388 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afc30e40-775d-4371-b176-2320928155bb-node-pullsecrets\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055245 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gndc8\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-kube-api-access-gndc8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-audit\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055365 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afc30e40-775d-4371-b176-2320928155bb-audit-dir\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055420 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slmm5\" (UniqueName: \"kubernetes.io/projected/e7550f22-e38d-40af-a730-6e5f8ef3fef3-kube-api-access-slmm5\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/410fd892-777e-4c9d-bed5-69f1a797501f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-registration-dir\") pod 
\"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055574 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-config\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055605 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7c9f264f-27e9-419d-8163-3d16ce6c5eda-images\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055620 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c82393e3-5edd-4eac-b692-04f21c0e4c10-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055662 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d901f22a-3e48-4f3f-8516-b1176abe76e0-signing-key\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055722 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhhjv\" (UniqueName: 
\"kubernetes.io/projected/aff4fff0-509c-411a-b055-595fc81f61c3-kube-api-access-jhhjv\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7s6m\" (UniqueName: \"kubernetes.io/projected/a808f7e3-3664-4131-9968-5f98c4f8480f-kube-api-access-x7s6m\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055841 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vwwq\" (UniqueName: \"kubernetes.io/projected/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-kube-api-access-4vwwq\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055879 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e17ac1b5-90d3-4f7e-a806-0285862f3c05-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055915 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-proxy-tls\") pod \"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94hs5\" (UniqueName: \"kubernetes.io/projected/5c750565-2876-44fb-a7b7-4afd72a47b96-kube-api-access-94hs5\") pod \"downloads-7954f5f757-lggcc\" (UID: \"5c750565-2876-44fb-a7b7-4afd72a47b96\") " pod="openshift-console/downloads-7954f5f757-lggcc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fd29\" (UniqueName: \"kubernetes.io/projected/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-kube-api-access-4fd29\") pod \"dns-default-fp5bm\" (UID: \"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056085 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7550f22-e38d-40af-a730-6e5f8ef3fef3-serving-cert\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056120 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fde96967-28c6-414f-af23-6333d9636d0d-serving-cert\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: \"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056200 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96dfe4d4-62ec-4dca-824e-a7770af9db35-config\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056235 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff4fff0-509c-411a-b055-595fc81f61c3-config\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-etcd-service-ca\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056324 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zsh8\" (UniqueName: \"kubernetes.io/projected/2dd71911-b029-4e67-882c-eb6ad1983c00-kube-api-access-2zsh8\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056480 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/69705410-9808-46bd-8e5c-64a46eedf641-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gmjpk\" (UID: \"69705410-9808-46bd-8e5c-64a46eedf641\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.055658 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fa7e479-a1ec-4aca-8172-2ce281048f4a-audit-dir\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.056616 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-etcd-ca\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.057918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-etcd-service-ca\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.058332 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: 
I0219 21:29:02.058683 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-client-ca\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.058306 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-config\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.059139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96dfe4d4-62ec-4dca-824e-a7770af9db35-config\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.059190 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5af55e6d-0e47-4ee9-a654-2fe525618804-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.059513 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afc30e40-775d-4371-b176-2320928155bb-audit-dir\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.060357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-service-ca\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.060629 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff4fff0-509c-411a-b055-595fc81f61c3-config\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.060717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb5d417b-c982-4c90-a56c-73eb5672adce-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qvwzh\" (UID: \"cb5d417b-c982-4c90-a56c-73eb5672adce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.060855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-audit\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.060849 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-trusted-ca\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061491 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5af55e6d-0e47-4ee9-a654-2fe525618804-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061680 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-etcd-client\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061779 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5af55e6d-0e47-4ee9-a654-2fe525618804-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061785 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-tls\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: 
\"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061864 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061911 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7txcp\" (UniqueName: \"kubernetes.io/projected/6fa7e479-a1ec-4aca-8172-2ce281048f4a-kube-api-access-7txcp\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061947 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.061992 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phwjj\" (UniqueName: \"kubernetes.io/projected/8f986201-78c2-4e70-bad2-86a6f1fd68b1-kube-api-access-phwjj\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.062107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-config\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.062251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93a7a0c-c589-42dc-bc58-9e0ccca50250-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.062668 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.062666 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-config\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.064225 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.064540 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.064734 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a808f7e3-3664-4131-9968-5f98c4f8480f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.064808 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.064942 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c82393e3-5edd-4eac-b692-04f21c0e4c10-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.065004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ts57\" (UniqueName: \"kubernetes.io/projected/96dfe4d4-62ec-4dca-824e-a7770af9db35-kube-api-access-6ts57\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.065103 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5af55e6d-0e47-4ee9-a654-2fe525618804-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.065171 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-oauth-config\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.065369 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/96dfe4d4-62ec-4dca-824e-a7770af9db35-machine-approver-tls\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.066541 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c82393e3-5edd-4eac-b692-04f21c0e4c10-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.066816 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.066890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5af55e6d-0e47-4ee9-a654-2fe525618804-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.066993 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sc2h\" (UniqueName: \"kubernetes.io/projected/410fd892-777e-4c9d-bed5-69f1a797501f-kube-api-access-9sc2h\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-serving-cert\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067131 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-encryption-config\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: 
I0219 21:29:02.067383 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74547cb9-9778-4565-a709-7d29768bf23f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-mountpoint-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.067488 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.567465674 +0000 UTC m=+42.838908174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067539 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmnfk\" (UniqueName: \"kubernetes.io/projected/67d4896e-f092-419b-971d-aac9dbaed6d0-kube-api-access-zmnfk\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067482 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aff4fff0-509c-411a-b055-595fc81f61c3-trusted-ca\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067629 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd706f5-ac29-450c-9f17-cec521732d11-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067856 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c9e598f5-22b1-4345-ae26-ea5e48640c84-config\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs9ck\" (UniqueName: \"kubernetes.io/projected/44818bf7-6be3-4f7f-97c2-0920e255bbba-kube-api-access-cs9ck\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.067992 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s5qs\" (UniqueName: \"kubernetes.io/projected/d632d3b3-abbf-4aa5-9317-64588dcf1fdd-kube-api-access-6s5qs\") pod \"multus-admission-controller-857f4d67dd-8ls29\" (UID: \"d632d3b3-abbf-4aa5-9317-64588dcf1fdd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.068103 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-serving-cert\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: \"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.068168 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-encryption-config\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 
19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.068250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.068291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d901f22a-3e48-4f3f-8516-b1176abe76e0-signing-cabundle\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.068372 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmdfd\" (UniqueName: \"kubernetes.io/projected/69705410-9808-46bd-8e5c-64a46eedf641-kube-api-access-fmdfd\") pod \"control-plane-machine-set-operator-78cbb6b69f-gmjpk\" (UID: \"69705410-9808-46bd-8e5c-64a46eedf641\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.068422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.068501 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-serving-cert\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.069501 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-oauth-serving-cert\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.069787 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c9f264f-27e9-419d-8163-3d16ce6c5eda-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.069917 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-serving-cert\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.070227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-tls\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.070235 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/a808f7e3-3664-4131-9968-5f98c4f8480f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.070324 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.070800 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fde96967-28c6-414f-af23-6333d9636d0d-etcd-client\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.071272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.071335 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-oauth-serving-cert\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.071447 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.071577 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f93a7a0c-c589-42dc-bc58-9e0ccca50250-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.071673 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd706f5-ac29-450c-9f17-cec521732d11-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.071919 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.072908 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.073073 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.073616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96dfe4d4-62ec-4dca-824e-a7770af9db35-auth-proxy-config\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.073696 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-policies\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.073697 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.073899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-certificates\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.073944 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: \"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.073979 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074062 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-service-ca-bundle\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074093 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-audit-policies\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074123 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074161 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-etcd-serving-ca\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074194 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxjzb\" (UniqueName: \"kubernetes.io/projected/f93a7a0c-c589-42dc-bc58-9e0ccca50250-kube-api-access-xxjzb\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr4t9\" (UniqueName: \"kubernetes.io/projected/7c9f264f-27e9-419d-8163-3d16ce6c5eda-kube-api-access-mr4t9\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074260 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e598f5-22b1-4345-ae26-ea5e48640c84-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-socket-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074330 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-config\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074123 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-etcd-client\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074366 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-bound-sa-token\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074419 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47sfp\" (UniqueName: \"kubernetes.io/projected/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-kube-api-access-47sfp\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: \"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074461 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bsl6\" (UniqueName: \"kubernetes.io/projected/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-kube-api-access-6bsl6\") pod \"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.074786 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-serving-cert\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: \"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.075135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-policies\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.075131 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/96dfe4d4-62ec-4dca-824e-a7770af9db35-auth-proxy-config\") pod \"machine-approver-56656f9798-g8s6z\" (UID: 
\"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.075499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/afc30e40-775d-4371-b176-2320928155bb-etcd-serving-ca\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.076092 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/407f05a8-4462-4951-8d34-d0d3f0e5a604-service-ca-bundle\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.076147 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: \"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.076433 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fde96967-28c6-414f-af23-6333d9636d0d-config\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.076650 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/6fa7e479-a1ec-4aca-8172-2ce281048f4a-encryption-config\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.076768 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6fa7e479-a1ec-4aca-8172-2ce281048f4a-audit-policies\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.076947 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fde96967-28c6-414f-af23-6333d9636d0d-etcd-client\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.078180 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.078365 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c9f264f-27e9-419d-8163-3d16ce6c5eda-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.078399 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-certificates\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.078980 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7550f22-e38d-40af-a730-6e5f8ef3fef3-serving-cert\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.079153 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c82393e3-5edd-4eac-b692-04f21c0e4c10-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.079243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fde96967-28c6-414f-af23-6333d9636d0d-serving-cert\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.079289 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc 
kubenswrapper[4771]: I0219 21:29:02.079420 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc30e40-775d-4371-b176-2320928155bb-serving-cert\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.079533 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff4fff0-509c-411a-b055-595fc81f61c3-serving-cert\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.080399 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.081944 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.082005 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: 
\"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.082290 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/407f05a8-4462-4951-8d34-d0d3f0e5a604-serving-cert\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.085107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5af55e6d-0e47-4ee9-a654-2fe525618804-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.119052 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a808f7e3-3664-4131-9968-5f98c4f8480f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.130855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tffhc\" (UniqueName: \"kubernetes.io/projected/cb5d417b-c982-4c90-a56c-73eb5672adce-kube-api-access-tffhc\") pod \"cluster-samples-operator-665b6dd947-qvwzh\" (UID: \"cb5d417b-c982-4c90-a56c-73eb5672adce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.150252 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sck8t\" (UniqueName: 
\"kubernetes.io/projected/afc30e40-775d-4371-b176-2320928155bb-kube-api-access-sck8t\") pod \"apiserver-76f77b778f-jwglk\" (UID: \"afc30e40-775d-4371-b176-2320928155bb\") " pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.169576 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gndc8\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-kube-api-access-gndc8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175383 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.175570 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.67553997 +0000 UTC m=+42.946982450 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175647 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e598f5-22b1-4345-ae26-ea5e48640c84-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-socket-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47sfp\" (UniqueName: \"kubernetes.io/projected/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-kube-api-access-47sfp\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: \"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bsl6\" (UniqueName: \"kubernetes.io/projected/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-kube-api-access-6bsl6\") pod 
\"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175726 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/410fd892-777e-4c9d-bed5-69f1a797501f-metrics-tls\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175742 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-default-certificate\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175757 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a0153e8a-327b-4622-895d-02017e5b03d7-node-bootstrap-token\") pod \"machine-config-server-tjnz7\" (UID: \"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175774 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e17ac1b5-90d3-4f7e-a806-0285862f3c05-srv-cert\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175795 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twjlw\" 
(UniqueName: \"kubernetes.io/projected/308e6697-8ef5-46ec-b57e-6c86e2cfe4c3-kube-api-access-twjlw\") pod \"package-server-manager-789f6589d5-4pp7p\" (UID: \"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6dh9\" (UniqueName: \"kubernetes.io/projected/74547cb9-9778-4565-a709-7d29768bf23f-kube-api-access-c6dh9\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175839 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-proxy-tls\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175860 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175882 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-config\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-srv-cert\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: \"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175928 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175951 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f986201-78c2-4e70-bad2-86a6f1fd68b1-service-ca-bundle\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d632d3b3-abbf-4aa5-9317-64588dcf1fdd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8ls29\" (UID: \"d632d3b3-abbf-4aa5-9317-64588dcf1fdd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.175996 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-plugins-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176033 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: \"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74547cb9-9778-4565-a709-7d29768bf23f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176082 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176119 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k222\" (UniqueName: \"kubernetes.io/projected/3dd706f5-ac29-450c-9f17-cec521732d11-kube-api-access-7k222\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176143 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-config\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176176 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ptx2\" (UniqueName: \"kubernetes.io/projected/1c07c12f-1fa9-421e-b0da-889714e045d7-kube-api-access-2ptx2\") pod \"migrator-59844c95c7-vrkjw\" (UID: \"1c07c12f-1fa9-421e-b0da-889714e045d7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176200 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzrw9\" (UniqueName: \"kubernetes.io/projected/e17ac1b5-90d3-4f7e-a806-0285862f3c05-kube-api-access-pzrw9\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176226 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/410fd892-777e-4c9d-bed5-69f1a797501f-trusted-ca\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/74547cb9-9778-4565-a709-7d29768bf23f-ready\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176275 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkhnl\" (UniqueName: \"kubernetes.io/projected/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-kube-api-access-mkhnl\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176299 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/308e6697-8ef5-46ec-b57e-6c86e2cfe4c3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4pp7p\" (UID: \"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9e598f5-22b1-4345-ae26-ea5e48640c84-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" 
Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176384 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/44818bf7-6be3-4f7f-97c2-0920e255bbba-tmpfs\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176405 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-metrics-tls\") pod \"dns-default-fp5bm\" (UID: \"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176426 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gt5p\" (UniqueName: \"kubernetes.io/projected/b799beef-7d4d-4ea2-8c0c-4b98c52de22b-kube-api-access-4gt5p\") pod \"ingress-canary-nl82r\" (UID: \"b799beef-7d4d-4ea2-8c0c-4b98c52de22b\") " pod="openshift-ingress-canary/ingress-canary-nl82r" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176441 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a0153e8a-327b-4622-895d-02017e5b03d7-certs\") pod \"machine-config-server-tjnz7\" (UID: \"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsjxw\" (UniqueName: \"kubernetes.io/projected/df159dd8-1cb1-4aa6-90c5-75168e2029fb-kube-api-access-dsjxw\") pod \"dns-operator-744455d44c-7b8hx\" (UID: \"df159dd8-1cb1-4aa6-90c5-75168e2029fb\") " pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" Feb 19 21:29:02 crc 
kubenswrapper[4771]: I0219 21:29:02.176475 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df3295-b598-4767-b2c1-6fe0a9dbaf37-config-volume\") pod \"collect-profiles-29525595-bvfmm\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176491 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-metrics-certs\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-config-volume\") pod \"dns-default-fp5bm\" (UID: 
\"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176562 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgjb\" (UniqueName: \"kubernetes.io/projected/a0153e8a-327b-4622-895d-02017e5b03d7-kube-api-access-gdgjb\") pod \"machine-config-server-tjnz7\" (UID: \"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176578 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-csi-data-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-profile-collector-cert\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: \"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176612 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-images\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176628 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-stats-auth\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq9g4\" (UniqueName: \"kubernetes.io/projected/d901f22a-3e48-4f3f-8516-b1176abe76e0-kube-api-access-gq9g4\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176662 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rplkf\" (UniqueName: \"kubernetes.io/projected/0c7897d6-b26c-4f91-ab7f-64da32791052-kube-api-access-rplkf\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df3295-b598-4767-b2c1-6fe0a9dbaf37-secret-volume\") pod \"collect-profiles-29525595-bvfmm\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176705 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b799beef-7d4d-4ea2-8c0c-4b98c52de22b-cert\") pod \"ingress-canary-nl82r\" (UID: \"b799beef-7d4d-4ea2-8c0c-4b98c52de22b\") " pod="openshift-ingress-canary/ingress-canary-nl82r" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.176816 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-socket-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177087 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/44818bf7-6be3-4f7f-97c2-0920e255bbba-webhook-cert\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjfsv\" (UniqueName: \"kubernetes.io/projected/06df3295-b598-4767-b2c1-6fe0a9dbaf37-kube-api-access-gjfsv\") pod \"collect-profiles-29525595-bvfmm\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177133 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccjb2\" (UniqueName: \"kubernetes.io/projected/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-kube-api-access-ccjb2\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44818bf7-6be3-4f7f-97c2-0920e255bbba-apiservice-cert\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 
21:29:02.177173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: \"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177197 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/df159dd8-1cb1-4aa6-90c5-75168e2029fb-metrics-tls\") pod \"dns-operator-744455d44c-7b8hx\" (UID: \"df159dd8-1cb1-4aa6-90c5-75168e2029fb\") " pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177222 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/410fd892-777e-4c9d-bed5-69f1a797501f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-registration-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177254 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d901f22a-3e48-4f3f-8516-b1176abe76e0-signing-key\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" 
Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177275 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vwwq\" (UniqueName: \"kubernetes.io/projected/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-kube-api-access-4vwwq\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177290 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e17ac1b5-90d3-4f7e-a806-0285862f3c05-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177305 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-proxy-tls\") pod \"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177345 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fd29\" (UniqueName: \"kubernetes.io/projected/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-kube-api-access-4fd29\") pod \"dns-default-fp5bm\" (UID: \"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177373 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: 
\"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177396 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/69705410-9808-46bd-8e5c-64a46eedf641-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gmjpk\" (UID: \"69705410-9808-46bd-8e5c-64a46eedf641\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177419 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phwjj\" (UniqueName: \"kubernetes.io/projected/8f986201-78c2-4e70-bad2-86a6f1fd68b1-kube-api-access-phwjj\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177444 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177460 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-serving-cert\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177475 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74547cb9-9778-4565-a709-7d29768bf23f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177490 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-mountpoint-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177510 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sc2h\" (UniqueName: \"kubernetes.io/projected/410fd892-777e-4c9d-bed5-69f1a797501f-kube-api-access-9sc2h\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e598f5-22b1-4345-ae26-ea5e48640c84-config\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177541 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs9ck\" (UniqueName: \"kubernetes.io/projected/44818bf7-6be3-4f7f-97c2-0920e255bbba-kube-api-access-cs9ck\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177557 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s5qs\" (UniqueName: \"kubernetes.io/projected/d632d3b3-abbf-4aa5-9317-64588dcf1fdd-kube-api-access-6s5qs\") pod \"multus-admission-controller-857f4d67dd-8ls29\" (UID: \"d632d3b3-abbf-4aa5-9317-64588dcf1fdd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd706f5-ac29-450c-9f17-cec521732d11-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177593 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d901f22a-3e48-4f3f-8516-b1176abe76e0-signing-cabundle\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177608 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmdfd\" (UniqueName: \"kubernetes.io/projected/69705410-9808-46bd-8e5c-64a46eedf641-kube-api-access-fmdfd\") pod \"control-plane-machine-set-operator-78cbb6b69f-gmjpk\" (UID: \"69705410-9808-46bd-8e5c-64a46eedf641\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.177626 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3dd706f5-ac29-450c-9f17-cec521732d11-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.178077 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74547cb9-9778-4565-a709-7d29768bf23f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.178193 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd706f5-ac29-450c-9f17-cec521732d11-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.178509 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/44818bf7-6be3-4f7f-97c2-0920e255bbba-tmpfs\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.179755 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.180445 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/410fd892-777e-4c9d-bed5-69f1a797501f-metrics-tls\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.180732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.181542 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-proxy-tls\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.181653 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-config\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.181870 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.181903 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f986201-78c2-4e70-bad2-86a6f1fd68b1-service-ca-bundle\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.181972 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-config\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.182139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-plugins-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.182511 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/74547cb9-9778-4565-a709-7d29768bf23f-ready\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.182780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/410fd892-777e-4c9d-bed5-69f1a797501f-trusted-ca\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.183239 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-metrics-tls\") pod \"dns-default-fp5bm\" (UID: \"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.183843 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e17ac1b5-90d3-4f7e-a806-0285862f3c05-srv-cert\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.183975 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9e598f5-22b1-4345-ae26-ea5e48640c84-config\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.184088 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: \"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.184504 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.684472742 +0000 UTC m=+42.955915242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.185342 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-srv-cert\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: \"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.186417 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd706f5-ac29-450c-9f17-cec521732d11-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.187225 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d901f22a-3e48-4f3f-8516-b1176abe76e0-signing-cabundle\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:02 crc 
kubenswrapper[4771]: I0219 21:29:02.188279 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/308e6697-8ef5-46ec-b57e-6c86e2cfe4c3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4pp7p\" (UID: \"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.188722 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-default-certificate\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.189703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a0153e8a-327b-4622-895d-02017e5b03d7-certs\") pod \"machine-config-server-tjnz7\" (UID: \"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.189963 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.190746 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a0153e8a-327b-4622-895d-02017e5b03d7-node-bootstrap-token\") pod \"machine-config-server-tjnz7\" (UID: 
\"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.191086 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d632d3b3-abbf-4aa5-9317-64588dcf1fdd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8ls29\" (UID: \"d632d3b3-abbf-4aa5-9317-64588dcf1fdd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.191559 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-serving-cert\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.191566 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b799beef-7d4d-4ea2-8c0c-4b98c52de22b-cert\") pod \"ingress-canary-nl82r\" (UID: \"b799beef-7d4d-4ea2-8c0c-4b98c52de22b\") " pod="openshift-ingress-canary/ingress-canary-nl82r" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.192451 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9e598f5-22b1-4345-ae26-ea5e48640c84-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.192461 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/df159dd8-1cb1-4aa6-90c5-75168e2029fb-metrics-tls\") pod \"dns-operator-744455d44c-7b8hx\" (UID: 
\"df159dd8-1cb1-4aa6-90c5-75168e2029fb\") " pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.192536 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74547cb9-9778-4565-a709-7d29768bf23f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.192636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-registration-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.193329 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-profile-collector-cert\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: \"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.193404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df3295-b598-4767-b2c1-6fe0a9dbaf37-secret-volume\") pod \"collect-profiles-29525595-bvfmm\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.193601 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-config-volume\") pod \"dns-default-fp5bm\" 
(UID: \"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.193680 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-images\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.193821 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-csi-data-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.193954 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: \"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.194012 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-mountpoint-dir\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.194702 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df3295-b598-4767-b2c1-6fe0a9dbaf37-config-volume\") pod \"collect-profiles-29525595-bvfmm\" (UID: 
\"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.195276 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.196369 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e17ac1b5-90d3-4f7e-a806-0285862f3c05-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.196375 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-stats-auth\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.198673 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-proxy-tls\") pod \"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.198963 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/44818bf7-6be3-4f7f-97c2-0920e255bbba-webhook-cert\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.199659 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f986201-78c2-4e70-bad2-86a6f1fd68b1-metrics-certs\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.199850 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d901f22a-3e48-4f3f-8516-b1176abe76e0-signing-key\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.199976 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/69705410-9808-46bd-8e5c-64a46eedf641-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gmjpk\" (UID: \"69705410-9808-46bd-8e5c-64a46eedf641\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.200405 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46t6\" (UniqueName: \"kubernetes.io/projected/c82393e3-5edd-4eac-b692-04f21c0e4c10-kube-api-access-j46t6\") pod \"openshift-apiserver-operator-796bbdcf4f-wr7t7\" (UID: \"c82393e3-5edd-4eac-b692-04f21c0e4c10\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 
21:29:02.201120 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/44818bf7-6be3-4f7f-97c2-0920e255bbba-apiservice-cert\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.209663 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8jsg\" (UniqueName: \"kubernetes.io/projected/407f05a8-4462-4951-8d34-d0d3f0e5a604-kube-api-access-l8jsg\") pod \"authentication-operator-69f744f599-km8wt\" (UID: \"407f05a8-4462-4951-8d34-d0d3f0e5a604\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.229701 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhhjv\" (UniqueName: \"kubernetes.io/projected/aff4fff0-509c-411a-b055-595fc81f61c3-kube-api-access-jhhjv\") pod \"console-operator-58897d9998-5tqd2\" (UID: \"aff4fff0-509c-411a-b055-595fc81f61c3\") " pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.252650 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7s6m\" (UniqueName: \"kubernetes.io/projected/a808f7e3-3664-4131-9968-5f98c4f8480f-kube-api-access-x7s6m\") pod \"cluster-image-registry-operator-dc59b4c8b-th9zl\" (UID: \"a808f7e3-3664-4131-9968-5f98c4f8480f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.270824 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6pnx\" (UniqueName: \"kubernetes.io/projected/f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6-kube-api-access-t6pnx\") pod \"openshift-config-operator-7777fb866f-hgdkp\" (UID: 
\"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.277285 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.282538 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.282903 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.782856214 +0000 UTC m=+43.054298734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.284170 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.284694 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.784676474 +0000 UTC m=+43.056118984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.291171 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.297125 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.303963 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.304230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sj4\" (UniqueName: \"kubernetes.io/projected/fde96967-28c6-414f-af23-6333d9636d0d-kube-api-access-v4sj4\") pod \"etcd-operator-b45778765-h7gx8\" (UID: \"fde96967-28c6-414f-af23-6333d9636d0d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.308850 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.321218 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.325806 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slmm5\" (UniqueName: \"kubernetes.io/projected/e7550f22-e38d-40af-a730-6e5f8ef3fef3-kube-api-access-slmm5\") pod \"route-controller-manager-6576b87f9c-sl9xh\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.332885 4771 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.343481 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94hs5\" (UniqueName: \"kubernetes.io/projected/5c750565-2876-44fb-a7b7-4afd72a47b96-kube-api-access-94hs5\") pod \"downloads-7954f5f757-lggcc\" (UID: \"5c750565-2876-44fb-a7b7-4afd72a47b96\") " pod="openshift-console/downloads-7954f5f757-lggcc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.381307 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7txcp\" (UniqueName: \"kubernetes.io/projected/6fa7e479-a1ec-4aca-8172-2ce281048f4a-kube-api-access-7txcp\") pod \"apiserver-7bbb656c7d-286qc\" (UID: \"6fa7e479-a1ec-4aca-8172-2ce281048f4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.386088 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.386746 4771 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.886727716 +0000 UTC m=+43.158170186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.402427 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zsh8\" (UniqueName: \"kubernetes.io/projected/2dd71911-b029-4e67-882c-eb6ad1983c00-kube-api-access-2zsh8\") pod \"oauth-openshift-558db77b4-c8n2h\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.419606 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ts57\" (UniqueName: \"kubernetes.io/projected/96dfe4d4-62ec-4dca-824e-a7770af9db35-kube-api-access-6ts57\") pod \"machine-approver-56656f9798-g8s6z\" (UID: \"96dfe4d4-62ec-4dca-824e-a7770af9db35\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.424393 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.437890 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5af55e6d-0e47-4ee9-a654-2fe525618804-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kkkts\" (UID: \"5af55e6d-0e47-4ee9-a654-2fe525618804\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.455266 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.469228 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmnfk\" (UniqueName: \"kubernetes.io/projected/67d4896e-f092-419b-971d-aac9dbaed6d0-kube-api-access-zmnfk\") pod \"console-f9d7485db-h7w47\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.474248 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-bound-sa-token\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.475375 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.488623 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.489557 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:02.989538058 +0000 UTC m=+43.260980528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.509050 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.512263 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxjzb\" (UniqueName: \"kubernetes.io/projected/f93a7a0c-c589-42dc-bc58-9e0ccca50250-kube-api-access-xxjzb\") pod \"openshift-controller-manager-operator-756b6f6bc6-wv4nf\" (UID: \"f93a7a0c-c589-42dc-bc58-9e0ccca50250\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.518673 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.530594 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr4t9\" (UniqueName: \"kubernetes.io/projected/7c9f264f-27e9-419d-8163-3d16ce6c5eda-kube-api-access-mr4t9\") pod \"machine-api-operator-5694c8668f-qpfhv\" (UID: \"7c9f264f-27e9-419d-8163-3d16ce6c5eda\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.546045 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.547709 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.552875 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47sfp\" (UniqueName: \"kubernetes.io/projected/4037bcfd-4c3e-4c7a-95c7-208c2c14ef70-kube-api-access-47sfp\") pod \"catalog-operator-68c6474976-cs5n6\" (UID: \"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.554726 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.561874 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.568386 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bsl6\" (UniqueName: \"kubernetes.io/projected/dfccc5a0-1d91-433b-8c5d-41734a1f8ab7-kube-api-access-6bsl6\") pod \"machine-config-controller-84d6567774-5699f\" (UID: \"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.581954 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-lggcc" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.590387 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.590848 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.090834399 +0000 UTC m=+43.362276869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.591841 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5b7s6\" (UID: \"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.609709 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twjlw\" (UniqueName: 
\"kubernetes.io/projected/308e6697-8ef5-46ec-b57e-6c86e2cfe4c3-kube-api-access-twjlw\") pod \"package-server-manager-789f6589d5-4pp7p\" (UID: \"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.629527 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9e598f5-22b1-4345-ae26-ea5e48640c84-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s24kp\" (UID: \"c9e598f5-22b1-4345-ae26-ea5e48640c84\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.645814 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.646060 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k222\" (UniqueName: \"kubernetes.io/projected/3dd706f5-ac29-450c-9f17-cec521732d11-kube-api-access-7k222\") pod \"kube-storage-version-migrator-operator-b67b599dd-shgdd\" (UID: \"3dd706f5-ac29-450c-9f17-cec521732d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.665684 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.666712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phwjj\" (UniqueName: \"kubernetes.io/projected/8f986201-78c2-4e70-bad2-86a6f1fd68b1-kube-api-access-phwjj\") pod \"router-default-5444994796-r4rmw\" (UID: \"8f986201-78c2-4e70-bad2-86a6f1fd68b1\") " pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.680517 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.686135 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.689083 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6dh9\" (UniqueName: \"kubernetes.io/projected/74547cb9-9778-4565-a709-7d29768bf23f-kube-api-access-c6dh9\") pod \"cni-sysctl-allowlist-ds-jb2kp\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.692809 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.693207 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.193189158 +0000 UTC m=+43.464631628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.699280 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h7gx8"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.710500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ptx2\" (UniqueName: \"kubernetes.io/projected/1c07c12f-1fa9-421e-b0da-889714e045d7-kube-api-access-2ptx2\") pod \"migrator-59844c95c7-vrkjw\" (UID: \"1c07c12f-1fa9-421e-b0da-889714e045d7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.725418 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.730296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzrw9\" (UniqueName: \"kubernetes.io/projected/e17ac1b5-90d3-4f7e-a806-0285862f3c05-kube-api-access-pzrw9\") pod \"olm-operator-6b444d44fb-vclgf\" (UID: \"e17ac1b5-90d3-4f7e-a806-0285862f3c05\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.740291 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jwglk"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.743087 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.750179 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e962ec8f-cfdf-4c17-8ae0-061d016a43d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-249lb\" (UID: \"e962ec8f-cfdf-4c17-8ae0-061d016a43d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.769452 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs9ck\" (UniqueName: \"kubernetes.io/projected/44818bf7-6be3-4f7f-97c2-0920e255bbba-kube-api-access-cs9ck\") pod \"packageserver-d55dfcdfc-qddxl\" (UID: \"44818bf7-6be3-4f7f-97c2-0920e255bbba\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.785362 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.789725 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s5qs\" (UniqueName: \"kubernetes.io/projected/d632d3b3-abbf-4aa5-9317-64588dcf1fdd-kube-api-access-6s5qs\") pod \"multus-admission-controller-857f4d67dd-8ls29\" (UID: \"d632d3b3-abbf-4aa5-9317-64588dcf1fdd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.793651 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.793793 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.794481 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.29446449 +0000 UTC m=+43.565906960 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.802618 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:02 crc kubenswrapper[4771]: W0219 21:29:02.804813 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfde96967_28c6_414f_af23_6333d9636d0d.slice/crio-f9dd7bde5282716e52e67c3038c1c505675479899b7646ad271887715753bfa7 WatchSource:0}: Error finding container f9dd7bde5282716e52e67c3038c1c505675479899b7646ad271887715753bfa7: Status 404 returned error can't find the container with id f9dd7bde5282716e52e67c3038c1c505675479899b7646ad271887715753bfa7 Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.826578 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.827475 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-km8wt"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.832486 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5tqd2"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.834245 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fd29\" (UniqueName: 
\"kubernetes.io/projected/a42ac7d5-b603-4acc-bdd3-89b710caf6f4-kube-api-access-4fd29\") pod \"dns-default-fp5bm\" (UID: \"a42ac7d5-b603-4acc-bdd3-89b710caf6f4\") " pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.839009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" event={"ID":"5af55e6d-0e47-4ee9-a654-2fe525618804","Type":"ContainerStarted","Data":"4ae0615c5dc8bb0a66c54f2bca60e2de94b560a7986fca9cc08da1ace57d2ecc"} Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.840529 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" event={"ID":"e971e93e-3654-4871-8809-acf956852f8a","Type":"ContainerStarted","Data":"a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429"} Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.841961 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.851436 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.851928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkhnl\" (UniqueName: \"kubernetes.io/projected/6f6406cd-da81-49c4-b8d6-c2a7578b8b3b-kube-api-access-mkhnl\") pod \"csi-hostpathplugin-h8jph\" (UID: \"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.858926 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" 
event={"ID":"96dfe4d4-62ec-4dca-824e-a7770af9db35","Type":"ContainerStarted","Data":"bf21ab889c814677ef5022c4739dd07b5edbf2e880643451ed4c33ba83de4baa"} Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.872712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq9g4\" (UniqueName: \"kubernetes.io/projected/d901f22a-3e48-4f3f-8516-b1176abe76e0-kube-api-access-gq9g4\") pod \"service-ca-9c57cc56f-gdxph\" (UID: \"d901f22a-3e48-4f3f-8516-b1176abe76e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.877486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gt5p\" (UniqueName: \"kubernetes.io/projected/b799beef-7d4d-4ea2-8c0c-4b98c52de22b-kube-api-access-4gt5p\") pod \"ingress-canary-nl82r\" (UID: \"b799beef-7d4d-4ea2-8c0c-4b98c52de22b\") " pod="openshift-ingress-canary/ingress-canary-nl82r" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.893054 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rplkf\" (UniqueName: \"kubernetes.io/projected/0c7897d6-b26c-4f91-ab7f-64da32791052-kube-api-access-rplkf\") pod \"marketplace-operator-79b997595-vbd5d\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.894288 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.894860 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.895190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:02 crc kubenswrapper[4771]: E0219 21:29:02.896453 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.396439509 +0000 UTC m=+43.667881979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.906742 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.908446 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8n2h"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.909381 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp"] Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.909448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsjxw\" (UniqueName: \"kubernetes.io/projected/df159dd8-1cb1-4aa6-90c5-75168e2029fb-kube-api-access-dsjxw\") pod \"dns-operator-744455d44c-7b8hx\" (UID: \"df159dd8-1cb1-4aa6-90c5-75168e2029fb\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.927292 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.933366 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.943465 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/410fd892-777e-4c9d-bed5-69f1a797501f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.943866 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" Feb 19 21:29:02 crc kubenswrapper[4771]: W0219 21:29:02.953394 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod407f05a8_4462_4951_8d34_d0d3f0e5a604.slice/crio-bc870913f9379b94fe051d481871fbe79e5a524497d29fa16b75a70cf1fc35c2 WatchSource:0}: Error finding container bc870913f9379b94fe051d481871fbe79e5a524497d29fa16b75a70cf1fc35c2: Status 404 returned error can't find the container with id bc870913f9379b94fe051d481871fbe79e5a524497d29fa16b75a70cf1fc35c2 Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.959047 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.959224 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccjb2\" (UniqueName: \"kubernetes.io/projected/a4f14dad-9e69-4a48-bcc4-8fadf648a2be-kube-api-access-ccjb2\") pod \"machine-config-operator-74547568cd-h6h64\" (UID: \"a4f14dad-9e69-4a48-bcc4-8fadf648a2be\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.993486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjfsv\" (UniqueName: \"kubernetes.io/projected/06df3295-b598-4767-b2c1-6fe0a9dbaf37-kube-api-access-gjfsv\") pod \"collect-profiles-29525595-bvfmm\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.998568 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" Feb 19 21:29:02 crc kubenswrapper[4771]: I0219 21:29:02.999735 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.000506 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgjb\" (UniqueName: \"kubernetes.io/projected/a0153e8a-327b-4622-895d-02017e5b03d7-kube-api-access-gdgjb\") pod \"machine-config-server-tjnz7\" (UID: \"a0153e8a-327b-4622-895d-02017e5b03d7\") " pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.001574 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.501558034 +0000 UTC m=+43.773000504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.002603 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.010263 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tjnz7" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.016862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmdfd\" (UniqueName: \"kubernetes.io/projected/69705410-9808-46bd-8e5c-64a46eedf641-kube-api-access-fmdfd\") pod \"control-plane-machine-set-operator-78cbb6b69f-gmjpk\" (UID: \"69705410-9808-46bd-8e5c-64a46eedf641\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.017132 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.035195 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.041616 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vwwq\" (UniqueName: \"kubernetes.io/projected/c8f2d6cf-016a-4a9c-9209-cd45506de9ea-kube-api-access-4vwwq\") pod \"service-ca-operator-777779d784-cbz4n\" (UID: \"c8f2d6cf-016a-4a9c-9209-cd45506de9ea\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.046115 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.053817 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.078036 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sc2h\" (UniqueName: \"kubernetes.io/projected/410fd892-777e-4c9d-bed5-69f1a797501f-kube-api-access-9sc2h\") pod \"ingress-operator-5b745b69d9-65zw7\" (UID: \"410fd892-777e-4c9d-bed5-69f1a797501f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.078545 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.081740 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.102724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.102775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.103188 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.603174344 +0000 UTC m=+43.874616804 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.110709 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e29256-0243-48a0-b33d-138f7031c12a-metrics-certs\") pod \"network-metrics-daemon-k86sj\" (UID: \"c9e29256-0243-48a0-b33d-138f7031c12a\") " pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.126902 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h8jph" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.139629 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nl82r" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.168422 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k86sj" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.203861 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.204082 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.704055763 +0000 UTC m=+43.975498223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.204186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.204479 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.704466575 +0000 UTC m=+43.975909045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.225588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.235376 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.251847 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.252223 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lggcc"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.261743 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-h7w47"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.272212 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.297315 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.306992 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.307273 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.807249256 +0000 UTC m=+44.078691726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.324632 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.377708 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p"] Feb 19 21:29:03 crc kubenswrapper[4771]: W0219 21:29:03.388319 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93a7a0c_c589_42dc_bc58_9e0ccca50250.slice/crio-d8f1ec62ca0776a1c1487fc31ebf7b05b60de9657a9a839be792b2b9c3ad8d5e WatchSource:0}: Error finding container d8f1ec62ca0776a1c1487fc31ebf7b05b60de9657a9a839be792b2b9c3ad8d5e: Status 404 returned error can't find the container with id d8f1ec62ca0776a1c1487fc31ebf7b05b60de9657a9a839be792b2b9c3ad8d5e Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.408508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.408751 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.908740122 +0000 UTC m=+44.180182592 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: W0219 21:29:03.419791 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c750565_2876_44fb_a7b7_4afd72a47b96.slice/crio-4698b8ccb137ab163a0a4343818d6490d9d19ca0ebe4e77164b7365838faef25 WatchSource:0}: Error finding container 4698b8ccb137ab163a0a4343818d6490d9d19ca0ebe4e77164b7365838faef25: Status 404 returned error can't find the container with id 4698b8ccb137ab163a0a4343818d6490d9d19ca0ebe4e77164b7365838faef25 Feb 19 21:29:03 crc kubenswrapper[4771]: W0219 21:29:03.422125 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa7e479_a1ec_4aca_8172_2ce281048f4a.slice/crio-2de60c7c6215e9620898a0120891f1a80edeb97fc1b4476d8d9da54e96acb75e WatchSource:0}: Error finding container 2de60c7c6215e9620898a0120891f1a80edeb97fc1b4476d8d9da54e96acb75e: Status 404 returned error can't find the container with id 2de60c7c6215e9620898a0120891f1a80edeb97fc1b4476d8d9da54e96acb75e Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.500541 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6"] Feb 19 21:29:03 crc 
kubenswrapper[4771]: I0219 21:29:03.500588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.509781 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.510175 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.010160747 +0000 UTC m=+44.281603217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: W0219 21:29:03.582033 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e598f5_22b1_4345_ae26_ea5e48640c84.slice/crio-d8fba92b2ed693a8df251f0103a3e9af46c5d7e3971316c05072ae8e5bdefb3b WatchSource:0}: Error finding container d8fba92b2ed693a8df251f0103a3e9af46c5d7e3971316c05072ae8e5bdefb3b: Status 404 returned error can't find the container with id d8fba92b2ed693a8df251f0103a3e9af46c5d7e3971316c05072ae8e5bdefb3b Feb 19 
21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.613502 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.613915 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.113902575 +0000 UTC m=+44.385345035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.627381 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5699f"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.632311 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qpfhv"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.679851 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.714263 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.714566 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.214550048 +0000 UTC m=+44.485992518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.716848 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw"] Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.815240 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.815526 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 21:29:04.315515521 +0000 UTC m=+44.586957991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.911898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" event={"ID":"fde96967-28c6-414f-af23-6333d9636d0d","Type":"ContainerStarted","Data":"f9dd7bde5282716e52e67c3038c1c505675479899b7646ad271887715753bfa7"} Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.927607 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.932107 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.432087826 +0000 UTC m=+44.703530296 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.932245 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:03 crc kubenswrapper[4771]: E0219 21:29:03.932513 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.432506468 +0000 UTC m=+44.703948938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.951602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" event={"ID":"5af55e6d-0e47-4ee9-a654-2fe525618804","Type":"ContainerStarted","Data":"55ae3a1e29d681dd54da4b7f98f73089626da51a03a98d4321e5e20b2a2ebc69"} Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.959934 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r4rmw" event={"ID":"8f986201-78c2-4e70-bad2-86a6f1fd68b1","Type":"ContainerStarted","Data":"9122412942e2d5ed2368d63a8e10f1ffebdf73f828c7e9b386674231bf4858ee"} Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.966980 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h7w47" event={"ID":"67d4896e-f092-419b-971d-aac9dbaed6d0","Type":"ContainerStarted","Data":"7f5fa88f4cd5d53df5eaff523aee4dc468f77994a1837f4d0d593aa58771ef6b"} Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.970161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" event={"ID":"74547cb9-9778-4565-a709-7d29768bf23f","Type":"ContainerStarted","Data":"dd68f560c3d46182328e8676a85e23355de23a9cb55fb154c87484060f2987c6"} Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.972006 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb"] Feb 19 21:29:03 
crc kubenswrapper[4771]: I0219 21:29:03.983526 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" event={"ID":"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3","Type":"ContainerStarted","Data":"59ace5dc0cdfe6855892e519b393e2a2930c05d79dd08a959e00e54dd13ed758"} Feb 19 21:29:03 crc kubenswrapper[4771]: I0219 21:29:03.993562 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" event={"ID":"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70","Type":"ContainerStarted","Data":"f84817240f57b5d36087278c0507f9c6463ee8f18255d23627b610a2cfb1944a"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.005283 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lggcc" event={"ID":"5c750565-2876-44fb-a7b7-4afd72a47b96","Type":"ContainerStarted","Data":"4698b8ccb137ab163a0a4343818d6490d9d19ca0ebe4e77164b7365838faef25"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.012842 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" event={"ID":"c9e598f5-22b1-4345-ae26-ea5e48640c84","Type":"ContainerStarted","Data":"d8fba92b2ed693a8df251f0103a3e9af46c5d7e3971316c05072ae8e5bdefb3b"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.016760 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" event={"ID":"a808f7e3-3664-4131-9968-5f98c4f8480f","Type":"ContainerStarted","Data":"e0646bac776670350bdba510fb484f672a5feb38230cea8ab734b75bbadc00c5"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.024514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" 
event={"ID":"6fa7e479-a1ec-4aca-8172-2ce281048f4a","Type":"ContainerStarted","Data":"2de60c7c6215e9620898a0120891f1a80edeb97fc1b4476d8d9da54e96acb75e"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.031591 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" event={"ID":"2dd71911-b029-4e67-882c-eb6ad1983c00","Type":"ContainerStarted","Data":"00e6c9337a770c10f81a29b2f56c38016d7e891f50bb1d23ff2535e0391c5159"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.033868 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.034890 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" event={"ID":"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6","Type":"ContainerStarted","Data":"120ac13071b92f5aa392328d069db580392779279705134cb006b99c48de74ec"} Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.034941 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.53492335 +0000 UTC m=+44.806365820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.041393 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tjnz7" event={"ID":"a0153e8a-327b-4622-895d-02017e5b03d7","Type":"ContainerStarted","Data":"966b228f3546c4bcb7e937b035e3e2b64911fce45d4acf15aed4d3941d4f3cf2"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.054989 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" event={"ID":"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e","Type":"ContainerStarted","Data":"f03d51bf7672cec151cad7b5f85a4e98fa801e682d2b4a86da3d9778d7f4de56"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.068064 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" event={"ID":"f93a7a0c-c589-42dc-bc58-9e0ccca50250","Type":"ContainerStarted","Data":"d8f1ec62ca0776a1c1487fc31ebf7b05b60de9657a9a839be792b2b9c3ad8d5e"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.071393 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" event={"ID":"407f05a8-4462-4951-8d34-d0d3f0e5a604","Type":"ContainerStarted","Data":"bc870913f9379b94fe051d481871fbe79e5a524497d29fa16b75a70cf1fc35c2"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.075710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-5tqd2" event={"ID":"aff4fff0-509c-411a-b055-595fc81f61c3","Type":"ContainerStarted","Data":"e85f5141d98a18f2b9cc83dd0e56cd4e61b92a250558d91b8ca0a544a739c3ed"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.077461 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.092238 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" event={"ID":"c82393e3-5edd-4eac-b692-04f21c0e4c10","Type":"ContainerStarted","Data":"395c0133bc525d4710b54b13ddebb985a3fbc9ea5a87a5fc59f19f023d241ffd"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.092288 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" event={"ID":"c82393e3-5edd-4eac-b692-04f21c0e4c10","Type":"ContainerStarted","Data":"d8c1256667e419e5d626dce90bfec7baef15124ebfc183f12c61e9f53104f5c3"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.097474 4771 patch_prober.go:28] interesting pod/console-operator-58897d9998-5tqd2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.097526 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" podUID="aff4fff0-509c-411a-b055-595fc81f61c3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.118524 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" event={"ID":"e7550f22-e38d-40af-a730-6e5f8ef3fef3","Type":"ContainerStarted","Data":"8953588a282e4e22b8741a32b7dd43b17cd0e00dfa86cfd911f548c51f669fba"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.123771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" event={"ID":"96dfe4d4-62ec-4dca-824e-a7770af9db35","Type":"ContainerStarted","Data":"3aa2a2d6de9dafd0d43cbbbdfdcb5a81cf4fc9f479b5cda636683fe9a7c4eca4"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.128415 4771 generic.go:334] "Generic (PLEG): container finished" podID="afc30e40-775d-4371-b176-2320928155bb" containerID="5400b7a867f6d89490129ebb54a95c6d055e415a379881314550b224dc8fbe32" exitCode=0 Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.128473 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" event={"ID":"afc30e40-775d-4371-b176-2320928155bb","Type":"ContainerDied","Data":"5400b7a867f6d89490129ebb54a95c6d055e415a379881314550b224dc8fbe32"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.128492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" event={"ID":"afc30e40-775d-4371-b176-2320928155bb","Type":"ContainerStarted","Data":"99b6a0c4daa06128dede226e5cddc90dd4a8907eb0ba638998c9e11d5160a052"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.135751 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.136005 4771 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.635993505 +0000 UTC m=+44.907435985 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.151790 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" event={"ID":"cb5d417b-c982-4c90-a56c-73eb5672adce","Type":"ContainerStarted","Data":"d18649c73c90ed0bae0bdd6de85771f6e74fd1e421cdb6c499783b0dccc44f0d"} Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.245756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.247660 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.747632707 +0000 UTC m=+45.019075177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.272188 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" podStartSLOduration=22.272172843 podStartE2EDuration="22.272172843s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:04.232336562 +0000 UTC m=+44.503779032" watchObservedRunningTime="2026-02-19 21:29:04.272172843 +0000 UTC m=+44.543615313" Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.347546 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.347948 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.84793024 +0000 UTC m=+45.119372710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.430290 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8ls29"] Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.449203 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.449364 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.949338795 +0000 UTC m=+45.220781265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.450607 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.450952 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:04.950942579 +0000 UTC m=+45.222385049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.504550 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n"] Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.504602 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm"] Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.504615 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbd5d"] Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.551645 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.551985 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.051971442 +0000 UTC m=+45.323413912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: W0219 21:29:04.595720 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c7897d6_b26c_4f91_ab7f_64da32791052.slice/crio-8742a43f66ee0872b490a165c8349f346f3526eacaced774ecf922d0e3feba2e WatchSource:0}: Error finding container 8742a43f66ee0872b490a165c8349f346f3526eacaced774ecf922d0e3feba2e: Status 404 returned error can't find the container with id 8742a43f66ee0872b490a165c8349f346f3526eacaced774ecf922d0e3feba2e Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.657611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.657878 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.157866749 +0000 UTC m=+45.429309209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.672195 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gdxph"] Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.698414 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fp5bm"] Feb 19 21:29:04 crc kubenswrapper[4771]: W0219 21:29:04.705766 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06df3295_b598_4767_b2c1_6fe0a9dbaf37.slice/crio-e107dce0815c23036b7af4e780e62ed3c351d8c4ec09fd0cf2056b6ffb16454e WatchSource:0}: Error finding container e107dce0815c23036b7af4e780e62ed3c351d8c4ec09fd0cf2056b6ffb16454e: Status 404 returned error can't find the container with id e107dce0815c23036b7af4e780e62ed3c351d8c4ec09fd0cf2056b6ffb16454e Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.770987 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.771733 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-19 21:29:05.27171403 +0000 UTC m=+45.543156500 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.841568 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wr7t7" podStartSLOduration=22.841546077 podStartE2EDuration="22.841546077s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:04.831875785 +0000 UTC m=+45.103318255" watchObservedRunningTime="2026-02-19 21:29:04.841546077 +0000 UTC m=+45.112988537" Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.875529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.875803 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.375791997 +0000 UTC m=+45.647234467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.926532 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf"] Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.931538 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kkkts" podStartSLOduration=22.931519831 podStartE2EDuration="22.931519831s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:04.905263027 +0000 UTC m=+45.176705497" watchObservedRunningTime="2026-02-19 21:29:04.931519831 +0000 UTC m=+45.202962311" Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.934396 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7b8hx"] Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.943542 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" podStartSLOduration=22.943521607 podStartE2EDuration="22.943521607s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:04.943112185 +0000 UTC m=+45.214554665" watchObservedRunningTime="2026-02-19 21:29:04.943521607 +0000 UTC 
m=+45.214964097" Feb 19 21:29:04 crc kubenswrapper[4771]: I0219 21:29:04.976762 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:04 crc kubenswrapper[4771]: E0219 21:29:04.977217 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.477184331 +0000 UTC m=+45.748626811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.012863 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64"] Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.015946 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8jph"] Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.027816 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk"] Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.030950 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k86sj"] 
Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.078253 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.078778 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.57876738 +0000 UTC m=+45.850209850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.084785 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nl82r"] Feb 19 21:29:05 crc kubenswrapper[4771]: W0219 21:29:05.104187 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e29256_0243_48a0_b33d_138f7031c12a.slice/crio-69183f3f9292d0ad7614c3b495c1ba87617fb50356537d88eefb7134794cc371 WatchSource:0}: Error finding container 69183f3f9292d0ad7614c3b495c1ba87617fb50356537d88eefb7134794cc371: Status 404 returned error can't find the container with id 69183f3f9292d0ad7614c3b495c1ba87617fb50356537d88eefb7134794cc371 Feb 19 21:29:05 crc 
kubenswrapper[4771]: I0219 21:29:05.165252 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fp5bm" event={"ID":"a42ac7d5-b603-4acc-bdd3-89b710caf6f4","Type":"ContainerStarted","Data":"9f92b01b185d48f820fee2374d541ea7bc8e3202097f530c0c387e9c884bbce3"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.171415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k86sj" event={"ID":"c9e29256-0243-48a0-b33d-138f7031c12a","Type":"ContainerStarted","Data":"69183f3f9292d0ad7614c3b495c1ba87617fb50356537d88eefb7134794cc371"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.182567 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.182914 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.682899748 +0000 UTC m=+45.954342218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.184238 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" event={"ID":"d632d3b3-abbf-4aa5-9317-64588dcf1fdd","Type":"ContainerStarted","Data":"fdc2c4e35acb8a7b96850a1ec7a8b80a7a3bb2338c19a90589c8224d5929d99e"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.195592 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7"] Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.217377 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" event={"ID":"7c9f264f-27e9-419d-8163-3d16ce6c5eda","Type":"ContainerStarted","Data":"5f15150d1481472d7c818c37d2773d3c99435d08507ff9467bd6ccd8f0a85be2"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.217660 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" event={"ID":"7c9f264f-27e9-419d-8163-3d16ce6c5eda","Type":"ContainerStarted","Data":"babd7ee10ef7b0c73c6d67410460cd877415547532617a04652ec2a4654fbedc"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.227338 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" event={"ID":"e7550f22-e38d-40af-a730-6e5f8ef3fef3","Type":"ContainerStarted","Data":"f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a"} 
Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.227679 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.231942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" event={"ID":"df159dd8-1cb1-4aa6-90c5-75168e2029fb","Type":"ContainerStarted","Data":"192334385aa2c15b276001e484552912e921903447d1f4edd6b9990ad4c80c13"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.234947 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" event={"ID":"a4f14dad-9e69-4a48-bcc4-8fadf648a2be","Type":"ContainerStarted","Data":"889db011f6eb111732419e924c3d17d4ddb6898fb88613af1dce8ba2a4beb953"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.244996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" event={"ID":"c8f2d6cf-016a-4a9c-9209-cd45506de9ea","Type":"ContainerStarted","Data":"5d63101ff4360d1938409bb313d596585ee185548d4341ecfc2e537055fdba3a"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.254078 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" event={"ID":"d901f22a-3e48-4f3f-8516-b1176abe76e0","Type":"ContainerStarted","Data":"15e71931fa1c5a5e2576d2d637a48ee07941953bb556e6a37cced59836bd2e66"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.265117 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" podStartSLOduration=23.26509737 podStartE2EDuration="23.26509737s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-19 21:29:05.264113994 +0000 UTC m=+45.535556474" watchObservedRunningTime="2026-02-19 21:29:05.26509737 +0000 UTC m=+45.536539840" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.284259 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.284956 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.784940299 +0000 UTC m=+46.056382769 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.295398 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lggcc" event={"ID":"5c750565-2876-44fb-a7b7-4afd72a47b96","Type":"ContainerStarted","Data":"5e7ce8da36b5614943fd414cdfa596270698b688511e101c5fb74bfd7a6aadf0"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.296518 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lggcc" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.301384 
4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-lggcc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.301424 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lggcc" podUID="5c750565-2876-44fb-a7b7-4afd72a47b96" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.303107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" event={"ID":"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3","Type":"ContainerStarted","Data":"d1f0906957fb350b11ebf760ddd760da1c4faf24a9de3102b1e76923bd7fcb3a"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.305544 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" event={"ID":"69705410-9808-46bd-8e5c-64a46eedf641","Type":"ContainerStarted","Data":"9ea7bbe591c4b7fda14f069be29e16e603cad746de56a5b92a0100751df759a5"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.313795 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r4rmw" event={"ID":"8f986201-78c2-4e70-bad2-86a6f1fd68b1","Type":"ContainerStarted","Data":"d3529a2eda64a7998f3c180a459d814ea11431e34e82737caf5989d2d8bb4a5c"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.347336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw" 
event={"ID":"1c07c12f-1fa9-421e-b0da-889714e045d7","Type":"ContainerStarted","Data":"23c556229eca38c7c0b8adcd181c81b9def06da33f8bbcfb969e9ce9f6c8562b"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.347603 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw" event={"ID":"1c07c12f-1fa9-421e-b0da-889714e045d7","Type":"ContainerStarted","Data":"0374bf6077007a6f9ebd9258f2bc2f370eba294b40e83c7e6ffd86978112f6b8"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.348819 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" event={"ID":"407f05a8-4462-4951-8d34-d0d3f0e5a604","Type":"ContainerStarted","Data":"683d886f2741d74e06d2bf9745e5bab767ffca6062b799e952730d537b4458d7"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.365685 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" event={"ID":"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7","Type":"ContainerStarted","Data":"e6c425a1ce7c5f87c97fcaeb3f1cefb50092a8ea3e540c2ec1c0c00e81d480eb"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.365733 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" event={"ID":"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7","Type":"ContainerStarted","Data":"0fe0c39b55a1cef5d49e4af599dce7850102984d8d41fa77746305668194946b"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.381182 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" event={"ID":"74547cb9-9778-4565-a709-7d29768bf23f","Type":"ContainerStarted","Data":"b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.381393 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.385620 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.386522 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.886506998 +0000 UTC m=+46.157949468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.392224 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" event={"ID":"2dd71911-b029-4e67-882c-eb6ad1983c00","Type":"ContainerStarted","Data":"e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.392614 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.400252 4771 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-c8n2h container/oauth-openshift 
namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.400416 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" podUID="2dd71911-b029-4e67-882c-eb6ad1983c00" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.408705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" event={"ID":"0c7897d6-b26c-4f91-ab7f-64da32791052","Type":"ContainerStarted","Data":"8742a43f66ee0872b490a165c8349f346f3526eacaced774ecf922d0e3feba2e"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.409220 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.433933 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lggcc" podStartSLOduration=23.433919826 podStartE2EDuration="23.433919826s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.431284104 +0000 UTC m=+45.702726564" watchObservedRunningTime="2026-02-19 21:29:05.433919826 +0000 UTC m=+45.705362296" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.446962 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vbd5d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: 
connection refused" start-of-body= Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.447005 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" podUID="0c7897d6-b26c-4f91-ab7f-64da32791052" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.482986 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" podStartSLOduration=23.482968508 podStartE2EDuration="23.482968508s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.481000754 +0000 UTC m=+45.752443244" watchObservedRunningTime="2026-02-19 21:29:05.482968508 +0000 UTC m=+45.754410978" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.487736 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.489717 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:05.989703231 +0000 UTC m=+46.261145691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.539100 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" event={"ID":"44818bf7-6be3-4f7f-97c2-0920e255bbba","Type":"ContainerStarted","Data":"6b55f5fcfef61ccaeeae05fd1a79096468f6628f18364af527077d9e9df400e1"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.539139 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.558126 4771 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qddxl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.558176 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" podUID="44818bf7-6be3-4f7f-97c2-0920e255bbba" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.558327 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" 
event={"ID":"fde96967-28c6-414f-af23-6333d9636d0d","Type":"ContainerStarted","Data":"d08eaf6c8421e0cbe20ba4a1a298dbb2cc73c8971a66279c1179e6e8494e57db"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.587291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" event={"ID":"96dfe4d4-62ec-4dca-824e-a7770af9db35","Type":"ContainerStarted","Data":"d026911f5fcbfecb75750adc155ace57a3b6b4e99670b8cec3a032a03a513445"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.588582 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.589760 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.089733287 +0000 UTC m=+46.361175757 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.590446 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.594086 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" podStartSLOduration=6.594070305 podStartE2EDuration="6.594070305s" podCreationTimestamp="2026-02-19 21:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.528413872 +0000 UTC m=+45.799856342" watchObservedRunningTime="2026-02-19 21:29:05.594070305 +0000 UTC m=+45.865512775" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.594569 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r4rmw" podStartSLOduration=23.594564789 podStartE2EDuration="23.594564789s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.593820038 +0000 UTC m=+45.865262518" watchObservedRunningTime="2026-02-19 21:29:05.594564789 +0000 UTC m=+45.866007259" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.626582 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-5tqd2" event={"ID":"aff4fff0-509c-411a-b055-595fc81f61c3","Type":"ContainerStarted","Data":"7ca405484e3e4272184c81d08e5b007c87af4177e39730ef9addffbd30b4f744"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.629423 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-km8wt" podStartSLOduration=23.629410154 podStartE2EDuration="23.629410154s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.627262457 +0000 UTC m=+45.898704927" watchObservedRunningTime="2026-02-19 21:29:05.629410154 +0000 UTC m=+45.900852624" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.640710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h7w47" event={"ID":"67d4896e-f092-419b-971d-aac9dbaed6d0","Type":"ContainerStarted","Data":"41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.649691 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" podStartSLOduration=23.649680296 podStartE2EDuration="23.649680296s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.648330089 +0000 UTC m=+45.919772559" watchObservedRunningTime="2026-02-19 21:29:05.649680296 +0000 UTC m=+45.921122766" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.673075 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-h7w47" podStartSLOduration=23.673060811 podStartE2EDuration="23.673060811s" 
podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.672527856 +0000 UTC m=+45.943970346" watchObservedRunningTime="2026-02-19 21:29:05.673060811 +0000 UTC m=+45.944503281" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.693958 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.696413 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.196400234 +0000 UTC m=+46.467842704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.699209 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" event={"ID":"cb5d417b-c982-4c90-a56c-73eb5672adce","Type":"ContainerStarted","Data":"37adb3cb79ea5c7f9c3e785ceb723bca7f3bc326180ce7a76cd80cdd8cc056fd"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.699249 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" event={"ID":"cb5d417b-c982-4c90-a56c-73eb5672adce","Type":"ContainerStarted","Data":"cfef9571b4fa664e7ee9c8e13f3a691ccd68966539b5375854b4f2052243a1ee"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.733775 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" event={"ID":"a808f7e3-3664-4131-9968-5f98c4f8480f","Type":"ContainerStarted","Data":"84180b5eb7e279204585487b12264b369dfed77f8fdf2ad9156b1bde0821617e"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.733951 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8s6z" podStartSLOduration=25.733935654 podStartE2EDuration="25.733935654s" podCreationTimestamp="2026-02-19 21:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.73343033 
+0000 UTC m=+46.004872800" watchObservedRunningTime="2026-02-19 21:29:05.733935654 +0000 UTC m=+46.005378124" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.735301 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-h7gx8" podStartSLOduration=23.735294131 podStartE2EDuration="23.735294131s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.714070694 +0000 UTC m=+45.985513174" watchObservedRunningTime="2026-02-19 21:29:05.735294131 +0000 UTC m=+46.006736601" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.750216 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" event={"ID":"e17ac1b5-90d3-4f7e-a806-0285862f3c05","Type":"ContainerStarted","Data":"e6ad5ef802bfded730505cfb1078b67d8d7ffe856f29823187dc4a3bc5a9d436"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.751602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" event={"ID":"06df3295-b598-4767-b2c1-6fe0a9dbaf37","Type":"ContainerStarted","Data":"e107dce0815c23036b7af4e780e62ed3c351d8c4ec09fd0cf2056b6ffb16454e"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.760128 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" podStartSLOduration=23.760113855 podStartE2EDuration="23.760113855s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.759474267 +0000 UTC m=+46.030916747" watchObservedRunningTime="2026-02-19 21:29:05.760113855 +0000 UTC m=+46.031556315" Feb 19 21:29:05 crc 
kubenswrapper[4771]: I0219 21:29:05.784740 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" event={"ID":"6fa7e479-a1ec-4aca-8172-2ce281048f4a","Type":"ContainerStarted","Data":"81213429b7ead68497e0d7a3fa83fc957435ac5a3bd148e8e69f832e9620b2a7"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.793276 4771 generic.go:334] "Generic (PLEG): container finished" podID="f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6" containerID="7b495af2815118a0683f921477e5d1254f25279d093be5721a4a9da5725baa51" exitCode=0 Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.793338 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" event={"ID":"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6","Type":"ContainerDied","Data":"7b495af2815118a0683f921477e5d1254f25279d093be5721a4a9da5725baa51"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.794731 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.796097 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.296082442 +0000 UTC m=+46.567524912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.836805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" event={"ID":"afc30e40-775d-4371-b176-2320928155bb","Type":"ContainerStarted","Data":"b63c7a4b532cdb0467761cb49c47a51994c9eb38ad2105cac8aeecf71b8d4b77"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.866340 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" podStartSLOduration=23.86632724 podStartE2EDuration="23.86632724s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.865515267 +0000 UTC m=+46.136957737" watchObservedRunningTime="2026-02-19 21:29:05.86632724 +0000 UTC m=+46.137769700" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.878309 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" event={"ID":"4037bcfd-4c3e-4c7a-95c7-208c2c14ef70","Type":"ContainerStarted","Data":"cfa7e23583314022672abefb7b2e6b6733d48cf0ee1d30c270dd110bc4278c05"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.879355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.882179 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" event={"ID":"3dd706f5-ac29-450c-9f17-cec521732d11","Type":"ContainerStarted","Data":"0e725805160b38fe2830b26d437ba16309c4919479b0d946ef3777d05b88bf77"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.882220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" event={"ID":"3dd706f5-ac29-450c-9f17-cec521732d11","Type":"ContainerStarted","Data":"894e74f8e0b2618b25f3bdf799906b10d10dc2e619d46cd7ebf2370ac2580df6"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.884338 4771 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-cs5n6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.884369 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" podUID="4037bcfd-4c3e-4c7a-95c7-208c2c14ef70" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.885490 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qvwzh" podStartSLOduration=23.885474739 podStartE2EDuration="23.885474739s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.885056718 +0000 UTC m=+46.156499188" 
watchObservedRunningTime="2026-02-19 21:29:05.885474739 +0000 UTC m=+46.156917209" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.896624 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:05 crc kubenswrapper[4771]: E0219 21:29:05.897628 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.397542297 +0000 UTC m=+46.668984767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.909262 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tjnz7" event={"ID":"a0153e8a-327b-4622-895d-02017e5b03d7","Type":"ContainerStarted","Data":"ebcdfca5ea7166120b898a5ade01a0d511f5085974f33c7a77c2a7990c1c8f2c"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.933563 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-th9zl" podStartSLOduration=23.933539895 podStartE2EDuration="23.933539895s" 
podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.911238369 +0000 UTC m=+46.182680849" watchObservedRunningTime="2026-02-19 21:29:05.933539895 +0000 UTC m=+46.204982365" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.960191 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.968658 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:29:05 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 19 21:29:05 crc kubenswrapper[4771]: [+]process-running ok Feb 19 21:29:05 crc kubenswrapper[4771]: healthz check failed Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.968700 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.974776 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" event={"ID":"f93a7a0c-c589-42dc-bc58-9e0ccca50250","Type":"ContainerStarted","Data":"a62dcdbef551e37f3f45812820db9188f8a648b05918d0de64b3bca1c69cf93c"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.995344 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" podStartSLOduration=23.995328113 podStartE2EDuration="23.995328113s" 
podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:05.958802601 +0000 UTC m=+46.230245071" watchObservedRunningTime="2026-02-19 21:29:05.995328113 +0000 UTC m=+46.266770583" Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.997713 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" event={"ID":"e962ec8f-cfdf-4c17-8ae0-061d016a43d7","Type":"ContainerStarted","Data":"afdc03bb62936d245bcee90c1deb638fb062df47a5a247b6da5319b1ffa7f43e"} Feb 19 21:29:05 crc kubenswrapper[4771]: I0219 21:29:05.999141 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.006115 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.506071284 +0000 UTC m=+46.777513764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.042835 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-shgdd" podStartSLOduration=24.042816103 podStartE2EDuration="24.042816103s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:06.032061601 +0000 UTC m=+46.303504071" watchObservedRunningTime="2026-02-19 21:29:06.042816103 +0000 UTC m=+46.314258573" Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.094159 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-tjnz7" podStartSLOduration=7.094140006 podStartE2EDuration="7.094140006s" podCreationTimestamp="2026-02-19 21:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:06.060316498 +0000 UTC m=+46.331758968" watchObservedRunningTime="2026-02-19 21:29:06.094140006 +0000 UTC m=+46.365582476" Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.113245 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.117449 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.617433259 +0000 UTC m=+46.888875769 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.137569 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" podStartSLOduration=24.137550615 podStartE2EDuration="24.137550615s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:06.095838633 +0000 UTC m=+46.367281103" watchObservedRunningTime="2026-02-19 21:29:06.137550615 +0000 UTC m=+46.408993085" Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.160009 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.209356 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wv4nf" podStartSLOduration=24.209337496 podStartE2EDuration="24.209337496s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:06.14614665 +0000 UTC m=+46.417589140" watchObservedRunningTime="2026-02-19 21:29:06.209337496 +0000 UTC m=+46.480779966" Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.229258 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.229556 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.729539404 +0000 UTC m=+47.000981874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.336706 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.337080 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.837064425 +0000 UTC m=+47.108506895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.441900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.442153 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.942128628 +0000 UTC m=+47.213571108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.450257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.450684 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:06.95066907 +0000 UTC m=+47.222111540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.551230 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.552207 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.052186188 +0000 UTC m=+47.323628648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.653467 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.653775 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.153763496 +0000 UTC m=+47.425205966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.755976 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.756150 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.256115315 +0000 UTC m=+47.527557785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.756297 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.756550 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.256538987 +0000 UTC m=+47.527981457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.857044 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.857656 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.357640923 +0000 UTC m=+47.629083393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.958467 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:06 crc kubenswrapper[4771]: E0219 21:29:06.958754 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.458741929 +0000 UTC m=+47.730184399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.962559 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:29:06 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 19 21:29:06 crc kubenswrapper[4771]: [+]process-running ok Feb 19 21:29:06 crc kubenswrapper[4771]: healthz check failed Feb 19 21:29:06 crc kubenswrapper[4771]: I0219 21:29:06.962596 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.005269 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fp5bm" event={"ID":"a42ac7d5-b603-4acc-bdd3-89b710caf6f4","Type":"ContainerStarted","Data":"922ed6a3e0b8af2f3d95f23467eef4e3ff4c201252a0b97714c7c605ffc9a2bd"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.005585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fp5bm" event={"ID":"a42ac7d5-b603-4acc-bdd3-89b710caf6f4","Type":"ContainerStarted","Data":"43f108751d3ab415394057b1b86633bd7536715a3edbec1007d02d3b3eff8f53"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.005601 4771 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.008047 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" event={"ID":"afc30e40-775d-4371-b176-2320928155bb","Type":"ContainerStarted","Data":"badfc77928df19f76e67fd96f36ac7d7901a8482b4a59d655484a4d19a80d824"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.009809 4771 generic.go:334] "Generic (PLEG): container finished" podID="6fa7e479-a1ec-4aca-8172-2ce281048f4a" containerID="81213429b7ead68497e0d7a3fa83fc957435ac5a3bd148e8e69f832e9620b2a7" exitCode=0 Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.009973 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" event={"ID":"6fa7e479-a1ec-4aca-8172-2ce281048f4a","Type":"ContainerDied","Data":"81213429b7ead68497e0d7a3fa83fc957435ac5a3bd148e8e69f832e9620b2a7"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.009998 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" event={"ID":"6fa7e479-a1ec-4aca-8172-2ce281048f4a","Type":"ContainerStarted","Data":"aa9b29cb188ff025951805144381f8c758ff7894ab630d67000721281f3e6ce7"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.011524 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" event={"ID":"7c9f264f-27e9-419d-8163-3d16ce6c5eda","Type":"ContainerStarted","Data":"cb372d06d8ee1c87802721d502ddf218e6fc1ba8b8fbc50513148a57a6752a8f"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.013568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" event={"ID":"f355f2ce-0f2d-4e1d-ac8f-6dda6548dda6","Type":"ContainerStarted","Data":"0fefa50f78c022597c0312d3ae3c026d3d5008f6c067dbfd89fafd3df0273394"} Feb 19 
21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.013896 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.015671 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" event={"ID":"308e6697-8ef5-46ec-b57e-6c86e2cfe4c3","Type":"ContainerStarted","Data":"bfa8fb716f8476d87c86c8d5b8de48bbe6c7e9833d92f7d71692c840b18535dd"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.016115 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.017444 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw" event={"ID":"1c07c12f-1fa9-421e-b0da-889714e045d7","Type":"ContainerStarted","Data":"82752450d98bb0696db04ad0b61f28426b1709a6ea4ab42e045bd28f0214ebe0"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.018885 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nl82r" event={"ID":"b799beef-7d4d-4ea2-8c0c-4b98c52de22b","Type":"ContainerStarted","Data":"c6ac406e459a7dbe9c5049b606fb80d684da049a5a750b9601b060604ea94e94"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.018904 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nl82r" event={"ID":"b799beef-7d4d-4ea2-8c0c-4b98c52de22b","Type":"ContainerStarted","Data":"fbf8740d76e641e8eb457ba883b5cdf90bf411c36b5dadd27692460b11552e1d"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.020462 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" 
event={"ID":"df159dd8-1cb1-4aa6-90c5-75168e2029fb","Type":"ContainerStarted","Data":"2058709892d70c6a4a6c4153f7231038d0c9cfd71cb8c6529722706548f24f3c"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.021935 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" event={"ID":"dfccc5a0-1d91-433b-8c5d-41734a1f8ab7","Type":"ContainerStarted","Data":"435841f61253e724c3061ea5310d59a202886e4b395de16386073f89d7cef07e"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.023412 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" event={"ID":"ae356e32-2ecf-4479-bb89-b4f3ce3a2c9e","Type":"ContainerStarted","Data":"3b0faa4e9037b53593e1814806090668ec8598d2741a3492f2968e6f3db90cb5"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.027432 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" event={"ID":"d632d3b3-abbf-4aa5-9317-64588dcf1fdd","Type":"ContainerStarted","Data":"0959d7f7b2a3bcabe6e7f1bdc204aaccc3dda71b6cff8a494b5b3ae7a0978d26"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.027462 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" event={"ID":"d632d3b3-abbf-4aa5-9317-64588dcf1fdd","Type":"ContainerStarted","Data":"097604a47b0e2b608e5a992da241a10f141a3c106268cb9f4b51d33b0c2f5eec"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.028922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" event={"ID":"69705410-9808-46bd-8e5c-64a46eedf641","Type":"ContainerStarted","Data":"c97621e578751f29de6855bacdba89f2373313df14f2a45a62adca50462d31f9"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.030546 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" event={"ID":"06df3295-b598-4767-b2c1-6fe0a9dbaf37","Type":"ContainerStarted","Data":"2a98662daf41e5fbf0654b65df289550b89a916a8e22da75da3fee17f2c09438"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.031626 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8jph" event={"ID":"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b","Type":"ContainerStarted","Data":"194fd4226e12c65f0ec1c49681d5051c610239707e6d8240ba196ad02b89358d"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.032709 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" event={"ID":"0c7897d6-b26c-4f91-ab7f-64da32791052","Type":"ContainerStarted","Data":"648ecd3015733aecc2c84ee53556799b6b4f07a4cba6cf9534cd2b8369a3be7a"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.033390 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vbd5d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.033420 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" podUID="0c7897d6-b26c-4f91-ab7f-64da32791052" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.035363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" 
event={"ID":"c9e598f5-22b1-4345-ae26-ea5e48640c84","Type":"ContainerStarted","Data":"acb723e81d157189c9fd3c7ce4b2516f6782ee6fd67b26266e59f401f9427f6d"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.036854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" event={"ID":"e17ac1b5-90d3-4f7e-a806-0285862f3c05","Type":"ContainerStarted","Data":"baca917a7f73f33e96a153d47ed65da73de4423bc3dc5620ceeda79f7f1bfd30"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.038717 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.040142 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" event={"ID":"d901f22a-3e48-4f3f-8516-b1176abe76e0","Type":"ContainerStarted","Data":"8535884e8e6ac54acb27a01f583aaaf15f89ca446618c80cfdf2fdfb6fb14e0b"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.041151 4771 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vclgf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.041198 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" podUID="e17ac1b5-90d3-4f7e-a806-0285862f3c05" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.042596 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-249lb" 
event={"ID":"e962ec8f-cfdf-4c17-8ae0-061d016a43d7","Type":"ContainerStarted","Data":"dfab9deb91ecda814e71b74f80bc5ae792e4cbe52cedfbb5c4cbb4a8f6708e39"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.044163 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" event={"ID":"a4f14dad-9e69-4a48-bcc4-8fadf648a2be","Type":"ContainerStarted","Data":"17c4abf23eb73de3a127cb062c658f7a424741c65a31fc25232225c805b2ef67"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.044195 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" event={"ID":"a4f14dad-9e69-4a48-bcc4-8fadf648a2be","Type":"ContainerStarted","Data":"7e88cfad564b2634c40f365f97df3e5738987460913964bb05cda6fd490651d3"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.045820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k86sj" event={"ID":"c9e29256-0243-48a0-b33d-138f7031c12a","Type":"ContainerStarted","Data":"c25ae4ce54b5aaf387a1ce04b8ad755cf27b8bc8c1ff2b8fce46477674ae0651"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.045869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k86sj" event={"ID":"c9e29256-0243-48a0-b33d-138f7031c12a","Type":"ContainerStarted","Data":"3e3b9d12390fd254fea2300ebd354774f98b5ed6f89e7e937c7605f39f7f9724"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.047184 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" event={"ID":"c8f2d6cf-016a-4a9c-9209-cd45506de9ea","Type":"ContainerStarted","Data":"4e0008ec077e78110833fa9780d61450c269cc16dd51f9f856e366f32d7c05c2"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.048746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" event={"ID":"44818bf7-6be3-4f7f-97c2-0920e255bbba","Type":"ContainerStarted","Data":"d3a7a0da16b4c37883489478e8d31cede8ea66a6ec4d776e4d5e90949a137480"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.051805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" event={"ID":"410fd892-777e-4c9d-bed5-69f1a797501f","Type":"ContainerStarted","Data":"fa9d7668531795310c9f09f3354aa4b5fca51c5d62a937baa1527970166195b9"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.051842 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" event={"ID":"410fd892-777e-4c9d-bed5-69f1a797501f","Type":"ContainerStarted","Data":"7b5d6a8ca18712c4056d7772bf5681d341c9569bde7dfb7233143d7d83ce0040"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.051851 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" event={"ID":"410fd892-777e-4c9d-bed5-69f1a797501f","Type":"ContainerStarted","Data":"00067af51f2686ea6f6168388b57153aecdb942e1adb970400d89bbb18c018da"} Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.051992 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-lggcc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.052090 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lggcc" podUID="5c750565-2876-44fb-a7b7-4afd72a47b96" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.058605 
4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.059338 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.059727 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.559715041 +0000 UTC m=+47.831157511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.060746 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fp5bm" podStartSLOduration=8.060726209 podStartE2EDuration="8.060726209s" podCreationTimestamp="2026-02-19 21:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.058615401 +0000 UTC m=+47.330057871" watchObservedRunningTime="2026-02-19 21:29:07.060726209 +0000 UTC m=+47.332168679" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.068387 
4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs5n6" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.099782 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.129889 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gmjpk" podStartSLOduration=25.129873197 podStartE2EDuration="25.129873197s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.128736196 +0000 UTC m=+47.400178666" watchObservedRunningTime="2026-02-19 21:29:07.129873197 +0000 UTC m=+47.401315667" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.161607 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.182512 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.682498776 +0000 UTC m=+47.953941246 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.192615 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-65zw7" podStartSLOduration=25.19260056 podStartE2EDuration="25.19260056s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.148261996 +0000 UTC m=+47.419704466" watchObservedRunningTime="2026-02-19 21:29:07.19260056 +0000 UTC m=+47.464043030" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.236658 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" podStartSLOduration=25.236641956 podStartE2EDuration="25.236641956s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.194919894 +0000 UTC m=+47.466362374" watchObservedRunningTime="2026-02-19 21:29:07.236641956 +0000 UTC m=+47.508084416" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.264567 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.265159 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.765144681 +0000 UTC m=+48.036587151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.278268 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gdxph" podStartSLOduration=25.278252597 podStartE2EDuration="25.278252597s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.277357002 +0000 UTC m=+47.548799472" watchObservedRunningTime="2026-02-19 21:29:07.278252597 +0000 UTC m=+47.549695067" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.278880 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cbz4n" podStartSLOduration=25.278876574 podStartE2EDuration="25.278876574s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.238131137 +0000 UTC m=+47.509573607" 
watchObservedRunningTime="2026-02-19 21:29:07.278876574 +0000 UTC m=+47.550319044" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.325311 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vrkjw" podStartSLOduration=25.325297115 podStartE2EDuration="25.325297115s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.324975665 +0000 UTC m=+47.596418135" watchObservedRunningTime="2026-02-19 21:29:07.325297115 +0000 UTC m=+47.596739585" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.326936 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k86sj" podStartSLOduration=25.326931409 podStartE2EDuration="25.326931409s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.295600767 +0000 UTC m=+47.567043237" watchObservedRunningTime="2026-02-19 21:29:07.326931409 +0000 UTC m=+47.598373879" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.345894 4771 csr.go:261] certificate signing request csr-57gwx is approved, waiting to be issued Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.348607 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" podStartSLOduration=25.348594077 podStartE2EDuration="25.348594077s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.346336966 +0000 UTC m=+47.617779446" watchObservedRunningTime="2026-02-19 21:29:07.348594077 +0000 UTC 
m=+47.620036547" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.362046 4771 csr.go:257] certificate signing request csr-57gwx is issued Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.366766 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.367209 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.867195213 +0000 UTC m=+48.138637683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.367605 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qpfhv" podStartSLOduration=25.367587173 podStartE2EDuration="25.367587173s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.366238146 +0000 UTC m=+47.637680616" watchObservedRunningTime="2026-02-19 21:29:07.367587173 +0000 UTC 
m=+47.639029643" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.413116 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" podStartSLOduration=25.413102359 podStartE2EDuration="25.413102359s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.41240985 +0000 UTC m=+47.683852330" watchObservedRunningTime="2026-02-19 21:29:07.413102359 +0000 UTC m=+47.684544819" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.425312 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.425845 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.435538 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nl82r" podStartSLOduration=8.435524928 podStartE2EDuration="8.435524928s" podCreationTimestamp="2026-02-19 21:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.433678058 +0000 UTC m=+47.705120538" watchObservedRunningTime="2026-02-19 21:29:07.435524928 +0000 UTC m=+47.706967398" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.458388 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5699f" podStartSLOduration=25.458371059 podStartE2EDuration="25.458371059s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-19 21:29:07.456027525 +0000 UTC m=+47.727469995" watchObservedRunningTime="2026-02-19 21:29:07.458371059 +0000 UTC m=+47.729813529" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.469729 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.470150 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:07.970125168 +0000 UTC m=+48.241567638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.489287 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s24kp" podStartSLOduration=25.489269888 podStartE2EDuration="25.489269888s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.488048285 +0000 UTC m=+47.759490755" watchObservedRunningTime="2026-02-19 21:29:07.489269888 +0000 UTC 
m=+47.760712358" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.542576 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8ls29" podStartSLOduration=25.542557895 podStartE2EDuration="25.542557895s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.516814985 +0000 UTC m=+47.788257475" watchObservedRunningTime="2026-02-19 21:29:07.542557895 +0000 UTC m=+47.814000365" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.561338 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.561702 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.568149 4771 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-286qc container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.568214 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" podUID="6fa7e479-a1ec-4aca-8172-2ce281048f4a" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.572655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.581050 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.08103532 +0000 UTC m=+48.352477790 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.582732 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5b7s6" podStartSLOduration=25.582716126 podStartE2EDuration="25.582716126s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.544747674 +0000 UTC m=+47.816190164" watchObservedRunningTime="2026-02-19 21:29:07.582716126 +0000 UTC m=+47.854158586" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.593936 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6h64" podStartSLOduration=25.59392009 podStartE2EDuration="25.59392009s" podCreationTimestamp="2026-02-19 
21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.580348892 +0000 UTC m=+47.851791372" watchObservedRunningTime="2026-02-19 21:29:07.59392009 +0000 UTC m=+47.865362560" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.608793 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" podStartSLOduration=25.608776323 podStartE2EDuration="25.608776323s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.608231189 +0000 UTC m=+47.879673669" watchObservedRunningTime="2026-02-19 21:29:07.608776323 +0000 UTC m=+47.880218793" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.647792 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" podStartSLOduration=25.647776643 podStartE2EDuration="25.647776643s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:07.64694843 +0000 UTC m=+47.918390910" watchObservedRunningTime="2026-02-19 21:29:07.647776643 +0000 UTC m=+47.919219113" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.676844 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.677208 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.177194882 +0000 UTC m=+48.448637342 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.778488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.779052 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.279041328 +0000 UTC m=+48.550483798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.880346 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.880539 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.380496563 +0000 UTC m=+48.651939033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.880636 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.881038 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.381008428 +0000 UTC m=+48.652450898 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.964880 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:29:07 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 19 21:29:07 crc kubenswrapper[4771]: [+]process-running ok Feb 19 21:29:07 crc kubenswrapper[4771]: healthz check failed Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.964959 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.982046 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.982213 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:29:08.482191125 +0000 UTC m=+48.753633595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:07 crc kubenswrapper[4771]: I0219 21:29:07.982342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:07 crc kubenswrapper[4771]: E0219 21:29:07.982640 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.482633507 +0000 UTC m=+48.754075977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.049807 4771 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qddxl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.049883 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" podUID="44818bf7-6be3-4f7f-97c2-0920e255bbba" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.057411 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8jph" event={"ID":"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b","Type":"ContainerStarted","Data":"d499b9301dafc959c924afdf87d8be9cd0bee5fc09ac37dde6fb81627d3493e0"} Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.058644 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" event={"ID":"df159dd8-1cb1-4aa6-90c5-75168e2029fb","Type":"ContainerStarted","Data":"748e5aee0b907824b68ddc8e57c437488d0cd79c04638faab0f7a3ebf545d3cc"} Feb 19 21:29:08 crc 
kubenswrapper[4771]: I0219 21:29:08.059643 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-lggcc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.059689 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lggcc" podUID="5c750565-2876-44fb-a7b7-4afd72a47b96" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.059742 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vbd5d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.059775 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" podUID="0c7897d6-b26c-4f91-ab7f-64da32791052" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.066655 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vclgf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.083407 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.083497 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.583478036 +0000 UTC m=+48.854920506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.083641 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.083944 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.583936899 +0000 UTC m=+48.855379369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.090064 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7b8hx" podStartSLOduration=26.090045685 podStartE2EDuration="26.090045685s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:08.08986811 +0000 UTC m=+48.361310590" watchObservedRunningTime="2026-02-19 21:29:08.090045685 +0000 UTC m=+48.361488155" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.142992 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ctvtj"] Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.143980 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.145986 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.147505 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ctvtj"] Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.189481 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.190029 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jb2kp"] Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.190800 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.690784021 +0000 UTC m=+48.962226491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.234216 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.303888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-catalog-content\") pod \"certified-operators-ctvtj\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.303972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.304042 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-utilities\") pod \"certified-operators-ctvtj\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.304065 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvvvv\" (UniqueName: \"kubernetes.io/projected/7b38fbe5-f497-4deb-aaff-6d95b89a1783-kube-api-access-zvvvv\") pod \"certified-operators-ctvtj\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.304378 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.804366095 +0000 UTC m=+49.075808565 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.352147 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k6bkf"] Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.353308 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.356591 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.365219 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 21:24:07 +0000 UTC, rotation deadline is 2026-12-24 07:07:40.376038844 +0000 UTC Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.365276 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7377h38m32.010765165s for next certificate rotation Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.400780 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6bkf"] Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.422489 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.422783 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-utilities\") pod \"certified-operators-ctvtj\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.422811 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvvvv\" (UniqueName: \"kubernetes.io/projected/7b38fbe5-f497-4deb-aaff-6d95b89a1783-kube-api-access-zvvvv\") pod \"certified-operators-ctvtj\" (UID: 
\"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.422857 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-catalog-content\") pod \"certified-operators-ctvtj\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.423246 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-catalog-content\") pod \"certified-operators-ctvtj\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.423311 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:08.923296496 +0000 UTC m=+49.194738966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.423617 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-utilities\") pod \"certified-operators-ctvtj\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.471182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvvvv\" (UniqueName: \"kubernetes.io/projected/7b38fbe5-f497-4deb-aaff-6d95b89a1783-kube-api-access-zvvvv\") pod \"certified-operators-ctvtj\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.527327 4771 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jwglk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 19 21:29:08 crc kubenswrapper[4771]: [+]log ok Feb 19 21:29:08 crc kubenswrapper[4771]: [+]etcd ok Feb 19 21:29:08 crc kubenswrapper[4771]: [-]poststarthook/start-apiserver-admission-initializer failed: reason withheld Feb 19 21:29:08 crc kubenswrapper[4771]: [+]poststarthook/generic-apiserver-start-informers ok Feb 19 21:29:08 crc kubenswrapper[4771]: [-]poststarthook/max-in-flight-filter failed: reason withheld Feb 19 21:29:08 crc kubenswrapper[4771]: 
[+]poststarthook/storage-object-count-tracker-hook ok Feb 19 21:29:08 crc kubenswrapper[4771]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 19 21:29:08 crc kubenswrapper[4771]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 19 21:29:08 crc kubenswrapper[4771]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 19 21:29:08 crc kubenswrapper[4771]: [+]poststarthook/project.openshift.io-projectcache ok Feb 19 21:29:08 crc kubenswrapper[4771]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 19 21:29:08 crc kubenswrapper[4771]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Feb 19 21:29:08 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 19 21:29:08 crc kubenswrapper[4771]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 19 21:29:08 crc kubenswrapper[4771]: livez check failed Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.527381 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jwglk" podUID="afc30e40-775d-4371-b176-2320928155bb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.529617 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-utilities\") pod \"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.529689 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-catalog-content\") pod 
\"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.529712 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.529729 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxcm9\" (UniqueName: \"kubernetes.io/projected/a21efe71-f4d3-4155-af2a-06ebc4646f14-kube-api-access-sxcm9\") pod \"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.530085 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.030073135 +0000 UTC m=+49.301515605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.542903 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9pf5"] Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.543759 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.576347 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9pf5"] Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.630585 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.630768 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.13074311 +0000 UTC m=+49.402185580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8sls\" (UniqueName: \"kubernetes.io/projected/563611ad-e721-4b19-85a3-f5a37eb965d3-kube-api-access-p8sls\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631182 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-catalog-content\") pod \"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631222 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxcm9\" (UniqueName: \"kubernetes.io/projected/a21efe71-f4d3-4155-af2a-06ebc4646f14-kube-api-access-sxcm9\") 
pod \"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-utilities\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631289 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-catalog-content\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631329 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-utilities\") pod \"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.631503 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.13148934 +0000 UTC m=+49.402931810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631593 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-catalog-content\") pod \"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.631682 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-utilities\") pod \"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.651633 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxcm9\" (UniqueName: \"kubernetes.io/projected/a21efe71-f4d3-4155-af2a-06ebc4646f14-kube-api-access-sxcm9\") pod \"community-operators-k6bkf\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.674423 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.732798 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.732958 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.232934225 +0000 UTC m=+49.504376695 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.733296 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8sls\" (UniqueName: \"kubernetes.io/projected/563611ad-e721-4b19-85a3-f5a37eb965d3-kube-api-access-p8sls\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.733353 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.733378 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-utilities\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.733418 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-catalog-content\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.733669 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.233660605 +0000 UTC m=+49.505103075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.733840 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-catalog-content\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.733989 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-utilities\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.752426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8sls\" (UniqueName: \"kubernetes.io/projected/563611ad-e721-4b19-85a3-f5a37eb965d3-kube-api-access-p8sls\") pod \"certified-operators-h9pf5\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") " pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.753158 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d7msc"] Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.753986 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.762194 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.771368 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d7msc"] Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.834361 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.834593 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-utilities\") pod \"community-operators-d7msc\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.834614 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-catalog-content\") pod \"community-operators-d7msc\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.834649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjn67\" (UniqueName: \"kubernetes.io/projected/b2eb12a8-477f-462f-be65-0ee4bec101d3-kube-api-access-jjn67\") pod \"community-operators-d7msc\" (UID: 
\"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.834764 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.334750701 +0000 UTC m=+49.606193171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.859844 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.936970 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.937064 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-utilities\") pod \"community-operators-d7msc\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.937103 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-catalog-content\") pod \"community-operators-d7msc\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.937170 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjn67\" (UniqueName: \"kubernetes.io/projected/b2eb12a8-477f-462f-be65-0ee4bec101d3-kube-api-access-jjn67\") pod \"community-operators-d7msc\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: E0219 21:29:08.937919 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 21:29:09.437900971 +0000 UTC m=+49.709343481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.939493 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-utilities\") pod \"community-operators-d7msc\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.945077 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-catalog-content\") pod \"community-operators-d7msc\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.967781 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjn67\" (UniqueName: \"kubernetes.io/projected/b2eb12a8-477f-462f-be65-0ee4bec101d3-kube-api-access-jjn67\") pod \"community-operators-d7msc\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") " pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.967869 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Feb 19 21:29:08 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 19 21:29:08 crc kubenswrapper[4771]: [+]process-running ok Feb 19 21:29:08 crc kubenswrapper[4771]: healthz check failed Feb 19 21:29:08 crc kubenswrapper[4771]: I0219 21:29:08.968042 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.046706 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:09 crc kubenswrapper[4771]: E0219 21:29:09.047382 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.547367965 +0000 UTC m=+49.818810425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.063275 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6bkf"] Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.075554 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.080277 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" podUID="74547cb9-9778-4565-a709-7d29768bf23f" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" gracePeriod=30 Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.080426 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8jph" event={"ID":"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b","Type":"ContainerStarted","Data":"b9b72a0b094e3a1f541e53d6db651065956b2e156c9106455119142ddab421ae"} Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.080449 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8jph" event={"ID":"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b","Type":"ContainerStarted","Data":"53050ef840dcfa0af4831299bbfa928c7d515ab6f2d171c4b376ac965aa0b528"} Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.094731 4771 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state 
cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.148749 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:09 crc kubenswrapper[4771]: E0219 21:29:09.149039 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.649010985 +0000 UTC m=+49.920453455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.164552 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ctvtj"] Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.194677 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9pf5"] Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.249945 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:09 crc kubenswrapper[4771]: E0219 21:29:09.250121 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.75009004 +0000 UTC m=+50.021532510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.250264 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:09 crc kubenswrapper[4771]: E0219 21:29:09.253106 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.753091033 +0000 UTC m=+50.024533503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.294133 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgdkp" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.351448 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:09 crc kubenswrapper[4771]: E0219 21:29:09.351727 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.851712591 +0000 UTC m=+50.123155061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.420576 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d7msc"] Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.455594 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:09 crc kubenswrapper[4771]: E0219 21:29:09.455920 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:09.95590718 +0000 UTC m=+50.227349650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.557053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:09 crc kubenswrapper[4771]: E0219 21:29:09.557218 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:29:10.057200942 +0000 UTC m=+50.328643412 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.557273 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:09 crc kubenswrapper[4771]: E0219 21:29:09.557577 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:29:10.057569922 +0000 UTC m=+50.329012382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b4lf5" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.612068 4771 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T21:29:09.094747121Z","Handler":null,"Name":""} Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.623448 4771 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.623481 4771 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.658855 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.664662 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.760294 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.763888 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.763926 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.787278 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b4lf5\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.815578 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.963746 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:29:09 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 19 21:29:09 crc kubenswrapper[4771]: [+]process-running ok Feb 19 21:29:09 crc kubenswrapper[4771]: healthz check failed Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.963805 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.966314 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.967319 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.970617 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.973061 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 21:29:09 crc kubenswrapper[4771]: I0219 21:29:09.974375 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.064348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/178e55ea-2254-4bf3-9cba-da71eae52721-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"178e55ea-2254-4bf3-9cba-da71eae52721\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.064702 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/178e55ea-2254-4bf3-9cba-da71eae52721-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"178e55ea-2254-4bf3-9cba-da71eae52721\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.087724 4771 generic.go:334] "Generic (PLEG): container finished" podID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerID="393ddc06d07027a1dabd658f4c7fc978efbab3d835af865e8c919f3c4233b740" exitCode=0 Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.087771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9pf5" 
event={"ID":"563611ad-e721-4b19-85a3-f5a37eb965d3","Type":"ContainerDied","Data":"393ddc06d07027a1dabd658f4c7fc978efbab3d835af865e8c919f3c4233b740"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.087792 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9pf5" event={"ID":"563611ad-e721-4b19-85a3-f5a37eb965d3","Type":"ContainerStarted","Data":"0566e2ffaf3e76bef342e3f4e0515685bf415e4102882f56bd3f255cc64a2583"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.089775 4771 generic.go:334] "Generic (PLEG): container finished" podID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerID="5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d" exitCode=0 Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.089825 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7msc" event={"ID":"b2eb12a8-477f-462f-be65-0ee4bec101d3","Type":"ContainerDied","Data":"5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.089844 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7msc" event={"ID":"b2eb12a8-477f-462f-be65-0ee4bec101d3","Type":"ContainerStarted","Data":"8320ada589cc17308d11fffae8918bcfb1adb061853840db48732a3a387c6b72"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.090712 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.094473 4771 generic.go:334] "Generic (PLEG): container finished" podID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerID="f1c7e3616234a57fcf8d54c7bb1f54ae9ce707cfb01d5377290494ed1c6ac38c" exitCode=0 Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.094522 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6bkf" 
event={"ID":"a21efe71-f4d3-4155-af2a-06ebc4646f14","Type":"ContainerDied","Data":"f1c7e3616234a57fcf8d54c7bb1f54ae9ce707cfb01d5377290494ed1c6ac38c"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.094546 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6bkf" event={"ID":"a21efe71-f4d3-4155-af2a-06ebc4646f14","Type":"ContainerStarted","Data":"b6816599a945c21fa18578fc2365d74e9b8af2a5152e30d0c230e9f3cd23b05b"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.095717 4771 generic.go:334] "Generic (PLEG): container finished" podID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerID="9a93e820793880ba2f3b9e87a134060100400fecc8493dfe72809d83b70e38d0" exitCode=0 Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.095755 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctvtj" event={"ID":"7b38fbe5-f497-4deb-aaff-6d95b89a1783","Type":"ContainerDied","Data":"9a93e820793880ba2f3b9e87a134060100400fecc8493dfe72809d83b70e38d0"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.095768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctvtj" event={"ID":"7b38fbe5-f497-4deb-aaff-6d95b89a1783","Type":"ContainerStarted","Data":"7776d49f10461512ce74807265c66ac7d1b81b687a52cb29c5a31bf832056f2e"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.097724 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8jph" event={"ID":"6f6406cd-da81-49c4-b8d6-c2a7578b8b3b","Type":"ContainerStarted","Data":"16e28dd7337e3d1480e56011373bd14d9dd3c62631a0676324034978b0ae18bd"} Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.097745 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b4lf5"] Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.103658 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="06df3295-b598-4767-b2c1-6fe0a9dbaf37" containerID="2a98662daf41e5fbf0654b65df289550b89a916a8e22da75da3fee17f2c09438" exitCode=0 Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.103705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" event={"ID":"06df3295-b598-4767-b2c1-6fe0a9dbaf37","Type":"ContainerDied","Data":"2a98662daf41e5fbf0654b65df289550b89a916a8e22da75da3fee17f2c09438"} Feb 19 21:29:10 crc kubenswrapper[4771]: W0219 21:29:10.117095 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb0d9dfc_00bf_48fb_b62f_9267f59db52f.slice/crio-1ecd8ef9d63684c2f91451b565ea6b66ed9087d0c539194ee7b148c4580a0341 WatchSource:0}: Error finding container 1ecd8ef9d63684c2f91451b565ea6b66ed9087d0c539194ee7b148c4580a0341: Status 404 returned error can't find the container with id 1ecd8ef9d63684c2f91451b565ea6b66ed9087d0c539194ee7b148c4580a0341 Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.168859 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/178e55ea-2254-4bf3-9cba-da71eae52721-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"178e55ea-2254-4bf3-9cba-da71eae52721\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.168902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/178e55ea-2254-4bf3-9cba-da71eae52721-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"178e55ea-2254-4bf3-9cba-da71eae52721\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.169245 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/178e55ea-2254-4bf3-9cba-da71eae52721-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"178e55ea-2254-4bf3-9cba-da71eae52721\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.192979 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-h8jph" podStartSLOduration=11.192961579 podStartE2EDuration="11.192961579s" podCreationTimestamp="2026-02-19 21:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:10.188703403 +0000 UTC m=+50.460145873" watchObservedRunningTime="2026-02-19 21:29:10.192961579 +0000 UTC m=+50.464404049" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.194703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/178e55ea-2254-4bf3-9cba-da71eae52721-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"178e55ea-2254-4bf3-9cba-da71eae52721\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.292439 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.332716 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bkhxs"] Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.333680 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.336644 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.348044 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkhxs"] Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.446639 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.477917 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-catalog-content\") pod \"redhat-marketplace-bkhxs\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.478005 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmsb5\" (UniqueName: \"kubernetes.io/projected/52f892c4-4660-4b08-8102-ff370236d676-kube-api-access-fmsb5\") pod \"redhat-marketplace-bkhxs\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.478137 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-utilities\") pod \"redhat-marketplace-bkhxs\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.498661 
4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.579367 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-utilities\") pod \"redhat-marketplace-bkhxs\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.579480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-catalog-content\") pod \"redhat-marketplace-bkhxs\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.579499 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmsb5\" (UniqueName: \"kubernetes.io/projected/52f892c4-4660-4b08-8102-ff370236d676-kube-api-access-fmsb5\") pod \"redhat-marketplace-bkhxs\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.579965 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-utilities\") pod \"redhat-marketplace-bkhxs\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.580259 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-catalog-content\") pod \"redhat-marketplace-bkhxs\" (UID: 
\"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.603304 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmsb5\" (UniqueName: \"kubernetes.io/projected/52f892c4-4660-4b08-8102-ff370236d676-kube-api-access-fmsb5\") pod \"redhat-marketplace-bkhxs\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.675838 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.731282 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6n8n2"] Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.732433 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.745036 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n8n2"] Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.883810 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-catalog-content\") pod \"redhat-marketplace-6n8n2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.883850 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktsgr\" (UniqueName: \"kubernetes.io/projected/f9d83ad4-726c-486d-a763-b765c53a6cc2-kube-api-access-ktsgr\") pod \"redhat-marketplace-6n8n2\" (UID: 
\"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.883910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-utilities\") pod \"redhat-marketplace-6n8n2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.901439 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkhxs"] Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.968570 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:29:10 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 19 21:29:10 crc kubenswrapper[4771]: [+]process-running ok Feb 19 21:29:10 crc kubenswrapper[4771]: healthz check failed Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.968935 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.984606 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-utilities\") pod \"redhat-marketplace-6n8n2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.984695 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-catalog-content\") pod \"redhat-marketplace-6n8n2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.984719 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktsgr\" (UniqueName: \"kubernetes.io/projected/f9d83ad4-726c-486d-a763-b765c53a6cc2-kube-api-access-ktsgr\") pod \"redhat-marketplace-6n8n2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.985906 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-utilities\") pod \"redhat-marketplace-6n8n2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:10 crc kubenswrapper[4771]: I0219 21:29:10.987142 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-catalog-content\") pod \"redhat-marketplace-6n8n2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.003753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktsgr\" (UniqueName: \"kubernetes.io/projected/f9d83ad4-726c-486d-a763-b765c53a6cc2-kube-api-access-ktsgr\") pod \"redhat-marketplace-6n8n2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") " pod="openshift-marketplace/redhat-marketplace-6n8n2" Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.047531 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n8n2"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.126651 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" event={"ID":"bb0d9dfc-00bf-48fb-b62f-9267f59db52f","Type":"ContainerStarted","Data":"f8bb52494c87f5501671a411bb1f77da07f326741145af9bdd8d4cbd68d3dc4d"}
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.126701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" event={"ID":"bb0d9dfc-00bf-48fb-b62f-9267f59db52f","Type":"ContainerStarted","Data":"1ecd8ef9d63684c2f91451b565ea6b66ed9087d0c539194ee7b148c4580a0341"}
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.127123 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.129571 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"178e55ea-2254-4bf3-9cba-da71eae52721","Type":"ContainerStarted","Data":"4adc24b6b69830608d03da925b6aa5c14882d6a3946621b047ae9456c053e1a8"}
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.129602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"178e55ea-2254-4bf3-9cba-da71eae52721","Type":"ContainerStarted","Data":"9d6f70eac7e5ed39e6f1db758410554e14f99c07600c85a177efdc1483d06f11"}
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.132808 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkhxs" event={"ID":"52f892c4-4660-4b08-8102-ff370236d676","Type":"ContainerStarted","Data":"82caee848aaceeec512c04dac76fe3f45edd3e1a38db3618ea7e4db3596c66b0"}
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.132843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkhxs" event={"ID":"52f892c4-4660-4b08-8102-ff370236d676","Type":"ContainerStarted","Data":"ddd14b953115769a1b7e1bcf32bce3399cfe66dcb53760169bbb312d5c6ef8fa"}
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.146807 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" podStartSLOduration=29.146791294 podStartE2EDuration="29.146791294s" podCreationTimestamp="2026-02-19 21:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:11.146528297 +0000 UTC m=+51.417970777" watchObservedRunningTime="2026-02-19 21:29:11.146791294 +0000 UTC m=+51.418233764"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.188538 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.188517737 podStartE2EDuration="2.188517737s" podCreationTimestamp="2026-02-19 21:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:11.184409575 +0000 UTC m=+51.455852045" watchObservedRunningTime="2026-02-19 21:29:11.188517737 +0000 UTC m=+51.459960207"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.281724 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n8n2"]
Feb 19 21:29:11 crc kubenswrapper[4771]: W0219 21:29:11.304173 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d83ad4_726c_486d_a763_b765c53a6cc2.slice/crio-05237d2536a249e3343bf7432baafe4b555f0f6622522ab604b7d6e8de651938 WatchSource:0}: Error finding container 05237d2536a249e3343bf7432baafe4b555f0f6622522ab604b7d6e8de651938: Status 404 returned error can't find the container with id 05237d2536a249e3343bf7432baafe4b555f0f6622522ab604b7d6e8de651938
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.328923 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hp8qv"]
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.330885 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.333131 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.348523 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hp8qv"]
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.453065 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.492236 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rzv9\" (UniqueName: \"kubernetes.io/projected/18293c18-339b-4740-9828-5b4d07850142-kube-api-access-8rzv9\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.492382 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-catalog-content\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.492428 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-utilities\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.594085 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df3295-b598-4767-b2c1-6fe0a9dbaf37-secret-volume\") pod \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") "
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.594137 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df3295-b598-4767-b2c1-6fe0a9dbaf37-config-volume\") pod \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") "
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.594184 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjfsv\" (UniqueName: \"kubernetes.io/projected/06df3295-b598-4767-b2c1-6fe0a9dbaf37-kube-api-access-gjfsv\") pod \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\" (UID: \"06df3295-b598-4767-b2c1-6fe0a9dbaf37\") "
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.594345 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-catalog-content\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.594376 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-utilities\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.594459 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rzv9\" (UniqueName: \"kubernetes.io/projected/18293c18-339b-4740-9828-5b4d07850142-kube-api-access-8rzv9\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.595607 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06df3295-b598-4767-b2c1-6fe0a9dbaf37-config-volume" (OuterVolumeSpecName: "config-volume") pod "06df3295-b598-4767-b2c1-6fe0a9dbaf37" (UID: "06df3295-b598-4767-b2c1-6fe0a9dbaf37"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.596108 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-catalog-content\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.596240 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-utilities\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.599948 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06df3295-b598-4767-b2c1-6fe0a9dbaf37-kube-api-access-gjfsv" (OuterVolumeSpecName: "kube-api-access-gjfsv") pod "06df3295-b598-4767-b2c1-6fe0a9dbaf37" (UID: "06df3295-b598-4767-b2c1-6fe0a9dbaf37"). InnerVolumeSpecName "kube-api-access-gjfsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.600254 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06df3295-b598-4767-b2c1-6fe0a9dbaf37-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06df3295-b598-4767-b2c1-6fe0a9dbaf37" (UID: "06df3295-b598-4767-b2c1-6fe0a9dbaf37"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.608609 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rzv9\" (UniqueName: \"kubernetes.io/projected/18293c18-339b-4740-9828-5b4d07850142-kube-api-access-8rzv9\") pod \"redhat-operators-hp8qv\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.651268 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.695768 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06df3295-b598-4767-b2c1-6fe0a9dbaf37-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.695793 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06df3295-b598-4767-b2c1-6fe0a9dbaf37-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.695804 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjfsv\" (UniqueName: \"kubernetes.io/projected/06df3295-b598-4767-b2c1-6fe0a9dbaf37-kube-api-access-gjfsv\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.729233 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r8bw8"]
Feb 19 21:29:11 crc kubenswrapper[4771]: E0219 21:29:11.729444 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06df3295-b598-4767-b2c1-6fe0a9dbaf37" containerName="collect-profiles"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.729456 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="06df3295-b598-4767-b2c1-6fe0a9dbaf37" containerName="collect-profiles"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.729555 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="06df3295-b598-4767-b2c1-6fe0a9dbaf37" containerName="collect-profiles"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.730213 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.742565 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8bw8"]
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.796770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-catalog-content\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.796817 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-utilities\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.796837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f559\" (UniqueName: \"kubernetes.io/projected/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-kube-api-access-9f559\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.796891 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.796934 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.796954 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.798320 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.800203 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.800911 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.859473 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.862244 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hp8qv"]
Feb 19 21:29:11 crc kubenswrapper[4771]: W0219 21:29:11.865425 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18293c18_339b_4740_9828_5b4d07850142.slice/crio-95e55da8bc839af8a10f36b8019413d29da76febc3c8dd48e20d562ba5568431 WatchSource:0}: Error finding container 95e55da8bc839af8a10f36b8019413d29da76febc3c8dd48e20d562ba5568431: Status 404 returned error can't find the container with id 95e55da8bc839af8a10f36b8019413d29da76febc3c8dd48e20d562ba5568431
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.898473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.898576 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-catalog-content\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.898617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-utilities\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.898643 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f559\" (UniqueName: \"kubernetes.io/projected/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-kube-api-access-9f559\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.899119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-catalog-content\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.899144 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-utilities\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.902912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.916078 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f559\" (UniqueName: \"kubernetes.io/projected/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-kube-api-access-9f559\") pod \"redhat-operators-r8bw8\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") " pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.962475 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 21:29:11 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Feb 19 21:29:11 crc kubenswrapper[4771]: [+]process-running ok
Feb 19 21:29:11 crc kubenswrapper[4771]: healthz check failed
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.962703 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.964753 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:29:11 crc kubenswrapper[4771]: I0219 21:29:11.979169 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.043597 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.143076 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerID="d3b96ba6cbb67d54fa37b4ae349142b9357c87d21d36d830bb9d10dea3b3ef64" exitCode=0
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.143139 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n8n2" event={"ID":"f9d83ad4-726c-486d-a763-b765c53a6cc2","Type":"ContainerDied","Data":"d3b96ba6cbb67d54fa37b4ae349142b9357c87d21d36d830bb9d10dea3b3ef64"}
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.143170 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n8n2" event={"ID":"f9d83ad4-726c-486d-a763-b765c53a6cc2","Type":"ContainerStarted","Data":"05237d2536a249e3343bf7432baafe4b555f0f6622522ab604b7d6e8de651938"}
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.151548 4771 generic.go:334] "Generic (PLEG): container finished" podID="178e55ea-2254-4bf3-9cba-da71eae52721" containerID="4adc24b6b69830608d03da925b6aa5c14882d6a3946621b047ae9456c053e1a8" exitCode=0
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.151633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"178e55ea-2254-4bf3-9cba-da71eae52721","Type":"ContainerDied","Data":"4adc24b6b69830608d03da925b6aa5c14882d6a3946621b047ae9456c053e1a8"}
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.155701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm" event={"ID":"06df3295-b598-4767-b2c1-6fe0a9dbaf37","Type":"ContainerDied","Data":"e107dce0815c23036b7af4e780e62ed3c351d8c4ec09fd0cf2056b6ffb16454e"}
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.155734 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e107dce0815c23036b7af4e780e62ed3c351d8c4ec09fd0cf2056b6ffb16454e"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.155806 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.158363 4771 generic.go:334] "Generic (PLEG): container finished" podID="52f892c4-4660-4b08-8102-ff370236d676" containerID="82caee848aaceeec512c04dac76fe3f45edd3e1a38db3618ea7e4db3596c66b0" exitCode=0
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.158605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkhxs" event={"ID":"52f892c4-4660-4b08-8102-ff370236d676","Type":"ContainerDied","Data":"82caee848aaceeec512c04dac76fe3f45edd3e1a38db3618ea7e4db3596c66b0"}
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.165734 4771 generic.go:334] "Generic (PLEG): container finished" podID="18293c18-339b-4740-9828-5b4d07850142" containerID="0fecba4366d662ffb5ad6ca71a29decfd330d48a017718253fdd10d90e908d91" exitCode=0
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.166798 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8qv" event={"ID":"18293c18-339b-4740-9828-5b4d07850142","Type":"ContainerDied","Data":"0fecba4366d662ffb5ad6ca71a29decfd330d48a017718253fdd10d90e908d91"}
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.166845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8qv" event={"ID":"18293c18-339b-4740-9828-5b4d07850142","Type":"ContainerStarted","Data":"95e55da8bc839af8a10f36b8019413d29da76febc3c8dd48e20d562ba5568431"}
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.358857 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8bw8"]
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.462314 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.471253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jwglk"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.565208 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.565575 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-h7w47"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.565606 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-h7w47"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.577627 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.589128 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-lggcc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.589129 4771 patch_prober.go:28] interesting pod/console-f9d7485db-h7w47 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.589180 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lggcc" podUID="5c750565-2876-44fb-a7b7-4afd72a47b96" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.589132 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-lggcc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.589256 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-h7w47" podUID="67d4896e-f092-419b-971d-aac9dbaed6d0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.589315 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lggcc" podUID="5c750565-2876-44fb-a7b7-4afd72a47b96" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Feb 19 21:29:12 crc kubenswrapper[4771]: W0219 21:29:12.695566 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-2291d45e01763f5af1ef63bdd14465ed262cd78fe6ce89247864dd3a6269d6a0 WatchSource:0}: Error finding container 2291d45e01763f5af1ef63bdd14465ed262cd78fe6ce89247864dd3a6269d6a0: Status 404 returned error can't find the container with id 2291d45e01763f5af1ef63bdd14465ed262cd78fe6ce89247864dd3a6269d6a0
Feb 19 21:29:12 crc kubenswrapper[4771]: W0219 21:29:12.696454 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-be2384804b127853d508a663f35cd72e45a0e66e09400fe36b08102bea931942 WatchSource:0}: Error finding container be2384804b127853d508a663f35cd72e45a0e66e09400fe36b08102bea931942: Status 404 returned error can't find the container with id be2384804b127853d508a663f35cd72e45a0e66e09400fe36b08102bea931942
Feb 19 21:29:12 crc kubenswrapper[4771]: E0219 21:29:12.758412 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 19 21:29:12 crc kubenswrapper[4771]: E0219 21:29:12.792813 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 19 21:29:12 crc kubenswrapper[4771]: E0219 21:29:12.813455 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 19 21:29:12 crc kubenswrapper[4771]: E0219 21:29:12.813513 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" podUID="74547cb9-9778-4565-a709-7d29768bf23f" containerName="kube-multus-additional-cni-plugins"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.959406 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r4rmw"
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.969937 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 21:29:12 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Feb 19 21:29:12 crc kubenswrapper[4771]: [+]process-running ok
Feb 19 21:29:12 crc kubenswrapper[4771]: healthz check failed
Feb 19 21:29:12 crc kubenswrapper[4771]: I0219 21:29:12.970057 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.086095 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d"
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.176101 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e2ef35bac2cd1010d9d2a5cdc63ea90a809ffa4a9009a50f131d8be81ea088e8"}
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.176170 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"be2384804b127853d508a663f35cd72e45a0e66e09400fe36b08102bea931942"}
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.180282 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"96c4feede371fe130fd37590cf9348331277c41dc0fcc1903d01bbd3004f4c20"}
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.180348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"22d69a0982a56fe1a1b025b6ddfd00c6553869ced3c33be9744e6f7cf58c6a4e"}
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.181040 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.183296 4771 generic.go:334] "Generic (PLEG): container finished" podID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerID="2b79731b1a768d66171eae18a58606cebd19ddf07ff026bf76cb096e1eb644ad" exitCode=0
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.183341 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bw8" event={"ID":"6f263313-b6ee-4f44-9fb5-1a4a92d162ee","Type":"ContainerDied","Data":"2b79731b1a768d66171eae18a58606cebd19ddf07ff026bf76cb096e1eb644ad"}
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.183357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bw8" event={"ID":"6f263313-b6ee-4f44-9fb5-1a4a92d162ee","Type":"ContainerStarted","Data":"f18ece5d37c56a69ed323ac9a7e04247e82403ed4fecff43d239e05a5e49ff68"}
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.229601 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c0c81e90d50f7419de104459da98a1157596bbb3d8482234711594a3bcd07a3e"}
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.229642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2291d45e01763f5af1ef63bdd14465ed262cd78fe6ce89247864dd3a6269d6a0"}
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.801878 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.923073 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.938586 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.939467 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/178e55ea-2254-4bf3-9cba-da71eae52721-kubelet-dir\") pod \"178e55ea-2254-4bf3-9cba-da71eae52721\" (UID: \"178e55ea-2254-4bf3-9cba-da71eae52721\") "
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.939511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/178e55ea-2254-4bf3-9cba-da71eae52721-kube-api-access\") pod \"178e55ea-2254-4bf3-9cba-da71eae52721\" (UID: \"178e55ea-2254-4bf3-9cba-da71eae52721\") "
Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.939660 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/178e55ea-2254-4bf3-9cba-da71eae52721-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "178e55ea-2254-4bf3-9cba-da71eae52721" (UID: "178e55ea-2254-4bf3-9cba-da71eae52721"). InnerVolumeSpecName "kubelet-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.939841 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/178e55ea-2254-4bf3-9cba-da71eae52721-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.945424 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178e55ea-2254-4bf3-9cba-da71eae52721-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "178e55ea-2254-4bf3-9cba-da71eae52721" (UID: "178e55ea-2254-4bf3-9cba-da71eae52721"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.961766 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:29:13 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 19 21:29:13 crc kubenswrapper[4771]: [+]process-running ok Feb 19 21:29:13 crc kubenswrapper[4771]: healthz check failed Feb 19 21:29:13 crc kubenswrapper[4771]: I0219 21:29:13.961814 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.040999 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/178e55ea-2254-4bf3-9cba-da71eae52721-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.231403 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:29:14 crc kubenswrapper[4771]: E0219 21:29:14.231662 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178e55ea-2254-4bf3-9cba-da71eae52721" containerName="pruner" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.231679 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="178e55ea-2254-4bf3-9cba-da71eae52721" containerName="pruner" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.231787 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="178e55ea-2254-4bf3-9cba-da71eae52721" containerName="pruner" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.232146 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.237705 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.237850 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.240996 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.248843 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.255370 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"178e55ea-2254-4bf3-9cba-da71eae52721","Type":"ContainerDied","Data":"9d6f70eac7e5ed39e6f1db758410554e14f99c07600c85a177efdc1483d06f11"} Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.255426 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d6f70eac7e5ed39e6f1db758410554e14f99c07600c85a177efdc1483d06f11" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.271193 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.2711762 podStartE2EDuration="1.2711762s" podCreationTimestamp="2026-02-19 21:29:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:14.267004477 +0000 UTC m=+54.538446947" watchObservedRunningTime="2026-02-19 21:29:14.2711762 +0000 UTC m=+54.542618660" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.344723 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00912544-8a55-451c-a924-76aa6c2a00da-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"00912544-8a55-451c-a924-76aa6c2a00da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.344804 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00912544-8a55-451c-a924-76aa6c2a00da-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"00912544-8a55-451c-a924-76aa6c2a00da\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.445810 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00912544-8a55-451c-a924-76aa6c2a00da-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"00912544-8a55-451c-a924-76aa6c2a00da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.445882 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00912544-8a55-451c-a924-76aa6c2a00da-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"00912544-8a55-451c-a924-76aa6c2a00da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.445985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00912544-8a55-451c-a924-76aa6c2a00da-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"00912544-8a55-451c-a924-76aa6c2a00da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.481155 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00912544-8a55-451c-a924-76aa6c2a00da-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"00912544-8a55-451c-a924-76aa6c2a00da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.552657 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.730185 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:29:14 crc kubenswrapper[4771]: W0219 21:29:14.774825 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod00912544_8a55_451c_a924_76aa6c2a00da.slice/crio-5965bbcd5e9e48c1c9bbb790b0ef1981d6f6cc8a230e852165255938d5962958 WatchSource:0}: Error finding container 5965bbcd5e9e48c1c9bbb790b0ef1981d6f6cc8a230e852165255938d5962958: Status 404 returned error can't find the container with id 5965bbcd5e9e48c1c9bbb790b0ef1981d6f6cc8a230e852165255938d5962958 Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.963632 4771 patch_prober.go:28] interesting pod/router-default-5444994796-r4rmw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:29:14 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 19 21:29:14 crc kubenswrapper[4771]: [+]process-running ok Feb 19 21:29:14 crc kubenswrapper[4771]: healthz check failed Feb 19 21:29:14 crc kubenswrapper[4771]: I0219 21:29:14.963966 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4rmw" podUID="8f986201-78c2-4e70-bad2-86a6f1fd68b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:29:15 crc kubenswrapper[4771]: I0219 21:29:15.314686 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"00912544-8a55-451c-a924-76aa6c2a00da","Type":"ContainerStarted","Data":"5965bbcd5e9e48c1c9bbb790b0ef1981d6f6cc8a230e852165255938d5962958"} Feb 19 21:29:15 crc kubenswrapper[4771]: I0219 21:29:15.963801 4771 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:15 crc kubenswrapper[4771]: I0219 21:29:15.967672 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r4rmw" Feb 19 21:29:16 crc kubenswrapper[4771]: I0219 21:29:16.341496 4771 generic.go:334] "Generic (PLEG): container finished" podID="00912544-8a55-451c-a924-76aa6c2a00da" containerID="997dd127fea4c8ee7f122c6fa558050775e50beef9f576f95c564eb8e32167cd" exitCode=0 Feb 19 21:29:16 crc kubenswrapper[4771]: I0219 21:29:16.341543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"00912544-8a55-451c-a924-76aa6c2a00da","Type":"ContainerDied","Data":"997dd127fea4c8ee7f122c6fa558050775e50beef9f576f95c564eb8e32167cd"} Feb 19 21:29:18 crc kubenswrapper[4771]: I0219 21:29:18.049558 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fp5bm" Feb 19 21:29:21 crc kubenswrapper[4771]: I0219 21:29:21.020377 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:29:22 crc kubenswrapper[4771]: I0219 21:29:22.593229 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:22 crc kubenswrapper[4771]: I0219 21:29:22.596683 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:29:22 crc kubenswrapper[4771]: I0219 21:29:22.599297 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lggcc" Feb 19 21:29:22 crc kubenswrapper[4771]: E0219 21:29:22.755366 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:22 crc kubenswrapper[4771]: E0219 21:29:22.765725 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:22 crc kubenswrapper[4771]: E0219 21:29:22.768849 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:22 crc kubenswrapper[4771]: E0219 21:29:22.768931 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" podUID="74547cb9-9778-4565-a709-7d29768bf23f" containerName="kube-multus-additional-cni-plugins" Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.421619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"00912544-8a55-451c-a924-76aa6c2a00da","Type":"ContainerDied","Data":"5965bbcd5e9e48c1c9bbb790b0ef1981d6f6cc8a230e852165255938d5962958"} Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.421918 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5965bbcd5e9e48c1c9bbb790b0ef1981d6f6cc8a230e852165255938d5962958" Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.458742 4771 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.532673 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00912544-8a55-451c-a924-76aa6c2a00da-kube-api-access\") pod \"00912544-8a55-451c-a924-76aa6c2a00da\" (UID: \"00912544-8a55-451c-a924-76aa6c2a00da\") " Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.532728 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00912544-8a55-451c-a924-76aa6c2a00da-kubelet-dir\") pod \"00912544-8a55-451c-a924-76aa6c2a00da\" (UID: \"00912544-8a55-451c-a924-76aa6c2a00da\") " Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.532889 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00912544-8a55-451c-a924-76aa6c2a00da-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "00912544-8a55-451c-a924-76aa6c2a00da" (UID: "00912544-8a55-451c-a924-76aa6c2a00da"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.533075 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00912544-8a55-451c-a924-76aa6c2a00da-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.546172 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00912544-8a55-451c-a924-76aa6c2a00da-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "00912544-8a55-451c-a924-76aa6c2a00da" (UID: "00912544-8a55-451c-a924-76aa6c2a00da"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:29:25 crc kubenswrapper[4771]: I0219 21:29:25.634599 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00912544-8a55-451c-a924-76aa6c2a00da-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:29:26 crc kubenswrapper[4771]: I0219 21:29:26.427071 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:29:29 crc kubenswrapper[4771]: I0219 21:29:29.819960 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:29:32 crc kubenswrapper[4771]: E0219 21:29:32.747691 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:32 crc kubenswrapper[4771]: E0219 21:29:32.749884 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:32 crc kubenswrapper[4771]: E0219 21:29:32.752103 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:32 crc kubenswrapper[4771]: E0219 21:29:32.752164 4771 prober.go:104] "Probe errored" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" podUID="74547cb9-9778-4565-a709-7d29768bf23f" containerName="kube-multus-additional-cni-plugins" Feb 19 21:29:39 crc kubenswrapper[4771]: I0219 21:29:39.514756 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-jb2kp_74547cb9-9778-4565-a709-7d29768bf23f/kube-multus-additional-cni-plugins/0.log" Feb 19 21:29:39 crc kubenswrapper[4771]: I0219 21:29:39.516577 4771 generic.go:334] "Generic (PLEG): container finished" podID="74547cb9-9778-4565-a709-7d29768bf23f" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" exitCode=137 Feb 19 21:29:39 crc kubenswrapper[4771]: I0219 21:29:39.516642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" event={"ID":"74547cb9-9778-4565-a709-7d29768bf23f","Type":"ContainerDied","Data":"b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d"} Feb 19 21:29:41 crc kubenswrapper[4771]: E0219 21:29:41.114896 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 19 21:29:41 crc kubenswrapper[4771]: E0219 21:29:41.115468 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9f559,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-r8bw8_openshift-marketplace(6f263313-b6ee-4f44-9fb5-1a4a92d162ee): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 21:29:41 crc kubenswrapper[4771]: E0219 21:29:41.116765 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-r8bw8" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" Feb 19 21:29:42 crc 
kubenswrapper[4771]: I0219 21:29:42.684644 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4pp7p" Feb 19 21:29:42 crc kubenswrapper[4771]: E0219 21:29:42.744333 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d is running failed: container process not found" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:42 crc kubenswrapper[4771]: E0219 21:29:42.745062 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d is running failed: container process not found" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:42 crc kubenswrapper[4771]: E0219 21:29:42.745797 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d is running failed: container process not found" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 19 21:29:42 crc kubenswrapper[4771]: E0219 21:29:42.745852 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" podUID="74547cb9-9778-4565-a709-7d29768bf23f" 
containerName="kube-multus-additional-cni-plugins" Feb 19 21:29:43 crc kubenswrapper[4771]: E0219 21:29:43.474584 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-r8bw8" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" Feb 19 21:29:43 crc kubenswrapper[4771]: E0219 21:29:43.711349 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 19 21:29:43 crc kubenswrapper[4771]: E0219 21:29:43.711633 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8sls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-h9pf5_openshift-marketplace(563611ad-e721-4b19-85a3-f5a37eb965d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 21:29:43 crc kubenswrapper[4771]: E0219 21:29:43.713164 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-h9pf5" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" Feb 19 21:29:48 crc 
kubenswrapper[4771]: E0219 21:29:48.598325 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-h9pf5" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" Feb 19 21:29:51 crc kubenswrapper[4771]: E0219 21:29:51.430348 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 21:29:51 crc kubenswrapper[4771]: E0219 21:29:51.430522 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jjn67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d7msc_openshift-marketplace(b2eb12a8-477f-462f-be65-0ee4bec101d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 21:29:51 crc kubenswrapper[4771]: E0219 21:29:51.431855 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d7msc" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3"
Feb 19 21:29:51 crc kubenswrapper[4771]: E0219 21:29:51.839474 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d7msc" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3"
Feb 19 21:29:51 crc kubenswrapper[4771]: I0219 21:29:51.986721 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.239475 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 21:29:52 crc kubenswrapper[4771]: E0219 21:29:52.239814 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00912544-8a55-451c-a924-76aa6c2a00da" containerName="pruner"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.239834 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="00912544-8a55-451c-a924-76aa6c2a00da" containerName="pruner"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.240057 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="00912544-8a55-451c-a924-76aa6c2a00da" containerName="pruner"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.240577 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.244645 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.245588 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.251139 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.338863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52a6bfad-69b9-476d-a34f-35df582b3951-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52a6bfad-69b9-476d-a34f-35df582b3951\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.339357 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52a6bfad-69b9-476d-a34f-35df582b3951-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52a6bfad-69b9-476d-a34f-35df582b3951\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.440629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52a6bfad-69b9-476d-a34f-35df582b3951-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52a6bfad-69b9-476d-a34f-35df582b3951\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.440771 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52a6bfad-69b9-476d-a34f-35df582b3951-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52a6bfad-69b9-476d-a34f-35df582b3951\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.441270 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52a6bfad-69b9-476d-a34f-35df582b3951-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52a6bfad-69b9-476d-a34f-35df582b3951\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.463723 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52a6bfad-69b9-476d-a34f-35df582b3951-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52a6bfad-69b9-476d-a34f-35df582b3951\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:52 crc kubenswrapper[4771]: I0219 21:29:52.559560 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:52 crc kubenswrapper[4771]: E0219 21:29:52.744496 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d is running failed: container process not found" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 19 21:29:52 crc kubenswrapper[4771]: E0219 21:29:52.745230 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d is running failed: container process not found" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 19 21:29:52 crc kubenswrapper[4771]: E0219 21:29:52.745615 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d is running failed: container process not found" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 19 21:29:52 crc kubenswrapper[4771]: E0219 21:29:52.745702 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" podUID="74547cb9-9778-4565-a709-7d29768bf23f" containerName="kube-multus-additional-cni-plugins"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.177390 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-jb2kp_74547cb9-9778-4565-a709-7d29768bf23f/kube-multus-additional-cni-plugins/0.log"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.177458 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.252294 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/74547cb9-9778-4565-a709-7d29768bf23f-ready\") pod \"74547cb9-9778-4565-a709-7d29768bf23f\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") "
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.253131 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6dh9\" (UniqueName: \"kubernetes.io/projected/74547cb9-9778-4565-a709-7d29768bf23f-kube-api-access-c6dh9\") pod \"74547cb9-9778-4565-a709-7d29768bf23f\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") "
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.253166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74547cb9-9778-4565-a709-7d29768bf23f-cni-sysctl-allowlist\") pod \"74547cb9-9778-4565-a709-7d29768bf23f\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") "
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.252952 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.253415 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ktsgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6n8n2_openshift-marketplace(f9d83ad4-726c-486d-a763-b765c53a6cc2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.253443 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74547cb9-9778-4565-a709-7d29768bf23f-tuning-conf-dir\") pod \"74547cb9-9778-4565-a709-7d29768bf23f\" (UID: \"74547cb9-9778-4565-a709-7d29768bf23f\") "
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.253583 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74547cb9-9778-4565-a709-7d29768bf23f-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "74547cb9-9778-4565-a709-7d29768bf23f" (UID: "74547cb9-9778-4565-a709-7d29768bf23f"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.253855 4771 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74547cb9-9778-4565-a709-7d29768bf23f-tuning-conf-dir\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.253924 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74547cb9-9778-4565-a709-7d29768bf23f-ready" (OuterVolumeSpecName: "ready") pod "74547cb9-9778-4565-a709-7d29768bf23f" (UID: "74547cb9-9778-4565-a709-7d29768bf23f"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.254728 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6n8n2" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.257455 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74547cb9-9778-4565-a709-7d29768bf23f-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "74547cb9-9778-4565-a709-7d29768bf23f" (UID: "74547cb9-9778-4565-a709-7d29768bf23f"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.265329 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.265516 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxcm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-k6bkf_openshift-marketplace(a21efe71-f4d3-4155-af2a-06ebc4646f14): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.266827 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-k6bkf" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.267675 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74547cb9-9778-4565-a709-7d29768bf23f-kube-api-access-c6dh9" (OuterVolumeSpecName: "kube-api-access-c6dh9") pod "74547cb9-9778-4565-a709-7d29768bf23f" (UID: "74547cb9-9778-4565-a709-7d29768bf23f"). InnerVolumeSpecName "kube-api-access-c6dh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.355460 4771 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/74547cb9-9778-4565-a709-7d29768bf23f-ready\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.355495 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6dh9\" (UniqueName: \"kubernetes.io/projected/74547cb9-9778-4565-a709-7d29768bf23f-kube-api-access-c6dh9\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.355508 4771 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74547cb9-9778-4565-a709-7d29768bf23f-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.360967 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.361140 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvvvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ctvtj_openshift-marketplace(7b38fbe5-f497-4deb-aaff-6d95b89a1783): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.362304 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ctvtj" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.427390 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.427611 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmsb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bkhxs_openshift-marketplace(52f892c4-4660-4b08-8102-ff370236d676): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.428858 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bkhxs" podUID="52f892c4-4660-4b08-8102-ff370236d676"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.454836 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.569515 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.569813 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8rzv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hp8qv_openshift-marketplace(18293c18-339b-4740-9828-5b4d07850142): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.572719 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hp8qv" podUID="18293c18-339b-4740-9828-5b4d07850142"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.617866 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-jb2kp_74547cb9-9778-4565-a709-7d29768bf23f/kube-multus-additional-cni-plugins/0.log"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.619067 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.619756 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-jb2kp" event={"ID":"74547cb9-9778-4565-a709-7d29768bf23f","Type":"ContainerDied","Data":"dd68f560c3d46182328e8676a85e23355de23a9cb55fb154c87484060f2987c6"}
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.619848 4771 scope.go:117] "RemoveContainer" containerID="b63d0ce89a00ef41636c100f50e57cb42fefdfca11be2dfbfc34f88dc7cd6c4d"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.621773 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.624651 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6n8n2" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.625212 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bkhxs" podUID="52f892c4-4660-4b08-8102-ff370236d676"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.626732 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-k6bkf" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.626858 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hp8qv" podUID="18293c18-339b-4740-9828-5b4d07850142"
Feb 19 21:29:53 crc kubenswrapper[4771]: E0219 21:29:53.626875 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ctvtj" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783"
Feb 19 21:29:53 crc kubenswrapper[4771]: W0219 21:29:53.634308 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod52a6bfad_69b9_476d_a34f_35df582b3951.slice/crio-dc9e830f5db2e8995fd3cc57795df4cedf5733e91ef85281e4b2f95240cc7058 WatchSource:0}: Error finding container dc9e830f5db2e8995fd3cc57795df4cedf5733e91ef85281e4b2f95240cc7058: Status 404 returned error can't find the container with id dc9e830f5db2e8995fd3cc57795df4cedf5733e91ef85281e4b2f95240cc7058
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.727672 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.727652162 podStartE2EDuration="727.652162ms" podCreationTimestamp="2026-02-19 21:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:53.721324939 +0000 UTC m=+93.992767439" watchObservedRunningTime="2026-02-19 21:29:53.727652162 +0000 UTC m=+93.999094642"
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.784944 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jb2kp"]
Feb 19 21:29:53 crc kubenswrapper[4771]: I0219 21:29:53.792540 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-jb2kp"]
Feb 19 21:29:54 crc kubenswrapper[4771]: I0219 21:29:54.444899 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74547cb9-9778-4565-a709-7d29768bf23f" path="/var/lib/kubelet/pods/74547cb9-9778-4565-a709-7d29768bf23f/volumes"
Feb 19 21:29:54 crc kubenswrapper[4771]: I0219 21:29:54.623675 4771 generic.go:334] "Generic (PLEG): container finished" podID="52a6bfad-69b9-476d-a34f-35df582b3951" containerID="78b81515e32f5f316a77d611698ac9ba8f6f60f7dcf4680dc54f9c3f3efe9f8d" exitCode=0
Feb 19 21:29:54 crc kubenswrapper[4771]: I0219 21:29:54.623742 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"52a6bfad-69b9-476d-a34f-35df582b3951","Type":"ContainerDied","Data":"78b81515e32f5f316a77d611698ac9ba8f6f60f7dcf4680dc54f9c3f3efe9f8d"}
Feb 19 21:29:54 crc kubenswrapper[4771]: I0219 21:29:54.623769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"52a6bfad-69b9-476d-a34f-35df582b3951","Type":"ContainerStarted","Data":"dc9e830f5db2e8995fd3cc57795df4cedf5733e91ef85281e4b2f95240cc7058"}
Feb 19 21:29:55 crc kubenswrapper[4771]: I0219 21:29:55.922126 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:55 crc kubenswrapper[4771]: I0219 21:29:55.993208 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52a6bfad-69b9-476d-a34f-35df582b3951-kubelet-dir\") pod \"52a6bfad-69b9-476d-a34f-35df582b3951\" (UID: \"52a6bfad-69b9-476d-a34f-35df582b3951\") "
Feb 19 21:29:55 crc kubenswrapper[4771]: I0219 21:29:55.993287 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52a6bfad-69b9-476d-a34f-35df582b3951-kube-api-access\") pod \"52a6bfad-69b9-476d-a34f-35df582b3951\" (UID: \"52a6bfad-69b9-476d-a34f-35df582b3951\") "
Feb 19 21:29:55 crc kubenswrapper[4771]: I0219 21:29:55.993331 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52a6bfad-69b9-476d-a34f-35df582b3951-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "52a6bfad-69b9-476d-a34f-35df582b3951" (UID: "52a6bfad-69b9-476d-a34f-35df582b3951"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:29:55 crc kubenswrapper[4771]: I0219 21:29:55.993561 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52a6bfad-69b9-476d-a34f-35df582b3951-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:56 crc kubenswrapper[4771]: I0219 21:29:56.002536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a6bfad-69b9-476d-a34f-35df582b3951-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "52a6bfad-69b9-476d-a34f-35df582b3951" (UID: "52a6bfad-69b9-476d-a34f-35df582b3951"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:29:56 crc kubenswrapper[4771]: I0219 21:29:56.094741 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52a6bfad-69b9-476d-a34f-35df582b3951-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 21:29:56 crc kubenswrapper[4771]: I0219 21:29:56.637215 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"52a6bfad-69b9-476d-a34f-35df582b3951","Type":"ContainerDied","Data":"dc9e830f5db2e8995fd3cc57795df4cedf5733e91ef85281e4b2f95240cc7058"}
Feb 19 21:29:56 crc kubenswrapper[4771]: I0219 21:29:56.637830 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc9e830f5db2e8995fd3cc57795df4cedf5733e91ef85281e4b2f95240cc7058"
Feb 19 21:29:56 crc kubenswrapper[4771]: I0219 21:29:56.637317 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.436294 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 19 21:29:59 crc kubenswrapper[4771]: E0219 21:29:59.437753 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74547cb9-9778-4565-a709-7d29768bf23f" containerName="kube-multus-additional-cni-plugins"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.437833 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="74547cb9-9778-4565-a709-7d29768bf23f" containerName="kube-multus-additional-cni-plugins"
Feb 19 21:29:59 crc kubenswrapper[4771]: E0219 21:29:59.437901 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a6bfad-69b9-476d-a34f-35df582b3951" containerName="pruner"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.437956 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a6bfad-69b9-476d-a34f-35df582b3951" containerName="pruner"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.438150 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="74547cb9-9778-4565-a709-7d29768bf23f" containerName="kube-multus-additional-cni-plugins"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.438230 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a6bfad-69b9-476d-a34f-35df582b3951" containerName="pruner"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.438718 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.441360 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.441702 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.458704 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.540558 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.540624 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-var-lock\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 21:29:59 crc
kubenswrapper[4771]: I0219 21:29:59.540683 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5777718c-5a56-48a9-84bd-568b6a9d5346-kube-api-access\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.642485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.642812 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.642973 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-var-lock\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.643262 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5777718c-5a56-48a9-84bd-568b6a9d5346-kube-api-access\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.643001 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-var-lock\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.665612 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5777718c-5a56-48a9-84bd-568b6a9d5346-kube-api-access\") pod \"installer-9-crc\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:29:59 crc kubenswrapper[4771]: I0219 21:29:59.829940 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.080321 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.137602 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7"] Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.138377 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.140077 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.140632 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.151401 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7"] Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.251596 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-secret-volume\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.251957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-config-volume\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.251983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzvp\" (UniqueName: \"kubernetes.io/projected/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-kube-api-access-qkzvp\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.352666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-config-volume\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.352723 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzvp\" (UniqueName: \"kubernetes.io/projected/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-kube-api-access-qkzvp\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.352798 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-secret-volume\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.354227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-config-volume\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.469527 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzvp\" (UniqueName: 
\"kubernetes.io/projected/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-kube-api-access-qkzvp\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.471148 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-secret-volume\") pod \"collect-profiles-29525610-kggq7\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.669543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5777718c-5a56-48a9-84bd-568b6a9d5346","Type":"ContainerStarted","Data":"e166c9bf88f47fb3752f0e05919527b611303651c9b02ff7d149fc0fa8189fd4"} Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.758224 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:00 crc kubenswrapper[4771]: I0219 21:30:00.982291 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7"] Feb 19 21:30:01 crc kubenswrapper[4771]: W0219 21:30:01.372976 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8aebf05_d80b_4528_bc5e_4c7e158cc7e0.slice/crio-db42230b3813e5f3cd9575bfed4a270c2b851a3b63f83d7ed2be7c6154ab97ab WatchSource:0}: Error finding container db42230b3813e5f3cd9575bfed4a270c2b851a3b63f83d7ed2be7c6154ab97ab: Status 404 returned error can't find the container with id db42230b3813e5f3cd9575bfed4a270c2b851a3b63f83d7ed2be7c6154ab97ab Feb 19 21:30:01 crc kubenswrapper[4771]: I0219 21:30:01.675912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" event={"ID":"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0","Type":"ContainerStarted","Data":"eb5c38b7ec2ab3f2b06b6d44c4775348682ced0b07893cd77c09bb92e55436e6"} Feb 19 21:30:01 crc kubenswrapper[4771]: I0219 21:30:01.676220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" event={"ID":"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0","Type":"ContainerStarted","Data":"db42230b3813e5f3cd9575bfed4a270c2b851a3b63f83d7ed2be7c6154ab97ab"} Feb 19 21:30:01 crc kubenswrapper[4771]: I0219 21:30:01.678428 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5777718c-5a56-48a9-84bd-568b6a9d5346","Type":"ContainerStarted","Data":"ccdf3b1019487f3de57ed50ca997afaab5e92871f6a7bd54fb111c6cc7c22040"} Feb 19 21:30:01 crc kubenswrapper[4771]: I0219 21:30:01.680203 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-h9pf5" event={"ID":"563611ad-e721-4b19-85a3-f5a37eb965d3","Type":"ContainerStarted","Data":"fbc91517ea4f64c91a53c6663a240f58e96ede7e91ef369b92836105d1c250d0"} Feb 19 21:30:01 crc kubenswrapper[4771]: I0219 21:30:01.681954 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bw8" event={"ID":"6f263313-b6ee-4f44-9fb5-1a4a92d162ee","Type":"ContainerStarted","Data":"14dd9f452d3792e5e098b13e35a058896c1ae6e8184ee1b2a88cbdd6b825185b"} Feb 19 21:30:01 crc kubenswrapper[4771]: I0219 21:30:01.747343 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" podStartSLOduration=1.747322059 podStartE2EDuration="1.747322059s" podCreationTimestamp="2026-02-19 21:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:01.722210025 +0000 UTC m=+101.993652515" watchObservedRunningTime="2026-02-19 21:30:01.747322059 +0000 UTC m=+102.018764529" Feb 19 21:30:01 crc kubenswrapper[4771]: I0219 21:30:01.767126 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.767105188 podStartE2EDuration="2.767105188s" podCreationTimestamp="2026-02-19 21:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:01.766343807 +0000 UTC m=+102.037786277" watchObservedRunningTime="2026-02-19 21:30:01.767105188 +0000 UTC m=+102.038547668" Feb 19 21:30:02 crc kubenswrapper[4771]: I0219 21:30:02.690094 4771 generic.go:334] "Generic (PLEG): container finished" podID="a8aebf05-d80b-4528-bc5e-4c7e158cc7e0" containerID="eb5c38b7ec2ab3f2b06b6d44c4775348682ced0b07893cd77c09bb92e55436e6" exitCode=0 Feb 19 21:30:02 crc kubenswrapper[4771]: I0219 
21:30:02.690199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" event={"ID":"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0","Type":"ContainerDied","Data":"eb5c38b7ec2ab3f2b06b6d44c4775348682ced0b07893cd77c09bb92e55436e6"} Feb 19 21:30:02 crc kubenswrapper[4771]: I0219 21:30:02.694824 4771 generic.go:334] "Generic (PLEG): container finished" podID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerID="fbc91517ea4f64c91a53c6663a240f58e96ede7e91ef369b92836105d1c250d0" exitCode=0 Feb 19 21:30:02 crc kubenswrapper[4771]: I0219 21:30:02.694903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9pf5" event={"ID":"563611ad-e721-4b19-85a3-f5a37eb965d3","Type":"ContainerDied","Data":"fbc91517ea4f64c91a53c6663a240f58e96ede7e91ef369b92836105d1c250d0"} Feb 19 21:30:02 crc kubenswrapper[4771]: I0219 21:30:02.697148 4771 generic.go:334] "Generic (PLEG): container finished" podID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerID="14dd9f452d3792e5e098b13e35a058896c1ae6e8184ee1b2a88cbdd6b825185b" exitCode=0 Feb 19 21:30:02 crc kubenswrapper[4771]: I0219 21:30:02.697635 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bw8" event={"ID":"6f263313-b6ee-4f44-9fb5-1a4a92d162ee","Type":"ContainerDied","Data":"14dd9f452d3792e5e098b13e35a058896c1ae6e8184ee1b2a88cbdd6b825185b"} Feb 19 21:30:03 crc kubenswrapper[4771]: I0219 21:30:03.720411 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9pf5" event={"ID":"563611ad-e721-4b19-85a3-f5a37eb965d3","Type":"ContainerStarted","Data":"99ef029be5c2b010234857f401350e225353aef24a8e9cd3a79b53fe644ee1cb"} Feb 19 21:30:03 crc kubenswrapper[4771]: I0219 21:30:03.724488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bw8" 
event={"ID":"6f263313-b6ee-4f44-9fb5-1a4a92d162ee","Type":"ContainerStarted","Data":"c5fbe00cd92b09375e6e66e7ff0ea5b8f96bbbb50cfb0e2ab3ca680ac0fc27e4"} Feb 19 21:30:03 crc kubenswrapper[4771]: I0219 21:30:03.740061 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h9pf5" podStartSLOduration=2.729338933 podStartE2EDuration="55.740044504s" podCreationTimestamp="2026-02-19 21:29:08 +0000 UTC" firstStartedPulling="2026-02-19 21:29:10.090460655 +0000 UTC m=+50.361903115" lastFinishedPulling="2026-02-19 21:30:03.101166206 +0000 UTC m=+103.372608686" observedRunningTime="2026-02-19 21:30:03.738576234 +0000 UTC m=+104.010018704" watchObservedRunningTime="2026-02-19 21:30:03.740044504 +0000 UTC m=+104.011486984" Feb 19 21:30:03 crc kubenswrapper[4771]: I0219 21:30:03.752154 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r8bw8" podStartSLOduration=2.80161596 podStartE2EDuration="52.752138343s" podCreationTimestamp="2026-02-19 21:29:11 +0000 UTC" firstStartedPulling="2026-02-19 21:29:13.185195365 +0000 UTC m=+53.456637835" lastFinishedPulling="2026-02-19 21:30:03.135717748 +0000 UTC m=+103.407160218" observedRunningTime="2026-02-19 21:30:03.751887716 +0000 UTC m=+104.023330186" watchObservedRunningTime="2026-02-19 21:30:03.752138343 +0000 UTC m=+104.023580813" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.007756 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.099914 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-config-volume\") pod \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.099967 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-secret-volume\") pod \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.100029 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzvp\" (UniqueName: \"kubernetes.io/projected/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-kube-api-access-qkzvp\") pod \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\" (UID: \"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0\") " Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.100896 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-config-volume" (OuterVolumeSpecName: "config-volume") pod "a8aebf05-d80b-4528-bc5e-4c7e158cc7e0" (UID: "a8aebf05-d80b-4528-bc5e-4c7e158cc7e0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.104835 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a8aebf05-d80b-4528-bc5e-4c7e158cc7e0" (UID: "a8aebf05-d80b-4528-bc5e-4c7e158cc7e0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.106128 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-kube-api-access-qkzvp" (OuterVolumeSpecName: "kube-api-access-qkzvp") pod "a8aebf05-d80b-4528-bc5e-4c7e158cc7e0" (UID: "a8aebf05-d80b-4528-bc5e-4c7e158cc7e0"). InnerVolumeSpecName "kube-api-access-qkzvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.201161 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.201211 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.201224 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzvp\" (UniqueName: \"kubernetes.io/projected/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0-kube-api-access-qkzvp\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.730305 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" event={"ID":"a8aebf05-d80b-4528-bc5e-4c7e158cc7e0","Type":"ContainerDied","Data":"db42230b3813e5f3cd9575bfed4a270c2b851a3b63f83d7ed2be7c6154ab97ab"} Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.730342 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db42230b3813e5f3cd9575bfed4a270c2b851a3b63f83d7ed2be7c6154ab97ab" Feb 19 21:30:04 crc kubenswrapper[4771]: I0219 21:30:04.730391 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7" Feb 19 21:30:06 crc kubenswrapper[4771]: I0219 21:30:06.746068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7msc" event={"ID":"b2eb12a8-477f-462f-be65-0ee4bec101d3","Type":"ContainerStarted","Data":"339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8"} Feb 19 21:30:06 crc kubenswrapper[4771]: I0219 21:30:06.750725 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6bkf" event={"ID":"a21efe71-f4d3-4155-af2a-06ebc4646f14","Type":"ContainerStarted","Data":"fff4dad894c6a38d885e4c46da7242e1e63ec9f50304814a4ade86f0487f2bd4"} Feb 19 21:30:06 crc kubenswrapper[4771]: I0219 21:30:06.753564 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8qv" event={"ID":"18293c18-339b-4740-9828-5b4d07850142","Type":"ContainerStarted","Data":"4ff5a45f9c34255904272abb688c2adb12be0f4089ff761093886e196d2e5661"} Feb 19 21:30:07 crc kubenswrapper[4771]: I0219 21:30:07.759887 4771 generic.go:334] "Generic (PLEG): container finished" podID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerID="339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8" exitCode=0 Feb 19 21:30:07 crc kubenswrapper[4771]: I0219 21:30:07.759965 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7msc" event={"ID":"b2eb12a8-477f-462f-be65-0ee4bec101d3","Type":"ContainerDied","Data":"339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8"} Feb 19 21:30:07 crc kubenswrapper[4771]: I0219 21:30:07.761507 4771 generic.go:334] "Generic (PLEG): container finished" podID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerID="fff4dad894c6a38d885e4c46da7242e1e63ec9f50304814a4ade86f0487f2bd4" exitCode=0 Feb 19 21:30:07 crc kubenswrapper[4771]: I0219 21:30:07.761566 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6bkf" event={"ID":"a21efe71-f4d3-4155-af2a-06ebc4646f14","Type":"ContainerDied","Data":"fff4dad894c6a38d885e4c46da7242e1e63ec9f50304814a4ade86f0487f2bd4"} Feb 19 21:30:07 crc kubenswrapper[4771]: I0219 21:30:07.764475 4771 generic.go:334] "Generic (PLEG): container finished" podID="18293c18-339b-4740-9828-5b4d07850142" containerID="4ff5a45f9c34255904272abb688c2adb12be0f4089ff761093886e196d2e5661" exitCode=0 Feb 19 21:30:07 crc kubenswrapper[4771]: I0219 21:30:07.764527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8qv" event={"ID":"18293c18-339b-4740-9828-5b4d07850142","Type":"ContainerDied","Data":"4ff5a45f9c34255904272abb688c2adb12be0f4089ff761093886e196d2e5661"} Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.771107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7msc" event={"ID":"b2eb12a8-477f-462f-be65-0ee4bec101d3","Type":"ContainerStarted","Data":"1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef"} Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.773766 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6bkf" event={"ID":"a21efe71-f4d3-4155-af2a-06ebc4646f14","Type":"ContainerStarted","Data":"91b9fb574745b688c1ae4c8f1f9edb890dfe2aaefdcf1bf06a3488bb8e5e000f"} Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.775867 4771 generic.go:334] "Generic (PLEG): container finished" podID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerID="378b167719243a2cc38cf6f1e593889205115c2d2577ab71bc0cbe2b0201acac" exitCode=0 Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.775930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctvtj" 
event={"ID":"7b38fbe5-f497-4deb-aaff-6d95b89a1783","Type":"ContainerDied","Data":"378b167719243a2cc38cf6f1e593889205115c2d2577ab71bc0cbe2b0201acac"} Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.778575 4771 generic.go:334] "Generic (PLEG): container finished" podID="52f892c4-4660-4b08-8102-ff370236d676" containerID="796e47747a3f9d9af6a0576da790fafdc118fe9840d25624636b9d5f506caa05" exitCode=0 Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.778641 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkhxs" event={"ID":"52f892c4-4660-4b08-8102-ff370236d676","Type":"ContainerDied","Data":"796e47747a3f9d9af6a0576da790fafdc118fe9840d25624636b9d5f506caa05"} Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.782038 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8qv" event={"ID":"18293c18-339b-4740-9828-5b4d07850142","Type":"ContainerStarted","Data":"b0826642160b1327710fb042b86300f4cbb4f67f1cec1d32a569dce430d7b4c1"} Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.793867 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d7msc" podStartSLOduration=2.603315357 podStartE2EDuration="1m0.793852692s" podCreationTimestamp="2026-02-19 21:29:08 +0000 UTC" firstStartedPulling="2026-02-19 21:29:10.093044214 +0000 UTC m=+50.364486684" lastFinishedPulling="2026-02-19 21:30:08.283581549 +0000 UTC m=+108.555024019" observedRunningTime="2026-02-19 21:30:08.791260611 +0000 UTC m=+109.062703101" watchObservedRunningTime="2026-02-19 21:30:08.793852692 +0000 UTC m=+109.065295162" Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.830487 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hp8qv" podStartSLOduration=1.774748948 podStartE2EDuration="57.830465519s" podCreationTimestamp="2026-02-19 21:29:11 +0000 UTC" 
firstStartedPulling="2026-02-19 21:29:12.192561476 +0000 UTC m=+52.464003946" lastFinishedPulling="2026-02-19 21:30:08.248278047 +0000 UTC m=+108.519720517" observedRunningTime="2026-02-19 21:30:08.825913055 +0000 UTC m=+109.097355535" watchObservedRunningTime="2026-02-19 21:30:08.830465519 +0000 UTC m=+109.101907999" Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.847056 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k6bkf" podStartSLOduration=2.675017011 podStartE2EDuration="1m0.84701215s" podCreationTimestamp="2026-02-19 21:29:08 +0000 UTC" firstStartedPulling="2026-02-19 21:29:10.09543912 +0000 UTC m=+50.366881590" lastFinishedPulling="2026-02-19 21:30:08.267434259 +0000 UTC m=+108.538876729" observedRunningTime="2026-02-19 21:30:08.843627198 +0000 UTC m=+109.115069698" watchObservedRunningTime="2026-02-19 21:30:08.84701215 +0000 UTC m=+109.118454620" Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.860464 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:30:08 crc kubenswrapper[4771]: I0219 21:30:08.860500 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.007797 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9pf5" Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.076853 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.076898 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d7msc" Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.788945 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctvtj" event={"ID":"7b38fbe5-f497-4deb-aaff-6d95b89a1783","Type":"ContainerStarted","Data":"5a15357fe4a2c2d6153bf88b17e8dcd0ee422a52edc941dd0eaf55a94ad3f181"}
Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.791348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkhxs" event={"ID":"52f892c4-4660-4b08-8102-ff370236d676","Type":"ContainerStarted","Data":"fe7bdd1d995d51dad5949e2c9e7a2f7974d18ac6752c394aa8871538bd276ed9"}
Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.793788 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerID="b03f8eba856e7830d535368c92d8e67248696b8376b645e44a90b857b5a15004" exitCode=0
Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.793860 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n8n2" event={"ID":"f9d83ad4-726c-486d-a763-b765c53a6cc2","Type":"ContainerDied","Data":"b03f8eba856e7830d535368c92d8e67248696b8376b645e44a90b857b5a15004"}
Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.814011 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ctvtj" podStartSLOduration=2.749095779 podStartE2EDuration="1m1.813993627s" podCreationTimestamp="2026-02-19 21:29:08 +0000 UTC" firstStartedPulling="2026-02-19 21:29:10.096352905 +0000 UTC m=+50.367795375" lastFinishedPulling="2026-02-19 21:30:09.161250753 +0000 UTC m=+109.432693223" observedRunningTime="2026-02-19 21:30:09.812537258 +0000 UTC m=+110.083979738" watchObservedRunningTime="2026-02-19 21:30:09.813993627 +0000 UTC m=+110.085436097"
Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.869846 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bkhxs" podStartSLOduration=1.779437191 podStartE2EDuration="59.869819348s" podCreationTimestamp="2026-02-19 21:29:10 +0000 UTC" firstStartedPulling="2026-02-19 21:29:11.134381296 +0000 UTC m=+51.405823766" lastFinishedPulling="2026-02-19 21:30:09.224763453 +0000 UTC m=+109.496205923" observedRunningTime="2026-02-19 21:30:09.866380044 +0000 UTC m=+110.137822564" watchObservedRunningTime="2026-02-19 21:30:09.869819348 +0000 UTC m=+110.141261808"
Feb 19 21:30:09 crc kubenswrapper[4771]: I0219 21:30:09.910262 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h9pf5"
Feb 19 21:30:10 crc kubenswrapper[4771]: I0219 21:30:10.120126 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-d7msc" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="registry-server" probeResult="failure" output=<
Feb 19 21:30:10 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 21:30:10 crc kubenswrapper[4771]: >
Feb 19 21:30:10 crc kubenswrapper[4771]: I0219 21:30:10.676223 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bkhxs"
Feb 19 21:30:10 crc kubenswrapper[4771]: I0219 21:30:10.676619 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bkhxs"
Feb 19 21:30:10 crc kubenswrapper[4771]: I0219 21:30:10.802102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n8n2" event={"ID":"f9d83ad4-726c-486d-a763-b765c53a6cc2","Type":"ContainerStarted","Data":"324b4ff18fddf32e97dee3bfb2cb095b72407f8cefacc2eab394f72e49cd7493"}
Feb 19 21:30:10 crc kubenswrapper[4771]: I0219 21:30:10.827411 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6n8n2" podStartSLOduration=2.737163589 podStartE2EDuration="1m0.827392649s" podCreationTimestamp="2026-02-19 21:29:10 +0000 UTC" firstStartedPulling="2026-02-19 21:29:12.147089821 +0000 UTC m=+52.418532291" lastFinishedPulling="2026-02-19 21:30:10.237318881 +0000 UTC m=+110.508761351" observedRunningTime="2026-02-19 21:30:10.826453173 +0000 UTC m=+111.097895653" watchObservedRunningTime="2026-02-19 21:30:10.827392649 +0000 UTC m=+111.098835119"
Feb 19 21:30:11 crc kubenswrapper[4771]: I0219 21:30:11.048114 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6n8n2"
Feb 19 21:30:11 crc kubenswrapper[4771]: I0219 21:30:11.048185 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6n8n2"
Feb 19 21:30:11 crc kubenswrapper[4771]: I0219 21:30:11.651724 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:30:11 crc kubenswrapper[4771]: I0219 21:30:11.651947 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:30:11 crc kubenswrapper[4771]: I0219 21:30:11.712389 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bkhxs" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="registry-server" probeResult="failure" output=<
Feb 19 21:30:11 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 21:30:11 crc kubenswrapper[4771]: >
Feb 19 21:30:11 crc kubenswrapper[4771]: I0219 21:30:11.932882 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8n2h"]
Feb 19 21:30:12 crc kubenswrapper[4771]: I0219 21:30:12.044433 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:30:12 crc kubenswrapper[4771]: I0219 21:30:12.044479 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:30:12 crc kubenswrapper[4771]: I0219 21:30:12.079896 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:30:12 crc kubenswrapper[4771]: I0219 21:30:12.093156 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6n8n2" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="registry-server" probeResult="failure" output=<
Feb 19 21:30:12 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 21:30:12 crc kubenswrapper[4771]: >
Feb 19 21:30:12 crc kubenswrapper[4771]: I0219 21:30:12.689593 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hp8qv" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="registry-server" probeResult="failure" output=<
Feb 19 21:30:12 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 21:30:12 crc kubenswrapper[4771]: >
Feb 19 21:30:12 crc kubenswrapper[4771]: I0219 21:30:12.867641 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:30:13 crc kubenswrapper[4771]: I0219 21:30:13.958940 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9pf5"]
Feb 19 21:30:13 crc kubenswrapper[4771]: I0219 21:30:13.959222 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9pf5" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerName="registry-server" containerID="cri-o://99ef029be5c2b010234857f401350e225353aef24a8e9cd3a79b53fe644ee1cb" gracePeriod=2
Feb 19 21:30:14 crc kubenswrapper[4771]: I0219 21:30:14.828718 4771 generic.go:334] "Generic (PLEG): container finished" podID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerID="99ef029be5c2b010234857f401350e225353aef24a8e9cd3a79b53fe644ee1cb" exitCode=0
Feb 19 21:30:14 crc kubenswrapper[4771]: I0219 21:30:14.828789 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9pf5" event={"ID":"563611ad-e721-4b19-85a3-f5a37eb965d3","Type":"ContainerDied","Data":"99ef029be5c2b010234857f401350e225353aef24a8e9cd3a79b53fe644ee1cb"}
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.295524 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9pf5"
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.333641 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-utilities\") pod \"563611ad-e721-4b19-85a3-f5a37eb965d3\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") "
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.333729 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-catalog-content\") pod \"563611ad-e721-4b19-85a3-f5a37eb965d3\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") "
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.333806 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8sls\" (UniqueName: \"kubernetes.io/projected/563611ad-e721-4b19-85a3-f5a37eb965d3-kube-api-access-p8sls\") pod \"563611ad-e721-4b19-85a3-f5a37eb965d3\" (UID: \"563611ad-e721-4b19-85a3-f5a37eb965d3\") "
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.334903 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-utilities" (OuterVolumeSpecName: "utilities") pod "563611ad-e721-4b19-85a3-f5a37eb965d3" (UID: "563611ad-e721-4b19-85a3-f5a37eb965d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.339982 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/563611ad-e721-4b19-85a3-f5a37eb965d3-kube-api-access-p8sls" (OuterVolumeSpecName: "kube-api-access-p8sls") pod "563611ad-e721-4b19-85a3-f5a37eb965d3" (UID: "563611ad-e721-4b19-85a3-f5a37eb965d3"). InnerVolumeSpecName "kube-api-access-p8sls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.386699 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "563611ad-e721-4b19-85a3-f5a37eb965d3" (UID: "563611ad-e721-4b19-85a3-f5a37eb965d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.434681 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8sls\" (UniqueName: \"kubernetes.io/projected/563611ad-e721-4b19-85a3-f5a37eb965d3-kube-api-access-p8sls\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.434723 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.434735 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/563611ad-e721-4b19-85a3-f5a37eb965d3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.837476 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9pf5" event={"ID":"563611ad-e721-4b19-85a3-f5a37eb965d3","Type":"ContainerDied","Data":"0566e2ffaf3e76bef342e3f4e0515685bf415e4102882f56bd3f255cc64a2583"}
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.837530 4771 scope.go:117] "RemoveContainer" containerID="99ef029be5c2b010234857f401350e225353aef24a8e9cd3a79b53fe644ee1cb"
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.837554 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9pf5"
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.910156 4771 scope.go:117] "RemoveContainer" containerID="fbc91517ea4f64c91a53c6663a240f58e96ede7e91ef369b92836105d1c250d0"
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.916602 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9pf5"]
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.918411 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9pf5"]
Feb 19 21:30:15 crc kubenswrapper[4771]: I0219 21:30:15.956382 4771 scope.go:117] "RemoveContainer" containerID="393ddc06d07027a1dabd658f4c7fc978efbab3d835af865e8c919f3c4233b740"
Feb 19 21:30:16 crc kubenswrapper[4771]: I0219 21:30:16.163008 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8bw8"]
Feb 19 21:30:16 crc kubenswrapper[4771]: I0219 21:30:16.163450 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r8bw8" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerName="registry-server" containerID="cri-o://c5fbe00cd92b09375e6e66e7ff0ea5b8f96bbbb50cfb0e2ab3ca680ac0fc27e4" gracePeriod=2
Feb 19 21:30:16 crc kubenswrapper[4771]: I0219 21:30:16.444605 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" path="/var/lib/kubelet/pods/563611ad-e721-4b19-85a3-f5a37eb965d3/volumes"
Feb 19 21:30:16 crc kubenswrapper[4771]: I0219 21:30:16.848252 4771 generic.go:334] "Generic (PLEG): container finished" podID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerID="c5fbe00cd92b09375e6e66e7ff0ea5b8f96bbbb50cfb0e2ab3ca680ac0fc27e4" exitCode=0
Feb 19 21:30:16 crc kubenswrapper[4771]: I0219 21:30:16.848288 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bw8" event={"ID":"6f263313-b6ee-4f44-9fb5-1a4a92d162ee","Type":"ContainerDied","Data":"c5fbe00cd92b09375e6e66e7ff0ea5b8f96bbbb50cfb0e2ab3ca680ac0fc27e4"}
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.235976 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.259577 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f559\" (UniqueName: \"kubernetes.io/projected/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-kube-api-access-9f559\") pod \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") "
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.259654 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-utilities\") pod \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") "
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.259738 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-catalog-content\") pod \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\" (UID: \"6f263313-b6ee-4f44-9fb5-1a4a92d162ee\") "
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.261206 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-utilities" (OuterVolumeSpecName: "utilities") pod "6f263313-b6ee-4f44-9fb5-1a4a92d162ee" (UID: "6f263313-b6ee-4f44-9fb5-1a4a92d162ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.265315 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-kube-api-access-9f559" (OuterVolumeSpecName: "kube-api-access-9f559") pod "6f263313-b6ee-4f44-9fb5-1a4a92d162ee" (UID: "6f263313-b6ee-4f44-9fb5-1a4a92d162ee"). InnerVolumeSpecName "kube-api-access-9f559". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.362380 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f559\" (UniqueName: \"kubernetes.io/projected/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-kube-api-access-9f559\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.362561 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.384989 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f263313-b6ee-4f44-9fb5-1a4a92d162ee" (UID: "6f263313-b6ee-4f44-9fb5-1a4a92d162ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.465739 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f263313-b6ee-4f44-9fb5-1a4a92d162ee-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.855223 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8bw8" event={"ID":"6f263313-b6ee-4f44-9fb5-1a4a92d162ee","Type":"ContainerDied","Data":"f18ece5d37c56a69ed323ac9a7e04247e82403ed4fecff43d239e05a5e49ff68"}
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.855271 4771 scope.go:117] "RemoveContainer" containerID="c5fbe00cd92b09375e6e66e7ff0ea5b8f96bbbb50cfb0e2ab3ca680ac0fc27e4"
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.855302 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8bw8"
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.875723 4771 scope.go:117] "RemoveContainer" containerID="14dd9f452d3792e5e098b13e35a058896c1ae6e8184ee1b2a88cbdd6b825185b"
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.903142 4771 scope.go:117] "RemoveContainer" containerID="2b79731b1a768d66171eae18a58606cebd19ddf07ff026bf76cb096e1eb644ad"
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.903930 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8bw8"]
Feb 19 21:30:17 crc kubenswrapper[4771]: I0219 21:30:17.906339 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r8bw8"]
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.448973 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" path="/var/lib/kubelet/pods/6f263313-b6ee-4f44-9fb5-1a4a92d162ee/volumes"
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.675451 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k6bkf"
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.675975 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k6bkf"
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.746053 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k6bkf"
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.762621 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ctvtj"
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.762668 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ctvtj"
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.820149 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ctvtj"
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.931894 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ctvtj"
Feb 19 21:30:18 crc kubenswrapper[4771]: I0219 21:30:18.938096 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k6bkf"
Feb 19 21:30:19 crc kubenswrapper[4771]: I0219 21:30:19.126442 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d7msc"
Feb 19 21:30:19 crc kubenswrapper[4771]: I0219 21:30:19.169927 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d7msc"
Feb 19 21:30:20 crc kubenswrapper[4771]: I0219 21:30:20.725239 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bkhxs"
Feb 19 21:30:20 crc kubenswrapper[4771]: I0219 21:30:20.779147 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bkhxs"
Feb 19 21:30:21 crc kubenswrapper[4771]: I0219 21:30:21.114602 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6n8n2"
Feb 19 21:30:21 crc kubenswrapper[4771]: I0219 21:30:21.169887 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d7msc"]
Feb 19 21:30:21 crc kubenswrapper[4771]: I0219 21:30:21.170309 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d7msc" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="registry-server" containerID="cri-o://1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef" gracePeriod=2
Feb 19 21:30:21 crc kubenswrapper[4771]: I0219 21:30:21.200352 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6n8n2"
Feb 19 21:30:21 crc kubenswrapper[4771]: I0219 21:30:21.719975 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:30:21 crc kubenswrapper[4771]: I0219 21:30:21.789722 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hp8qv"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.590669 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d7msc"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.633429 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-catalog-content\") pod \"b2eb12a8-477f-462f-be65-0ee4bec101d3\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") "
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.633528 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-utilities\") pod \"b2eb12a8-477f-462f-be65-0ee4bec101d3\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") "
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.633587 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjn67\" (UniqueName: \"kubernetes.io/projected/b2eb12a8-477f-462f-be65-0ee4bec101d3-kube-api-access-jjn67\") pod \"b2eb12a8-477f-462f-be65-0ee4bec101d3\" (UID: \"b2eb12a8-477f-462f-be65-0ee4bec101d3\") "
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.636157 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-utilities" (OuterVolumeSpecName: "utilities") pod "b2eb12a8-477f-462f-be65-0ee4bec101d3" (UID: "b2eb12a8-477f-462f-be65-0ee4bec101d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.641635 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2eb12a8-477f-462f-be65-0ee4bec101d3-kube-api-access-jjn67" (OuterVolumeSpecName: "kube-api-access-jjn67") pod "b2eb12a8-477f-462f-be65-0ee4bec101d3" (UID: "b2eb12a8-477f-462f-be65-0ee4bec101d3"). InnerVolumeSpecName "kube-api-access-jjn67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.727513 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2eb12a8-477f-462f-be65-0ee4bec101d3" (UID: "b2eb12a8-477f-462f-be65-0ee4bec101d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.734976 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.735093 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2eb12a8-477f-462f-be65-0ee4bec101d3-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.735106 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjn67\" (UniqueName: \"kubernetes.io/projected/b2eb12a8-477f-462f-be65-0ee4bec101d3-kube-api-access-jjn67\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.897224 4771 generic.go:334] "Generic (PLEG): container finished" podID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerID="1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef" exitCode=0
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.897270 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7msc" event={"ID":"b2eb12a8-477f-462f-be65-0ee4bec101d3","Type":"ContainerDied","Data":"1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef"}
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.897300 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7msc" event={"ID":"b2eb12a8-477f-462f-be65-0ee4bec101d3","Type":"ContainerDied","Data":"8320ada589cc17308d11fffae8918bcfb1adb061853840db48732a3a387c6b72"}
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.897306 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d7msc"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.897316 4771 scope.go:117] "RemoveContainer" containerID="1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.932165 4771 scope.go:117] "RemoveContainer" containerID="339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.937424 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d7msc"]
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.944234 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d7msc"]
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.958326 4771 scope.go:117] "RemoveContainer" containerID="5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.995158 4771 scope.go:117] "RemoveContainer" containerID="1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef"
Feb 19 21:30:22 crc kubenswrapper[4771]: E0219 21:30:22.995677 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef\": container with ID starting with 1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef not found: ID does not exist" containerID="1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.995733 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef"} err="failed to get container status \"1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef\": rpc error: code = NotFound desc = could not find container \"1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef\": container with ID starting with 1cf24b88f858fa8b252126c4db37758ab0ac7bfab14b1337287a01f3fa742aef not found: ID does not exist"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.995791 4771 scope.go:117] "RemoveContainer" containerID="339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8"
Feb 19 21:30:22 crc kubenswrapper[4771]: E0219 21:30:22.996290 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8\": container with ID starting with 339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8 not found: ID does not exist" containerID="339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.996323 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8"} err="failed to get container status \"339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8\": rpc error: code = NotFound desc = could not find container \"339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8\": container with ID starting with 339a05d405f95e591322bba3036e522384bec0c885714933c1d69210d8c662f8 not found: ID does not exist"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.996351 4771 scope.go:117] "RemoveContainer" containerID="5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d"
Feb 19 21:30:22 crc kubenswrapper[4771]: E0219 21:30:22.996821 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d\": container with ID starting with 5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d not found: ID does not exist" containerID="5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d"
Feb 19 21:30:22 crc kubenswrapper[4771]: I0219 21:30:22.996851 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d"} err="failed to get container status \"5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d\": rpc error: code = NotFound desc = could not find container \"5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d\": container with ID starting with 5cd195b4ff8a8f6921aaafd599775cc649481f986ac2c64f09b4885f79235a6d not found: ID does not exist"
Feb 19 21:30:23 crc kubenswrapper[4771]: I0219 21:30:23.575078 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n8n2"]
Feb 19 21:30:23 crc kubenswrapper[4771]: I0219 21:30:23.575607 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6n8n2" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="registry-server" containerID="cri-o://324b4ff18fddf32e97dee3bfb2cb095b72407f8cefacc2eab394f72e49cd7493" gracePeriod=2
Feb 19 21:30:23 crc kubenswrapper[4771]: I0219 21:30:23.912853 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerID="324b4ff18fddf32e97dee3bfb2cb095b72407f8cefacc2eab394f72e49cd7493" exitCode=0
Feb 19 21:30:23 crc kubenswrapper[4771]: I0219 21:30:23.912959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n8n2" event={"ID":"f9d83ad4-726c-486d-a763-b765c53a6cc2","Type":"ContainerDied","Data":"324b4ff18fddf32e97dee3bfb2cb095b72407f8cefacc2eab394f72e49cd7493"}
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.058739 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n8n2"
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.154495 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-utilities\") pod \"f9d83ad4-726c-486d-a763-b765c53a6cc2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") "
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.154571 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktsgr\" (UniqueName: \"kubernetes.io/projected/f9d83ad4-726c-486d-a763-b765c53a6cc2-kube-api-access-ktsgr\") pod \"f9d83ad4-726c-486d-a763-b765c53a6cc2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") "
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.154746 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-catalog-content\") pod \"f9d83ad4-726c-486d-a763-b765c53a6cc2\" (UID: \"f9d83ad4-726c-486d-a763-b765c53a6cc2\") "
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.155442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-utilities" (OuterVolumeSpecName: "utilities") pod "f9d83ad4-726c-486d-a763-b765c53a6cc2" (UID: "f9d83ad4-726c-486d-a763-b765c53a6cc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.162948 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d83ad4-726c-486d-a763-b765c53a6cc2-kube-api-access-ktsgr" (OuterVolumeSpecName: "kube-api-access-ktsgr") pod "f9d83ad4-726c-486d-a763-b765c53a6cc2" (UID: "f9d83ad4-726c-486d-a763-b765c53a6cc2"). InnerVolumeSpecName "kube-api-access-ktsgr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.183693 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9d83ad4-726c-486d-a763-b765c53a6cc2" (UID: "f9d83ad4-726c-486d-a763-b765c53a6cc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.256454 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.256510 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9d83ad4-726c-486d-a763-b765c53a6cc2-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.256524 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktsgr\" (UniqueName: \"kubernetes.io/projected/f9d83ad4-726c-486d-a763-b765c53a6cc2-kube-api-access-ktsgr\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.443701 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" path="/var/lib/kubelet/pods/b2eb12a8-477f-462f-be65-0ee4bec101d3/volumes"
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.925110 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n8n2" event={"ID":"f9d83ad4-726c-486d-a763-b765c53a6cc2","Type":"ContainerDied","Data":"05237d2536a249e3343bf7432baafe4b555f0f6622522ab604b7d6e8de651938"}
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.925191 4771 scope.go:117] "RemoveContainer" containerID="324b4ff18fddf32e97dee3bfb2cb095b72407f8cefacc2eab394f72e49cd7493"
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.925200 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n8n2"
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.954775 4771 scope.go:117] "RemoveContainer" containerID="b03f8eba856e7830d535368c92d8e67248696b8376b645e44a90b857b5a15004"
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.954931 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n8n2"]
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.958787 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n8n2"]
Feb 19 21:30:24 crc kubenswrapper[4771]: I0219 21:30:24.984851 4771 scope.go:117] "RemoveContainer" containerID="d3b96ba6cbb67d54fa37b4ae349142b9357c87d21d36d830bb9d10dea3b3ef64"
Feb 19 21:30:26 crc kubenswrapper[4771]: I0219 21:30:26.449691 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" path="/var/lib/kubelet/pods/f9d83ad4-726c-486d-a763-b765c53a6cc2/volumes"
Feb 19 21:30:36 crc kubenswrapper[4771]: I0219 21:30:36.958175 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" podUID="2dd71911-b029-4e67-882c-eb6ad1983c00" containerName="oauth-openshift" containerID="cri-o://e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8" gracePeriod=15
Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.435972 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h"
Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540103 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-service-ca\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") "
Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540171 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-cliconfig\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") "
Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540311 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-session\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") "
Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540395 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-policies\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") "
Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540423 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-ocp-branding-template\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540453 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-provider-selection\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540494 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zsh8\" (UniqueName: \"kubernetes.io/projected/2dd71911-b029-4e67-882c-eb6ad1983c00-kube-api-access-2zsh8\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540516 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-idp-0-file-data\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540543 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-router-certs\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540570 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-error\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-serving-cert\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-trusted-ca-bundle\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540677 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-login\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.540710 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-dir\") pod \"2dd71911-b029-4e67-882c-eb6ad1983c00\" (UID: \"2dd71911-b029-4e67-882c-eb6ad1983c00\") " Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.541060 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-dir" (OuterVolumeSpecName: "audit-dir") pod 
"2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.541116 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.542061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.543923 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.544294 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.546832 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.547238 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.547465 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.547899 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd71911-b029-4e67-882c-eb6ad1983c00-kube-api-access-2zsh8" (OuterVolumeSpecName: "kube-api-access-2zsh8") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). 
InnerVolumeSpecName "kube-api-access-2zsh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.548133 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.549564 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.552525 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.553349 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.559735 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2dd71911-b029-4e67-882c-eb6ad1983c00" (UID: "2dd71911-b029-4e67-882c-eb6ad1983c00"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.641980 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642010 4771 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642022 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642057 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642067 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642075 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642085 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642095 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642104 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zsh8\" (UniqueName: \"kubernetes.io/projected/2dd71911-b029-4e67-882c-eb6ad1983c00-kube-api-access-2zsh8\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642114 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642123 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642134 4771 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642144 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:37 crc kubenswrapper[4771]: I0219 21:30:37.642152 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2dd71911-b029-4e67-882c-eb6ad1983c00-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.011368 4771 generic.go:334] "Generic (PLEG): container finished" podID="2dd71911-b029-4e67-882c-eb6ad1983c00" containerID="e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.011434 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" event={"ID":"2dd71911-b029-4e67-882c-eb6ad1983c00","Type":"ContainerDied","Data":"e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8"} Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.011464 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.011502 4771 scope.go:117] "RemoveContainer" containerID="e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8" Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.011485 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8n2h" event={"ID":"2dd71911-b029-4e67-882c-eb6ad1983c00","Type":"ContainerDied","Data":"00e6c9337a770c10f81a29b2f56c38016d7e891f50bb1d23ff2535e0391c5159"} Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.049208 4771 scope.go:117] "RemoveContainer" containerID="e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8" Feb 19 21:30:38 crc kubenswrapper[4771]: E0219 21:30:38.049763 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8\": container with ID starting with e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8 not found: ID does not exist" containerID="e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8" Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.049797 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8"} err="failed to get container status \"e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8\": rpc error: code = NotFound desc = could not find container \"e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8\": container with ID starting with e312cee88a2077204c5408fe51247de92d823a7a8b2e12ff154554734150a8d8 not found: ID does not exist" Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.066535 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-c8n2h"] Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.071799 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8n2h"] Feb 19 21:30:38 crc kubenswrapper[4771]: I0219 21:30:38.451684 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd71911-b029-4e67-882c-eb6ad1983c00" path="/var/lib/kubelet/pods/2dd71911-b029-4e67-882c-eb6ad1983c00/volumes" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.075707 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.075945 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerName="extract-content" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.075960 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerName="extract-content" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.075972 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="extract-content" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.075981 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="extract-content" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.075996 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="extract-utilities" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076004 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="extract-utilities" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076014 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076026 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076053 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076063 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076075 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerName="extract-utilities" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076086 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerName="extract-utilities" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076098 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8aebf05-d80b-4528-bc5e-4c7e158cc7e0" containerName="collect-profiles" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076106 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8aebf05-d80b-4528-bc5e-4c7e158cc7e0" containerName="collect-profiles" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076120 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076128 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076143 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerName="extract-utilities" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076151 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerName="extract-utilities" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076161 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="extract-content" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076169 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="extract-content" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076181 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076188 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076200 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd71911-b029-4e67-882c-eb6ad1983c00" containerName="oauth-openshift" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076207 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd71911-b029-4e67-882c-eb6ad1983c00" containerName="oauth-openshift" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076217 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerName="extract-content" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076224 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerName="extract-content" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.076237 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="extract-utilities" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076245 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="extract-utilities" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076351 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d83ad4-726c-486d-a763-b765c53a6cc2" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076367 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="563611ad-e721-4b19-85a3-f5a37eb965d3" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076380 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f263313-b6ee-4f44-9fb5-1a4a92d162ee" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076390 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8aebf05-d80b-4528-bc5e-4c7e158cc7e0" containerName="collect-profiles" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076400 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2eb12a8-477f-462f-be65-0ee4bec101d3" containerName="registry-server" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076409 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd71911-b029-4e67-882c-eb6ad1983c00" containerName="oauth-openshift" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076778 4771 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.076941 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.077066 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d" gracePeriod=15 Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.077203 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6" gracePeriod=15 Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.077227 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee" gracePeriod=15 Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.077277 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789" gracePeriod=15 Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.077212 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb" gracePeriod=15 Feb 19 21:30:39 crc 
kubenswrapper[4771]: I0219 21:30:39.079852 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.080114 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080128 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.080138 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080146 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.080158 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080165 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.080175 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080182 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.080198 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:30:39 crc 
kubenswrapper[4771]: I0219 21:30:39.080206 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.080219 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080226 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080330 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080340 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080356 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080521 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.080532 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.159234 4771 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.248:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.163504 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.163541 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.163593 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.163630 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.163752 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.163906 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.163969 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.164054 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265233 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265292 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265311 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265377 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265400 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265455 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265508 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265527 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc 
kubenswrapper[4771]: I0219 21:30:39.265545 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265562 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265582 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.265602 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: I0219 21:30:39.460086 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:39 crc kubenswrapper[4771]: W0219 21:30:39.490079 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a24d2181387eea4de14ca0e737d5fb6e629b1f5a223e69250d4c9feaa3ccc0a6 WatchSource:0}: Error finding container a24d2181387eea4de14ca0e737d5fb6e629b1f5a223e69250d4c9feaa3ccc0a6: Status 404 returned error can't find the container with id a24d2181387eea4de14ca0e737d5fb6e629b1f5a223e69250d4c9feaa3ccc0a6 Feb 19 21:30:39 crc kubenswrapper[4771]: E0219 21:30:39.493970 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.248:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895c330079725d9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:30:39.493350873 +0000 UTC m=+139.764793343,LastTimestamp:2026-02-19 21:30:39.493350873 +0000 UTC m=+139.764793343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.038937 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="5777718c-5a56-48a9-84bd-568b6a9d5346" containerID="ccdf3b1019487f3de57ed50ca997afaab5e92871f6a7bd54fb111c6cc7c22040" exitCode=0 Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.039138 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5777718c-5a56-48a9-84bd-568b6a9d5346","Type":"ContainerDied","Data":"ccdf3b1019487f3de57ed50ca997afaab5e92871f6a7bd54fb111c6cc7c22040"} Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.043302 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.043953 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.047890 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.049177 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789" exitCode=0 Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.049231 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb" exitCode=0 Feb 19 21:30:40 crc 
kubenswrapper[4771]: I0219 21:30:40.049246 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6" exitCode=0 Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.049260 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee" exitCode=2 Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.050973 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9aac4866cb87d6055c70ee01babb6bf5c0579d21de63596fca1694ef52ed30b9"} Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.051053 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a24d2181387eea4de14ca0e737d5fb6e629b1f5a223e69250d4c9feaa3ccc0a6"} Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.442590 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:40 crc kubenswrapper[4771]: I0219 21:30:40.443754 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.058421 4771 
status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:41 crc kubenswrapper[4771]: E0219 21:30:41.058440 4771 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.248:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.467615 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.468209 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.495037 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5777718c-5a56-48a9-84bd-568b6a9d5346-kube-api-access\") pod \"5777718c-5a56-48a9-84bd-568b6a9d5346\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.495429 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-var-lock\") pod \"5777718c-5a56-48a9-84bd-568b6a9d5346\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.495470 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-kubelet-dir\") pod \"5777718c-5a56-48a9-84bd-568b6a9d5346\" (UID: \"5777718c-5a56-48a9-84bd-568b6a9d5346\") " Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.495474 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-var-lock" (OuterVolumeSpecName: "var-lock") pod "5777718c-5a56-48a9-84bd-568b6a9d5346" (UID: "5777718c-5a56-48a9-84bd-568b6a9d5346"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.495663 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5777718c-5a56-48a9-84bd-568b6a9d5346" (UID: "5777718c-5a56-48a9-84bd-568b6a9d5346"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.495774 4771 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.495801 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5777718c-5a56-48a9-84bd-568b6a9d5346-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.502366 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5777718c-5a56-48a9-84bd-568b6a9d5346-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5777718c-5a56-48a9-84bd-568b6a9d5346" (UID: "5777718c-5a56-48a9-84bd-568b6a9d5346"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:41 crc kubenswrapper[4771]: I0219 21:30:41.596408 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5777718c-5a56-48a9-84bd-568b6a9d5346-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.070113 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.071173 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5777718c-5a56-48a9-84bd-568b6a9d5346","Type":"ContainerDied","Data":"e166c9bf88f47fb3752f0e05919527b611303651c9b02ff7d149fc0fa8189fd4"} Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.071220 4771 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e166c9bf88f47fb3752f0e05919527b611303651c9b02ff7d149fc0fa8189fd4" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.071262 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.071279 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.071958 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.072413 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.075931 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.077154 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d" exitCode=0 Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.077260 4771 scope.go:117] "RemoveContainer" containerID="83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.094436 4771 status_manager.go:851] 
"Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.094858 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.105061 4771 scope.go:117] "RemoveContainer" containerID="9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.127566 4771 scope.go:117] "RemoveContainer" containerID="b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.158440 4771 scope.go:117] "RemoveContainer" containerID="52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.191819 4771 scope.go:117] "RemoveContainer" containerID="944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205040 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205205 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205226 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205230 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205530 4771 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205546 4771 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.205587 4771 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.218816 4771 scope.go:117] "RemoveContainer" containerID="53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.246951 4771 scope.go:117] "RemoveContainer" containerID="83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789"
Feb 19 21:30:42 crc kubenswrapper[4771]: E0219 21:30:42.248139 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\": container with ID starting with 83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789 not found: ID does not exist" containerID="83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.248187 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789"} err="failed to get container status \"83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\": rpc error: code = NotFound desc = could not find container \"83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789\": container with ID starting with 83cad3d3b7434b5602be48a2bb8fad650c4b6209339c496f73c06d8fbf1cb789 not found: ID does not exist"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.248220 4771 scope.go:117] "RemoveContainer" containerID="9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb"
Feb 19 21:30:42 crc kubenswrapper[4771]: E0219 21:30:42.248754 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\": container with ID starting with 9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb not found: ID does not exist" containerID="9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.248797 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb"} err="failed to get container status \"9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\": rpc error: code = NotFound desc = could not find container \"9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb\": container with ID starting with 9686afe2a387ec1497413ad6e421914aa374855bc4b5df35ee5c64757a392ddb not found: ID does not exist"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.248835 4771 scope.go:117] "RemoveContainer" containerID="b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6"
Feb 19 21:30:42 crc kubenswrapper[4771]: E0219 21:30:42.249274 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\": container with ID starting with b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6 not found: ID does not exist" containerID="b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.249298 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6"} err="failed to get container status \"b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\": rpc error: code = NotFound desc = could not find container \"b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6\": container with ID starting with b799564c9a126ea9b2c560865a25dcccf955e2e114ea856b8ad266167163bbd6 not found: ID does not exist"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.249315 4771 scope.go:117] "RemoveContainer" containerID="52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee"
Feb 19 21:30:42 crc kubenswrapper[4771]: E0219 21:30:42.249797 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\": container with ID starting with 52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee not found: ID does not exist" containerID="52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.249818 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee"} err="failed to get container status \"52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\": rpc error: code = NotFound desc = could not find container \"52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee\": container with ID starting with 52eb7f6a45073875dd484c8a2ccbc0e7811f0b9a46d3d43b7c2309d7f72eb3ee not found: ID does not exist"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.249830 4771 scope.go:117] "RemoveContainer" containerID="944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d"
Feb 19 21:30:42 crc kubenswrapper[4771]: E0219 21:30:42.250197 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\": container with ID starting with 944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d not found: ID does not exist" containerID="944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.250243 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d"} err="failed to get container status \"944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\": rpc error: code = NotFound desc = could not find container \"944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d\": container with ID starting with 944625a9476b39fe326d4d9e2a389bdb8b55a44c15af689003df2db78af5420d not found: ID does not exist"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.250277 4771 scope.go:117] "RemoveContainer" containerID="53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50"
Feb 19 21:30:42 crc kubenswrapper[4771]: E0219 21:30:42.250661 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\": container with ID starting with 53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50 not found: ID does not exist" containerID="53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.250691 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50"} err="failed to get container status \"53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\": rpc error: code = NotFound desc = could not find container \"53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50\": container with ID starting with 53dfa9b4459665eead6359a8eefeda5112aa3128df6e497332617ee22fd70a50 not found: ID does not exist"
Feb 19 21:30:42 crc kubenswrapper[4771]: I0219 21:30:42.445007 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 19 21:30:43 crc kubenswrapper[4771]: I0219 21:30:43.090234 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:30:43 crc kubenswrapper[4771]: I0219 21:30:43.090956 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:43 crc kubenswrapper[4771]: I0219 21:30:43.091342 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:43 crc kubenswrapper[4771]: I0219 21:30:43.094919 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:43 crc kubenswrapper[4771]: I0219 21:30:43.095692 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:47 crc kubenswrapper[4771]: E0219 21:30:47.840574 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.248:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895c330079725d9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:30:39.493350873 +0000 UTC m=+139.764793343,LastTimestamp:2026-02-19 21:30:39.493350873 +0000 UTC m=+139.764793343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 19 21:30:48 crc kubenswrapper[4771]: E0219 21:30:48.790985 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:48 crc kubenswrapper[4771]: E0219 21:30:48.791990 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:48 crc kubenswrapper[4771]: E0219 21:30:48.792358 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:48 crc kubenswrapper[4771]: E0219 21:30:48.792711 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:48 crc kubenswrapper[4771]: E0219 21:30:48.793046 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:48 crc kubenswrapper[4771]: I0219 21:30:48.793104 4771 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 19 21:30:48 crc kubenswrapper[4771]: E0219 21:30:48.793461 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="200ms"
Feb 19 21:30:48 crc kubenswrapper[4771]: E0219 21:30:48.994078 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="400ms"
Feb 19 21:30:49 crc kubenswrapper[4771]: E0219 21:30:49.395083 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="800ms"
Feb 19 21:30:49 crc kubenswrapper[4771]: E0219 21:30:49.499081 4771 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.248:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" volumeName="registry-storage"
Feb 19 21:30:50 crc kubenswrapper[4771]: E0219 21:30:50.195867 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="1.6s"
Feb 19 21:30:50 crc kubenswrapper[4771]: I0219 21:30:50.445553 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:51 crc kubenswrapper[4771]: I0219 21:30:51.477784 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 19 21:30:51 crc kubenswrapper[4771]: I0219 21:30:51.477850 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 19 21:30:51 crc kubenswrapper[4771]: E0219 21:30:51.797621 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="3.2s"
Feb 19 21:30:52 crc kubenswrapper[4771]: I0219 21:30:52.160893 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 21:30:52 crc kubenswrapper[4771]: I0219 21:30:52.160956 4771 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3" exitCode=1
Feb 19 21:30:52 crc kubenswrapper[4771]: I0219 21:30:52.160996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3"}
Feb 19 21:30:52 crc kubenswrapper[4771]: I0219 21:30:52.161575 4771 scope.go:117] "RemoveContainer" containerID="42282a700951ef48633c22a44f0bfd6b4badd1dfca7254d99b50947b93f8dce3"
Feb 19 21:30:52 crc kubenswrapper[4771]: I0219 21:30:52.161898 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:52 crc kubenswrapper[4771]: I0219 21:30:52.162326 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:52 crc kubenswrapper[4771]: E0219 21:30:52.751972 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:30:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:30:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:30:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:30:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:52 crc kubenswrapper[4771]: E0219 21:30:52.752923 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:52 crc kubenswrapper[4771]: E0219 21:30:52.753597 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:52 crc kubenswrapper[4771]: E0219 21:30:52.754166 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:52 crc kubenswrapper[4771]: E0219 21:30:52.754554 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:52 crc kubenswrapper[4771]: E0219 21:30:52.754591 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 19 21:30:53 crc kubenswrapper[4771]: I0219 21:30:53.168827 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 21:30:53 crc kubenswrapper[4771]: I0219 21:30:53.168917 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac18fd7a66e5cfbfb7f611caab0dccc762a8c8662c843dfc3e65a495fc436388"}
Feb 19 21:30:53 crc kubenswrapper[4771]: I0219 21:30:53.169845 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:53 crc kubenswrapper[4771]: I0219 21:30:53.170568 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:54 crc kubenswrapper[4771]: I0219 21:30:54.436769 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:30:54 crc kubenswrapper[4771]: I0219 21:30:54.438361 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:54 crc kubenswrapper[4771]: I0219 21:30:54.438800 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:54 crc kubenswrapper[4771]: I0219 21:30:54.456498 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:30:54 crc kubenswrapper[4771]: I0219 21:30:54.456535 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:30:54 crc kubenswrapper[4771]: E0219 21:30:54.457054 4771 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:30:54 crc kubenswrapper[4771]: I0219 21:30:54.457751 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:30:54 crc kubenswrapper[4771]: W0219 21:30:54.486338 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f82dbf62599b00b364ec0001270005410cdd136b297cb0bfbb0a423129061412 WatchSource:0}: Error finding container f82dbf62599b00b364ec0001270005410cdd136b297cb0bfbb0a423129061412: Status 404 returned error can't find the container with id f82dbf62599b00b364ec0001270005410cdd136b297cb0bfbb0a423129061412
Feb 19 21:30:54 crc kubenswrapper[4771]: E0219 21:30:54.999514 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.248:6443: connect: connection refused" interval="6.4s"
Feb 19 21:30:55 crc kubenswrapper[4771]: I0219 21:30:55.184300 4771 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f3eb2de291728da59768d46bc9c8c146c9fcb31f45765f593a0887586f905105" exitCode=0
Feb 19 21:30:55 crc kubenswrapper[4771]: I0219 21:30:55.184370 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f3eb2de291728da59768d46bc9c8c146c9fcb31f45765f593a0887586f905105"}
Feb 19 21:30:55 crc kubenswrapper[4771]: I0219 21:30:55.184430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f82dbf62599b00b364ec0001270005410cdd136b297cb0bfbb0a423129061412"}
Feb 19 21:30:55 crc kubenswrapper[4771]: I0219 21:30:55.184971 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:30:55 crc kubenswrapper[4771]: I0219 21:30:55.185011 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:30:55 crc kubenswrapper[4771]: I0219 21:30:55.185898 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:55 crc kubenswrapper[4771]: I0219 21:30:55.186610 4771 status_manager.go:851] "Failed to get status for pod" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.248:6443: connect: connection refused"
Feb 19 21:30:55 crc kubenswrapper[4771]: E0219 21:30:55.186881 4771 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.248:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:30:56 crc kubenswrapper[4771]: I0219 21:30:56.199197 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8ad3282e1c6a2ede541b2e3b08b5930f8e9b543cd286f22a478ecef5d7c2f25d"}
Feb 19 21:30:56 crc kubenswrapper[4771]: I0219 21:30:56.200167 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dac7bbffb8d803e2f2a8fb71cff0fe24c252a7c9db8066488943efa636b0f4fd"}
Feb 19 21:30:56 crc kubenswrapper[4771]: I0219 21:30:56.200248 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b2077ecb2ec7bf83a75663f10fc1590b8949f26061a31b0abd6be33cf2632028"}
Feb 19 21:30:57 crc kubenswrapper[4771]: I0219 21:30:57.208821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d4917cf6d4dd3ab67bc9e8a3133964fe9e9a1b5fbeaf87448822a0a9632a430a"}
Feb 19 21:30:57 crc kubenswrapper[4771]: I0219 21:30:57.209098 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:30:57 crc kubenswrapper[4771]: I0219 21:30:57.209109 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b150a61e1f3b83030a37234a6a7703c536835232885af5a0434cd2e67c6ed1b9"}
Feb 19 21:30:57 crc kubenswrapper[4771]: I0219 21:30:57.209191 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:30:57 crc kubenswrapper[4771]: I0219 21:30:57.209223 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:30:59 crc kubenswrapper[4771]: I0219 21:30:59.458221 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:30:59 crc kubenswrapper[4771]: I0219 21:30:59.458546 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:30:59 crc kubenswrapper[4771]: I0219 21:30:59.465712 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:31:01 crc kubenswrapper[4771]: I0219 21:31:01.477741 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:31:02 crc kubenswrapper[4771]: I0219 21:31:02.157794 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:31:02 crc kubenswrapper[4771]: I0219 21:31:02.163826 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:31:02 crc kubenswrapper[4771]: I0219 21:31:02.219475 4771 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:31:02 crc kubenswrapper[4771]: I0219 21:31:02.241153 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:31:02 crc kubenswrapper[4771]: I0219 21:31:02.241419 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:31:02 crc kubenswrapper[4771]: I0219 21:31:02.244640 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 21:31:02 crc kubenswrapper[4771]: I0219 21:31:02.247416 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:31:02 crc kubenswrapper[4771]: I0219 21:31:02.307209 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="16abfade-a826-4efe-8434-a421ae7665ab"
Feb 19 21:31:03 crc kubenswrapper[4771]: I0219 21:31:03.247581 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:31:03 crc kubenswrapper[4771]: I0219 21:31:03.249097 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="95ed4c7c-f32c-4a55-b662-93e27798acc7"
Feb 19 21:31:03 crc kubenswrapper[4771]: I0219 21:31:03.251745 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="16abfade-a826-4efe-8434-a421ae7665ab"
Feb 19 21:31:11 crc kubenswrapper[4771]: I0219 21:31:11.921779 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 19 21:31:12 crc kubenswrapper[4771]: I0219 21:31:12.527992 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 19 21:31:12 crc kubenswrapper[4771]: I0219 21:31:12.671552 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 19 21:31:12 crc kubenswrapper[4771]: I0219 21:31:12.800057 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 19 21:31:12 crc kubenswrapper[4771]: I0219 21:31:12.842051 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 21:31:12 crc kubenswrapper[4771]: I0219 21:31:12.923811 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 19 21:31:12 crc kubenswrapper[4771]: I0219 21:31:12.956695 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:31:12 crc kubenswrapper[4771]: I0219 21:31:12.956956 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:31:13 crc kubenswrapper[4771]: I0219 21:31:13.126515 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 19 21:31:13 crc kubenswrapper[4771]: I0219 21:31:13.258600 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 19 21:31:13 crc kubenswrapper[4771]: I0219 21:31:13.294100 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 19 21:31:13 crc kubenswrapper[4771]: I0219 21:31:13.602402 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 19 21:31:13 crc kubenswrapper[4771]: I0219 21:31:13.703568 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 19 21:31:13 crc kubenswrapper[4771]: I0219 21:31:13.809292 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 19 21:31:14 crc kubenswrapper[4771]: I0219 21:31:14.408249 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 19 21:31:14 crc kubenswrapper[4771]: I0219 21:31:14.500435 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 19 21:31:14 crc kubenswrapper[4771]: I0219 21:31:14.856085 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 19 21:31:14 crc kubenswrapper[4771]: I0219 21:31:14.870743 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.009945 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.012450 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.115911 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.166733 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.267361 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.294602 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.449364 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.581411 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.683540 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.755447 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.798590 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.842322 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.944387 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 19 21:31:15 crc kubenswrapper[4771]: I0219 21:31:15.989264 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.009471 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.051374 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.061241 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.089520 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.240352 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.256972 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.258308 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.387118 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.459688 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.525754 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.535887 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 19 
21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.542078 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.569956 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.624066 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.657500 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.662458 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.676129 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.713134 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.784126 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.819091 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.864809 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 21:31:16 crc kubenswrapper[4771]: I0219 21:31:16.897627 4771 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.026010 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.132300 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.286990 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.355425 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.484792 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.530475 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.570998 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.716295 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 21:31:17 crc kubenswrapper[4771]: I0219 21:31:17.826819 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 21:31:18 crc kubenswrapper[4771]: I0219 21:31:18.286722 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 21:31:18 crc kubenswrapper[4771]: 
I0219 21:31:18.346727 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 21:31:18 crc kubenswrapper[4771]: I0219 21:31:18.401183 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 21:31:18 crc kubenswrapper[4771]: I0219 21:31:18.539151 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 21:31:18 crc kubenswrapper[4771]: I0219 21:31:18.551822 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 21:31:18 crc kubenswrapper[4771]: I0219 21:31:18.552283 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 21:31:18 crc kubenswrapper[4771]: I0219 21:31:18.694328 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 21:31:18 crc kubenswrapper[4771]: I0219 21:31:18.732258 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:31:18 crc kubenswrapper[4771]: I0219 21:31:18.791442 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.011832 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.054283 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.085173 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.086316 
4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.133994 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.216440 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.233780 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.260317 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.311281 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.345830 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.362134 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.382979 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.434248 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.444606 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 21:31:19 crc 
kubenswrapper[4771]: I0219 21:31:19.516379 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.546758 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.578641 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.599078 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.764345 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.798237 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.813579 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.825932 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.834462 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.852148 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.889180 4771 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-multus"/"cni-copy-resources" Feb 19 21:31:19 crc kubenswrapper[4771]: I0219 21:31:19.935840 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.009944 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.018141 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.033850 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.044126 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.156472 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.165383 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.226743 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.249010 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.266976 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.287242 4771 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.400298 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.405560 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.468406 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.510503 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.627642 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.646912 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.675382 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.682498 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.723466 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.854236 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.860833 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.873144 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.945210 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 21:31:20 crc kubenswrapper[4771]: I0219 21:31:20.970761 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.041727 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.076730 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.223889 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.287186 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.318445 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.386675 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.490393 4771 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.498657 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.513849 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.527102 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.542326 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.548104 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.559449 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.589223 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.652931 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.741509 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.742487 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 21:31:21 crc kubenswrapper[4771]: 
I0219 21:31:21.761374 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.802784 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.812628 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.857253 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.927061 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 21:31:21 crc kubenswrapper[4771]: I0219 21:31:21.943841 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.072140 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.109088 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.141253 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.207793 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.223192 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.263388 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.310805 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.335059 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.528002 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.562739 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.563434 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.590478 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.660740 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 21:31:22 crc kubenswrapper[4771]: I0219 21:31:22.764470 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.093269 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.120907 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.143316 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.222267 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.253397 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.275747 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.298504 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.300361 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.311356 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.335973 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.437681 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 21:31:23 crc 
kubenswrapper[4771]: I0219 21:31:23.466461 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.474324 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.518522 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.659930 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.678000 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.689897 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.770471 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.803609 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.804273 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.836589 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.900744 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.910782 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.959338 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 21:31:23 crc kubenswrapper[4771]: I0219 21:31:23.971639 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.035128 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.052620 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.083333 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.093564 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.096933 4771 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.105471 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.152881 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 
21:31:24.263255 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.291851 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.292646 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.293471 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.392203 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.479160 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.543534 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.613509 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.613964 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.616170 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.689602 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.843407 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.926899 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 21:31:24 crc kubenswrapper[4771]: I0219 21:31:24.935680 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.088873 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.111027 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.143881 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.174044 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.233918 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.438153 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.454375 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.456604 4771 
reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.532344 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.547735 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 21:31:25 crc kubenswrapper[4771]: I0219 21:31:25.634480 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.010655 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.184458 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.248574 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.277760 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.327126 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.404004 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.590426 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 
21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.633577 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.696522 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.862078 4771 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.967796 4771 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.978723 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.978794 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h","openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:31:26 crc kubenswrapper[4771]: E0219 21:31:26.979094 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" containerName="installer" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.979124 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" containerName="installer" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.979403 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5777718c-5a56-48a9-84bd-568b6a9d5346" containerName="installer" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.980067 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.986982 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.990242 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.991298 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.991605 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.992006 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.998376 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 21:31:26 crc kubenswrapper[4771]: I0219 21:31:26.998653 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.001264 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.001467 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.002605 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 21:31:27 crc 
kubenswrapper[4771]: I0219 21:31:27.003598 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.005319 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.006100 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010338 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010387 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010412 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010438 
4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5rch\" (UniqueName: \"kubernetes.io/projected/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-kube-api-access-v5rch\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010607 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010668 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010731 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-dir\") pod 
\"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010795 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010825 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " 
pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010930 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.010962 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.025414 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.025822 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.030561 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.035075 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.090548 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.090533447 
podStartE2EDuration="25.090533447s" podCreationTimestamp="2026-02-19 21:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:31:27.087083023 +0000 UTC m=+187.358525493" watchObservedRunningTime="2026-02-19 21:31:27.090533447 +0000 UTC m=+187.361975917" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111509 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5rch\" (UniqueName: \"kubernetes.io/projected/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-kube-api-access-v5rch\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111549 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111585 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111605 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: 
\"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111654 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111692 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111712 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111733 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111751 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111766 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111786 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" 
(UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.111799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.113684 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.114321 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.115159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.115212 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.126778 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.127931 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.128076 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.128447 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " 
pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.128468 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.132189 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.132317 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.135792 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.138126 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.155761 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5rch\" (UniqueName: \"kubernetes.io/projected/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-kube-api-access-v5rch\") pod \"oauth-openshift-5d4f55d7c5-hqx2h\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.303268 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.427550 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.571190 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h"] Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.615814 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.879371 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 21:31:27 crc kubenswrapper[4771]: I0219 21:31:27.967547 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 21:31:28 crc kubenswrapper[4771]: I0219 21:31:28.426143 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" 
event={"ID":"bd6cd164-e4e0-4e2a-b08e-271e84a62b52","Type":"ContainerStarted","Data":"3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0"} Feb 19 21:31:28 crc kubenswrapper[4771]: I0219 21:31:28.426207 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" event={"ID":"bd6cd164-e4e0-4e2a-b08e-271e84a62b52","Type":"ContainerStarted","Data":"be24286a0061deb29b694e4b16fee4442654b0b2f6eed09aa42a13bcaabeecdc"} Feb 19 21:31:28 crc kubenswrapper[4771]: I0219 21:31:28.426607 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:28 crc kubenswrapper[4771]: I0219 21:31:28.449647 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:31:28 crc kubenswrapper[4771]: I0219 21:31:28.461227 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" podStartSLOduration=77.461208533 podStartE2EDuration="1m17.461208533s" podCreationTimestamp="2026-02-19 21:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:31:28.458660774 +0000 UTC m=+188.730103314" watchObservedRunningTime="2026-02-19 21:31:28.461208533 +0000 UTC m=+188.732651013" Feb 19 21:31:29 crc kubenswrapper[4771]: I0219 21:31:29.346649 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 21:31:36 crc kubenswrapper[4771]: I0219 21:31:36.067558 4771 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 21:31:36 crc kubenswrapper[4771]: I0219 21:31:36.068581 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9aac4866cb87d6055c70ee01babb6bf5c0579d21de63596fca1694ef52ed30b9" gracePeriod=5 Feb 19 21:31:40 crc kubenswrapper[4771]: I0219 21:31:40.550519 4771 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.510164 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.510513 4771 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9aac4866cb87d6055c70ee01babb6bf5c0579d21de63596fca1694ef52ed30b9" exitCode=137 Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.654634 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.654736 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.722886 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.722924 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.722946 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.722980 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.723011 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.723198 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: 
"manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.723598 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.723611 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.723627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.736413 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.824777 4771 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.824831 4771 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.824849 4771 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.824867 4771 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:41 crc kubenswrapper[4771]: I0219 21:31:41.824883 4771 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:42 crc kubenswrapper[4771]: I0219 21:31:42.447239 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 19 21:31:42 crc kubenswrapper[4771]: I0219 21:31:42.517450 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 21:31:42 crc kubenswrapper[4771]: I0219 21:31:42.517573 4771 scope.go:117] "RemoveContainer" 
containerID="9aac4866cb87d6055c70ee01babb6bf5c0579d21de63596fca1694ef52ed30b9" Feb 19 21:31:42 crc kubenswrapper[4771]: I0219 21:31:42.517605 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:31:42 crc kubenswrapper[4771]: I0219 21:31:42.956790 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:31:42 crc kubenswrapper[4771]: I0219 21:31:42.956860 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:31:47 crc kubenswrapper[4771]: I0219 21:31:47.678552 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 21:31:49 crc kubenswrapper[4771]: I0219 21:31:49.618743 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 21:31:51 crc kubenswrapper[4771]: I0219 21:31:51.155203 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 21:31:59 crc kubenswrapper[4771]: I0219 21:31:59.495951 4771 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 21:32:00 crc kubenswrapper[4771]: I0219 21:32:00.336812 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.053485 
4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xkkw5"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.054234 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" podUID="e971e93e-3654-4871-8809-acf956852f8a" containerName="controller-manager" containerID="cri-o://a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429" gracePeriod=30 Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.147695 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.147906 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" podUID="e7550f22-e38d-40af-a730-6e5f8ef3fef3" containerName="route-controller-manager" containerID="cri-o://f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a" gracePeriod=30 Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.515757 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.539277 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-proxy-ca-bundles\") pod \"e971e93e-3654-4871-8809-acf956852f8a\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.539625 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-config\") pod \"e971e93e-3654-4871-8809-acf956852f8a\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.539647 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e971e93e-3654-4871-8809-acf956852f8a-serving-cert\") pod \"e971e93e-3654-4871-8809-acf956852f8a\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.539703 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-client-ca\") pod \"e971e93e-3654-4871-8809-acf956852f8a\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.539748 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b552v\" (UniqueName: \"kubernetes.io/projected/e971e93e-3654-4871-8809-acf956852f8a-kube-api-access-b552v\") pod \"e971e93e-3654-4871-8809-acf956852f8a\" (UID: \"e971e93e-3654-4871-8809-acf956852f8a\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.540371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e971e93e-3654-4871-8809-acf956852f8a" (UID: "e971e93e-3654-4871-8809-acf956852f8a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.540478 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-config" (OuterVolumeSpecName: "config") pod "e971e93e-3654-4871-8809-acf956852f8a" (UID: "e971e93e-3654-4871-8809-acf956852f8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.540730 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-client-ca" (OuterVolumeSpecName: "client-ca") pod "e971e93e-3654-4871-8809-acf956852f8a" (UID: "e971e93e-3654-4871-8809-acf956852f8a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.546675 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.548183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e971e93e-3654-4871-8809-acf956852f8a-kube-api-access-b552v" (OuterVolumeSpecName: "kube-api-access-b552v") pod "e971e93e-3654-4871-8809-acf956852f8a" (UID: "e971e93e-3654-4871-8809-acf956852f8a"). InnerVolumeSpecName "kube-api-access-b552v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.550440 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e971e93e-3654-4871-8809-acf956852f8a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e971e93e-3654-4871-8809-acf956852f8a" (UID: "e971e93e-3654-4871-8809-acf956852f8a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.631834 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-wzn9l"] Feb 19 21:32:03 crc kubenswrapper[4771]: E0219 21:32:03.632239 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.632304 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 21:32:03 crc kubenswrapper[4771]: E0219 21:32:03.632379 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7550f22-e38d-40af-a730-6e5f8ef3fef3" containerName="route-controller-manager" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.632914 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7550f22-e38d-40af-a730-6e5f8ef3fef3" containerName="route-controller-manager" Feb 19 21:32:03 crc kubenswrapper[4771]: E0219 21:32:03.632970 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e971e93e-3654-4871-8809-acf956852f8a" containerName="controller-manager" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.634885 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e971e93e-3654-4871-8809-acf956852f8a" containerName="controller-manager" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.635117 4771 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e7550f22-e38d-40af-a730-6e5f8ef3fef3" containerName="route-controller-manager" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.635187 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e971e93e-3654-4871-8809-acf956852f8a" containerName="controller-manager" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.635244 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.635669 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.639911 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-wzn9l"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.642190 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slmm5\" (UniqueName: \"kubernetes.io/projected/e7550f22-e38d-40af-a730-6e5f8ef3fef3-kube-api-access-slmm5\") pod \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.642364 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-config\") pod \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.642394 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7550f22-e38d-40af-a730-6e5f8ef3fef3-serving-cert\") pod \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 
21:32:03.642425 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-client-ca\") pod \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\" (UID: \"e7550f22-e38d-40af-a730-6e5f8ef3fef3\") " Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.642946 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e971e93e-3654-4871-8809-acf956852f8a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.642963 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.642972 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b552v\" (UniqueName: \"kubernetes.io/projected/e971e93e-3654-4871-8809-acf956852f8a-kube-api-access-b552v\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.642988 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.642996 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e971e93e-3654-4871-8809-acf956852f8a-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.643646 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-client-ca" (OuterVolumeSpecName: "client-ca") pod "e7550f22-e38d-40af-a730-6e5f8ef3fef3" (UID: "e7550f22-e38d-40af-a730-6e5f8ef3fef3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.644209 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-config" (OuterVolumeSpecName: "config") pod "e7550f22-e38d-40af-a730-6e5f8ef3fef3" (UID: "e7550f22-e38d-40af-a730-6e5f8ef3fef3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.647986 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7550f22-e38d-40af-a730-6e5f8ef3fef3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7550f22-e38d-40af-a730-6e5f8ef3fef3" (UID: "e7550f22-e38d-40af-a730-6e5f8ef3fef3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.648761 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7550f22-e38d-40af-a730-6e5f8ef3fef3-kube-api-access-slmm5" (OuterVolumeSpecName: "kube-api-access-slmm5") pod "e7550f22-e38d-40af-a730-6e5f8ef3fef3" (UID: "e7550f22-e38d-40af-a730-6e5f8ef3fef3"). InnerVolumeSpecName "kube-api-access-slmm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.677567 4771 generic.go:334] "Generic (PLEG): container finished" podID="e7550f22-e38d-40af-a730-6e5f8ef3fef3" containerID="f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.677675 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.677911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" event={"ID":"e7550f22-e38d-40af-a730-6e5f8ef3fef3","Type":"ContainerDied","Data":"f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a"} Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.677958 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh" event={"ID":"e7550f22-e38d-40af-a730-6e5f8ef3fef3","Type":"ContainerDied","Data":"8953588a282e4e22b8741a32b7dd43b17cd0e00dfa86cfd911f548c51f669fba"} Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.677974 4771 scope.go:117] "RemoveContainer" containerID="f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.690390 4771 generic.go:334] "Generic (PLEG): container finished" podID="e971e93e-3654-4871-8809-acf956852f8a" containerID="a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.690470 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" event={"ID":"e971e93e-3654-4871-8809-acf956852f8a","Type":"ContainerDied","Data":"a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429"} Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.690482 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.690506 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xkkw5" event={"ID":"e971e93e-3654-4871-8809-acf956852f8a","Type":"ContainerDied","Data":"72544deba9571bcb93e9972457786c4f3653b14e677de6ffb47657c935c16a01"} Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.707647 4771 scope.go:117] "RemoveContainer" containerID="f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a" Feb 19 21:32:03 crc kubenswrapper[4771]: E0219 21:32:03.708054 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a\": container with ID starting with f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a not found: ID does not exist" containerID="f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.708105 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a"} err="failed to get container status \"f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a\": rpc error: code = NotFound desc = could not find container \"f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a\": container with ID starting with f93e043fea5959cbdf7c834a9d2a93026f0f5bf270834e4520f3d2c79d88602a not found: ID does not exist" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.708125 4771 scope.go:117] "RemoveContainer" containerID="a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.716126 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.721212 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sl9xh"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.725133 4771 scope.go:117] "RemoveContainer" containerID="a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429" Feb 19 21:32:03 crc kubenswrapper[4771]: E0219 21:32:03.725508 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429\": container with ID starting with a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429 not found: ID does not exist" containerID="a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.725534 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429"} err="failed to get container status \"a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429\": rpc error: code = NotFound desc = could not find container \"a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429\": container with ID starting with a429280c3c5650a82cd625d61686393860f99de6542c62ed311e5f331e24a429 not found: ID does not exist" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.729702 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.730624 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.734090 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.734288 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.734135 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.734455 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xkkw5"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.734528 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.734662 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.734844 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.738127 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xkkw5"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.740641 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75"] Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.743908 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-client-ca\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.743944 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5tt9\" (UniqueName: \"kubernetes.io/projected/0446aa30-49cb-4936-ac35-ff2ea5f5f227-kube-api-access-n5tt9\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.743978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0446aa30-49cb-4936-ac35-ff2ea5f5f227-serving-cert\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.744001 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-proxy-ca-bundles\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.744045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-config\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " 
pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.744076 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.744085 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7550f22-e38d-40af-a730-6e5f8ef3fef3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.744095 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7550f22-e38d-40af-a730-6e5f8ef3fef3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.744103 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slmm5\" (UniqueName: \"kubernetes.io/projected/e7550f22-e38d-40af-a730-6e5f8ef3fef3-kube-api-access-slmm5\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.844786 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-proxy-ca-bundles\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.844848 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-config\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 
21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.844871 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-config\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.844906 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-client-ca\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.844928 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5tt9\" (UniqueName: \"kubernetes.io/projected/0446aa30-49cb-4936-ac35-ff2ea5f5f227-kube-api-access-n5tt9\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.844946 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-serving-cert\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.844968 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-client-ca\") pod 
\"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.844983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6znh9\" (UniqueName: \"kubernetes.io/projected/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-kube-api-access-6znh9\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.845007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0446aa30-49cb-4936-ac35-ff2ea5f5f227-serving-cert\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.847203 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-client-ca\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.847741 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-config\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.847923 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-proxy-ca-bundles\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.849698 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0446aa30-49cb-4936-ac35-ff2ea5f5f227-serving-cert\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.865525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5tt9\" (UniqueName: \"kubernetes.io/projected/0446aa30-49cb-4936-ac35-ff2ea5f5f227-kube-api-access-n5tt9\") pod \"controller-manager-86855dfb7-wzn9l\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.946274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-serving-cert\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.946358 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-client-ca\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc 
kubenswrapper[4771]: I0219 21:32:03.946393 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6znh9\" (UniqueName: \"kubernetes.io/projected/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-kube-api-access-6znh9\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.946486 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-config\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.948557 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-client-ca\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.948588 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-config\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.953815 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-serving-cert\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: 
\"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.961154 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:03 crc kubenswrapper[4771]: I0219 21:32:03.968977 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6znh9\" (UniqueName: \"kubernetes.io/projected/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-kube-api-access-6znh9\") pod \"route-controller-manager-7b59895879-j8p75\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.047962 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.216924 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-wzn9l"] Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.272784 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-wzn9l"] Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.323667 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75"] Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.359657 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75"] Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.443414 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7550f22-e38d-40af-a730-6e5f8ef3fef3" 
path="/var/lib/kubelet/pods/e7550f22-e38d-40af-a730-6e5f8ef3fef3/volumes" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.444248 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e971e93e-3654-4871-8809-acf956852f8a" path="/var/lib/kubelet/pods/e971e93e-3654-4871-8809-acf956852f8a/volumes" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.697492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" event={"ID":"0446aa30-49cb-4936-ac35-ff2ea5f5f227","Type":"ContainerStarted","Data":"39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b"} Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.698085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" event={"ID":"0446aa30-49cb-4936-ac35-ff2ea5f5f227","Type":"ContainerStarted","Data":"b2b6700fb81c394c922186f2da11adfec882887d3671147f9fad7a1e6ef48fa6"} Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.698410 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.697565 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" podUID="0446aa30-49cb-4936-ac35-ff2ea5f5f227" containerName="controller-manager" containerID="cri-o://39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b" gracePeriod=30 Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.701343 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" event={"ID":"96b33c65-0cc3-4fea-a7b5-fed6b4a77313","Type":"ContainerStarted","Data":"6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427"} Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 
21:32:04.702069 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.702264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" event={"ID":"96b33c65-0cc3-4fea-a7b5-fed6b4a77313","Type":"ContainerStarted","Data":"8f75fd82bc33ad2f3067920c793761b2ebbade61f84ace4d2a401b386cfa7ce1"} Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.701622 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" podUID="96b33c65-0cc3-4fea-a7b5-fed6b4a77313" containerName="route-controller-manager" containerID="cri-o://6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427" gracePeriod=30 Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.703833 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.717893 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" podStartSLOduration=1.717873886 podStartE2EDuration="1.717873886s" podCreationTimestamp="2026-02-19 21:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:32:04.713407839 +0000 UTC m=+224.984850319" watchObservedRunningTime="2026-02-19 21:32:04.717873886 +0000 UTC m=+224.989316366" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.739985 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" podStartSLOduration=1.739969823 
podStartE2EDuration="1.739969823s" podCreationTimestamp="2026-02-19 21:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:32:04.735888686 +0000 UTC m=+225.007331156" watchObservedRunningTime="2026-02-19 21:32:04.739969823 +0000 UTC m=+225.011412293" Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.836308 4771 patch_prober.go:28] interesting pod/route-controller-manager-7b59895879-j8p75 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:56374->10.217.0.60:8443: read: connection reset by peer" start-of-body= Feb 19 21:32:04 crc kubenswrapper[4771]: I0219 21:32:04.836362 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" podUID="96b33c65-0cc3-4fea-a7b5-fed6b4a77313" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:56374->10.217.0.60:8443: read: connection reset by peer" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.003436 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.070206 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-client-ca\") pod \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.070340 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-proxy-ca-bundles\") pod \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.070405 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-config\") pod \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.070435 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5tt9\" (UniqueName: \"kubernetes.io/projected/0446aa30-49cb-4936-ac35-ff2ea5f5f227-kube-api-access-n5tt9\") pod \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.070496 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0446aa30-49cb-4936-ac35-ff2ea5f5f227-serving-cert\") pod \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\" (UID: \"0446aa30-49cb-4936-ac35-ff2ea5f5f227\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.071157 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-client-ca" (OuterVolumeSpecName: "client-ca") pod "0446aa30-49cb-4936-ac35-ff2ea5f5f227" (UID: "0446aa30-49cb-4936-ac35-ff2ea5f5f227"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.071178 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0446aa30-49cb-4936-ac35-ff2ea5f5f227" (UID: "0446aa30-49cb-4936-ac35-ff2ea5f5f227"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.071269 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-config" (OuterVolumeSpecName: "config") pod "0446aa30-49cb-4936-ac35-ff2ea5f5f227" (UID: "0446aa30-49cb-4936-ac35-ff2ea5f5f227"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.076597 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0446aa30-49cb-4936-ac35-ff2ea5f5f227-kube-api-access-n5tt9" (OuterVolumeSpecName: "kube-api-access-n5tt9") pod "0446aa30-49cb-4936-ac35-ff2ea5f5f227" (UID: "0446aa30-49cb-4936-ac35-ff2ea5f5f227"). InnerVolumeSpecName "kube-api-access-n5tt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.078080 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0446aa30-49cb-4936-ac35-ff2ea5f5f227-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0446aa30-49cb-4936-ac35-ff2ea5f5f227" (UID: "0446aa30-49cb-4936-ac35-ff2ea5f5f227"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.138928 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7b59895879-j8p75_96b33c65-0cc3-4fea-a7b5-fed6b4a77313/route-controller-manager/0.log" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.139215 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.171526 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-config\") pod \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.171611 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6znh9\" (UniqueName: \"kubernetes.io/projected/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-kube-api-access-6znh9\") pod \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.171653 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-serving-cert\") pod \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.171708 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-client-ca\") pod \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\" (UID: \"96b33c65-0cc3-4fea-a7b5-fed6b4a77313\") " Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 
21:32:05.172121 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.172145 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5tt9\" (UniqueName: \"kubernetes.io/projected/0446aa30-49cb-4936-ac35-ff2ea5f5f227-kube-api-access-n5tt9\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.172166 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0446aa30-49cb-4936-ac35-ff2ea5f5f227-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.172187 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.172204 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0446aa30-49cb-4936-ac35-ff2ea5f5f227-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.172777 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-config" (OuterVolumeSpecName: "config") pod "96b33c65-0cc3-4fea-a7b5-fed6b4a77313" (UID: "96b33c65-0cc3-4fea-a7b5-fed6b4a77313"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.173606 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-client-ca" (OuterVolumeSpecName: "client-ca") pod "96b33c65-0cc3-4fea-a7b5-fed6b4a77313" (UID: "96b33c65-0cc3-4fea-a7b5-fed6b4a77313"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.181784 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-kube-api-access-6znh9" (OuterVolumeSpecName: "kube-api-access-6znh9") pod "96b33c65-0cc3-4fea-a7b5-fed6b4a77313" (UID: "96b33c65-0cc3-4fea-a7b5-fed6b4a77313"). InnerVolumeSpecName "kube-api-access-6znh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.198498 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "96b33c65-0cc3-4fea-a7b5-fed6b4a77313" (UID: "96b33c65-0cc3-4fea-a7b5-fed6b4a77313"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.273797 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.273828 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6znh9\" (UniqueName: \"kubernetes.io/projected/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-kube-api-access-6znh9\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.273839 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.273849 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b33c65-0cc3-4fea-a7b5-fed6b4a77313-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.425677 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk"] Feb 19 21:32:05 crc kubenswrapper[4771]: E0219 21:32:05.426088 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b33c65-0cc3-4fea-a7b5-fed6b4a77313" containerName="route-controller-manager" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.426116 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b33c65-0cc3-4fea-a7b5-fed6b4a77313" containerName="route-controller-manager" Feb 19 21:32:05 crc kubenswrapper[4771]: E0219 21:32:05.426217 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0446aa30-49cb-4936-ac35-ff2ea5f5f227" containerName="controller-manager" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.426234 4771 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0446aa30-49cb-4936-ac35-ff2ea5f5f227" containerName="controller-manager" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.426410 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b33c65-0cc3-4fea-a7b5-fed6b4a77313" containerName="route-controller-manager" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.426434 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0446aa30-49cb-4936-ac35-ff2ea5f5f227" containerName="controller-manager" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.427191 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.432075 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t"] Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.433133 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.463188 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t"] Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.475542 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk"] Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.476277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-config\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.476422 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-serving-cert\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.476551 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqvdt\" (UniqueName: \"kubernetes.io/projected/6e0c247c-7447-45ec-8c02-e8010744dd2b-kube-api-access-tqvdt\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.476704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-proxy-ca-bundles\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.476780 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0c247c-7447-45ec-8c02-e8010744dd2b-serving-cert\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.476825 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-config\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.477095 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwl4b\" (UniqueName: \"kubernetes.io/projected/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-kube-api-access-fwl4b\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.477188 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-client-ca\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.477245 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-client-ca\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578600 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-proxy-ca-bundles\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578692 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0c247c-7447-45ec-8c02-e8010744dd2b-serving-cert\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578732 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-config\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578778 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwl4b\" (UniqueName: 
\"kubernetes.io/projected/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-kube-api-access-fwl4b\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578813 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-client-ca\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-client-ca\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578895 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-config\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578936 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-serving-cert\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.578996 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqvdt\" (UniqueName: \"kubernetes.io/projected/6e0c247c-7447-45ec-8c02-e8010744dd2b-kube-api-access-tqvdt\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.581381 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-client-ca\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.581582 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-proxy-ca-bundles\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.581791 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-config\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.581846 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-client-ca\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.584942 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-config\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.585306 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-serving-cert\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.585529 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0c247c-7447-45ec-8c02-e8010744dd2b-serving-cert\") pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.600654 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwl4b\" (UniqueName: \"kubernetes.io/projected/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-kube-api-access-fwl4b\") pod \"controller-manager-7bcb6f9d47-vfhgk\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.612489 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqvdt\" (UniqueName: \"kubernetes.io/projected/6e0c247c-7447-45ec-8c02-e8010744dd2b-kube-api-access-tqvdt\") 
pod \"route-controller-manager-6558d68c-fpz5t\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.710557 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7b59895879-j8p75_96b33c65-0cc3-4fea-a7b5-fed6b4a77313/route-controller-manager/0.log" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.710610 4771 generic.go:334] "Generic (PLEG): container finished" podID="96b33c65-0cc3-4fea-a7b5-fed6b4a77313" containerID="6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427" exitCode=255 Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.710667 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" event={"ID":"96b33c65-0cc3-4fea-a7b5-fed6b4a77313","Type":"ContainerDied","Data":"6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427"} Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.710693 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" event={"ID":"96b33c65-0cc3-4fea-a7b5-fed6b4a77313","Type":"ContainerDied","Data":"8f75fd82bc33ad2f3067920c793761b2ebbade61f84ace4d2a401b386cfa7ce1"} Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.710710 4771 scope.go:117] "RemoveContainer" containerID="6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.710736 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.713270 4771 generic.go:334] "Generic (PLEG): container finished" podID="0446aa30-49cb-4936-ac35-ff2ea5f5f227" containerID="39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b" exitCode=0 Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.713327 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" event={"ID":"0446aa30-49cb-4936-ac35-ff2ea5f5f227","Type":"ContainerDied","Data":"39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b"} Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.713362 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.713367 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86855dfb7-wzn9l" event={"ID":"0446aa30-49cb-4936-ac35-ff2ea5f5f227","Type":"ContainerDied","Data":"b2b6700fb81c394c922186f2da11adfec882887d3671147f9fad7a1e6ef48fa6"} Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.737570 4771 scope.go:117] "RemoveContainer" containerID="6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427" Feb 19 21:32:05 crc kubenswrapper[4771]: E0219 21:32:05.738309 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427\": container with ID starting with 6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427 not found: ID does not exist" containerID="6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.738397 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427"} err="failed to get container status \"6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427\": rpc error: code = NotFound desc = could not find container \"6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427\": container with ID starting with 6dd48196a93d0660d79d2567c7f9d8a7b4d09085fca8771ca166fb89be801427 not found: ID does not exist" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.738432 4771 scope.go:117] "RemoveContainer" containerID="39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.760887 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.762286 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75"] Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.771357 4771 scope.go:117] "RemoveContainer" containerID="39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b" Feb 19 21:32:05 crc kubenswrapper[4771]: E0219 21:32:05.771881 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b\": container with ID starting with 39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b not found: ID does not exist" containerID="39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.771944 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b"} err="failed to get container status 
\"39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b\": rpc error: code = NotFound desc = could not find container \"39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b\": container with ID starting with 39a6ba13914493bd53dd279d57db4288803cf2ad66510afdece952f4862f799b not found: ID does not exist" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.772316 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-j8p75"] Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.778446 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.779442 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-wzn9l"] Feb 19 21:32:05 crc kubenswrapper[4771]: I0219 21:32:05.785569 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-wzn9l"] Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.097848 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t"] Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.172992 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk"] Feb 19 21:32:06 crc kubenswrapper[4771]: W0219 21:32:06.182994 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc131b03_351f_4002_aaa4_e77ea4d9d7cb.slice/crio-3636a557a5c28224a03b0df0e6eea065037f32039286d43d9157d3e54052b87c WatchSource:0}: Error finding container 3636a557a5c28224a03b0df0e6eea065037f32039286d43d9157d3e54052b87c: Status 404 returned error can't find the container with id 
3636a557a5c28224a03b0df0e6eea065037f32039286d43d9157d3e54052b87c Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.443821 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0446aa30-49cb-4936-ac35-ff2ea5f5f227" path="/var/lib/kubelet/pods/0446aa30-49cb-4936-ac35-ff2ea5f5f227/volumes" Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.444598 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b33c65-0cc3-4fea-a7b5-fed6b4a77313" path="/var/lib/kubelet/pods/96b33c65-0cc3-4fea-a7b5-fed6b4a77313/volumes" Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.718987 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" event={"ID":"6e0c247c-7447-45ec-8c02-e8010744dd2b","Type":"ContainerStarted","Data":"06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425"} Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.719084 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" event={"ID":"6e0c247c-7447-45ec-8c02-e8010744dd2b","Type":"ContainerStarted","Data":"e773c85c841d180c1978ea12eefb718439d9bc97d2635bb1026cfa120fc7dcd7"} Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.719305 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.721155 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" event={"ID":"bc131b03-351f-4002-aaa4-e77ea4d9d7cb","Type":"ContainerStarted","Data":"3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa"} Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.721196 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" event={"ID":"bc131b03-351f-4002-aaa4-e77ea4d9d7cb","Type":"ContainerStarted","Data":"3636a557a5c28224a03b0df0e6eea065037f32039286d43d9157d3e54052b87c"} Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.721316 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.728236 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.729982 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.743119 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" podStartSLOduration=2.743100316 podStartE2EDuration="2.743100316s" podCreationTimestamp="2026-02-19 21:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:32:06.739498681 +0000 UTC m=+227.010941271" watchObservedRunningTime="2026-02-19 21:32:06.743100316 +0000 UTC m=+227.014542796" Feb 19 21:32:06 crc kubenswrapper[4771]: I0219 21:32:06.779735 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" podStartSLOduration=2.779715992 podStartE2EDuration="2.779715992s" podCreationTimestamp="2026-02-19 21:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:32:06.778279064 +0000 UTC m=+227.049721564" 
watchObservedRunningTime="2026-02-19 21:32:06.779715992 +0000 UTC m=+227.051158462" Feb 19 21:32:12 crc kubenswrapper[4771]: I0219 21:32:12.957329 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:32:12 crc kubenswrapper[4771]: I0219 21:32:12.958280 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:32:12 crc kubenswrapper[4771]: I0219 21:32:12.958345 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:32:12 crc kubenswrapper[4771]: I0219 21:32:12.958941 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:32:12 crc kubenswrapper[4771]: I0219 21:32:12.959056 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04" gracePeriod=600 Feb 19 21:32:13 crc kubenswrapper[4771]: I0219 21:32:13.768548 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04" exitCode=0 Feb 19 21:32:13 crc kubenswrapper[4771]: I0219 21:32:13.768648 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04"} Feb 19 21:32:13 crc kubenswrapper[4771]: I0219 21:32:13.768921 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"b73cf65a14c2c7047315612825a835da1e94d5ce583905ac6452c7b7611f53f2"} Feb 19 21:32:37 crc kubenswrapper[4771]: I0219 21:32:37.135431 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h"] Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.037872 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t"] Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.038826 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" podUID="6e0c247c-7447-45ec-8c02-e8010744dd2b" containerName="route-controller-manager" containerID="cri-o://06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425" gracePeriod=30 Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.451056 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.588021 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-config\") pod \"6e0c247c-7447-45ec-8c02-e8010744dd2b\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.588092 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-client-ca\") pod \"6e0c247c-7447-45ec-8c02-e8010744dd2b\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.588145 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqvdt\" (UniqueName: \"kubernetes.io/projected/6e0c247c-7447-45ec-8c02-e8010744dd2b-kube-api-access-tqvdt\") pod \"6e0c247c-7447-45ec-8c02-e8010744dd2b\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.588165 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0c247c-7447-45ec-8c02-e8010744dd2b-serving-cert\") pod \"6e0c247c-7447-45ec-8c02-e8010744dd2b\" (UID: \"6e0c247c-7447-45ec-8c02-e8010744dd2b\") " Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.589993 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-client-ca" (OuterVolumeSpecName: "client-ca") pod "6e0c247c-7447-45ec-8c02-e8010744dd2b" (UID: "6e0c247c-7447-45ec-8c02-e8010744dd2b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.590279 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-config" (OuterVolumeSpecName: "config") pod "6e0c247c-7447-45ec-8c02-e8010744dd2b" (UID: "6e0c247c-7447-45ec-8c02-e8010744dd2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.594772 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0c247c-7447-45ec-8c02-e8010744dd2b-kube-api-access-tqvdt" (OuterVolumeSpecName: "kube-api-access-tqvdt") pod "6e0c247c-7447-45ec-8c02-e8010744dd2b" (UID: "6e0c247c-7447-45ec-8c02-e8010744dd2b"). InnerVolumeSpecName "kube-api-access-tqvdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.595121 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0c247c-7447-45ec-8c02-e8010744dd2b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6e0c247c-7447-45ec-8c02-e8010744dd2b" (UID: "6e0c247c-7447-45ec-8c02-e8010744dd2b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.689221 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqvdt\" (UniqueName: \"kubernetes.io/projected/6e0c247c-7447-45ec-8c02-e8010744dd2b-kube-api-access-tqvdt\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.689276 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0c247c-7447-45ec-8c02-e8010744dd2b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.689298 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.689316 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e0c247c-7447-45ec-8c02-e8010744dd2b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.954428 4771 generic.go:334] "Generic (PLEG): container finished" podID="6e0c247c-7447-45ec-8c02-e8010744dd2b" containerID="06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425" exitCode=0 Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.954495 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" event={"ID":"6e0c247c-7447-45ec-8c02-e8010744dd2b","Type":"ContainerDied","Data":"06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425"} Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.954509 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.954546 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t" event={"ID":"6e0c247c-7447-45ec-8c02-e8010744dd2b","Type":"ContainerDied","Data":"e773c85c841d180c1978ea12eefb718439d9bc97d2635bb1026cfa120fc7dcd7"} Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.954576 4771 scope.go:117] "RemoveContainer" containerID="06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.979185 4771 scope.go:117] "RemoveContainer" containerID="06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425" Feb 19 21:32:43 crc kubenswrapper[4771]: E0219 21:32:43.979961 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425\": container with ID starting with 06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425 not found: ID does not exist" containerID="06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425" Feb 19 21:32:43 crc kubenswrapper[4771]: I0219 21:32:43.980069 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425"} err="failed to get container status \"06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425\": rpc error: code = NotFound desc = could not find container \"06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425\": container with ID starting with 06bc8e82cb35c7c09774b2310c0e2af9b5146177cf05107e94dccf8fca807425 not found: ID does not exist" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.002088 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t"] Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.008969 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-fpz5t"] Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.453350 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0c247c-7447-45ec-8c02-e8010744dd2b" path="/var/lib/kubelet/pods/6e0c247c-7447-45ec-8c02-e8010744dd2b/volumes" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.459328 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt"] Feb 19 21:32:44 crc kubenswrapper[4771]: E0219 21:32:44.459944 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0c247c-7447-45ec-8c02-e8010744dd2b" containerName="route-controller-manager" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.459977 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0c247c-7447-45ec-8c02-e8010744dd2b" containerName="route-controller-manager" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.460199 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0c247c-7447-45ec-8c02-e8010744dd2b" containerName="route-controller-manager" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.460928 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.470073 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.470321 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.470502 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.470707 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.470712 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.470839 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.475465 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt"] Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.601846 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ebbfef-1456-480c-872e-d2fe83337828-client-ca\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.602012 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ebbfef-1456-480c-872e-d2fe83337828-config\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.602159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6ebbfef-1456-480c-872e-d2fe83337828-serving-cert\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.602216 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7zqw\" (UniqueName: \"kubernetes.io/projected/e6ebbfef-1456-480c-872e-d2fe83337828-kube-api-access-l7zqw\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.705827 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6ebbfef-1456-480c-872e-d2fe83337828-serving-cert\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.706077 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7zqw\" (UniqueName: \"kubernetes.io/projected/e6ebbfef-1456-480c-872e-d2fe83337828-kube-api-access-l7zqw\") pod 
\"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.707390 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ebbfef-1456-480c-872e-d2fe83337828-client-ca\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.707560 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ebbfef-1456-480c-872e-d2fe83337828-config\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.711269 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ebbfef-1456-480c-872e-d2fe83337828-client-ca\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.711944 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ebbfef-1456-480c-872e-d2fe83337828-config\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.720259 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6ebbfef-1456-480c-872e-d2fe83337828-serving-cert\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.732101 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7zqw\" (UniqueName: \"kubernetes.io/projected/e6ebbfef-1456-480c-872e-d2fe83337828-kube-api-access-l7zqw\") pod \"route-controller-manager-7b59895879-lwmgt\" (UID: \"e6ebbfef-1456-480c-872e-d2fe83337828\") " pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:44 crc kubenswrapper[4771]: I0219 21:32:44.790585 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:45 crc kubenswrapper[4771]: I0219 21:32:45.312624 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt"] Feb 19 21:32:45 crc kubenswrapper[4771]: I0219 21:32:45.970719 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" event={"ID":"e6ebbfef-1456-480c-872e-d2fe83337828","Type":"ContainerStarted","Data":"9a6686740cc2260159d93bfeace49acf182287bb897fb1edef5d03f87e129556"} Feb 19 21:32:45 crc kubenswrapper[4771]: I0219 21:32:45.970757 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" event={"ID":"e6ebbfef-1456-480c-872e-d2fe83337828","Type":"ContainerStarted","Data":"9ffc895cd6868a170170d54bd653ac7281ce39d35e3a1e4f02f23776e98acda1"} Feb 19 21:32:45 crc kubenswrapper[4771]: I0219 21:32:45.971000 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:45 crc kubenswrapper[4771]: I0219 21:32:45.998656 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" podStartSLOduration=2.998633639 podStartE2EDuration="2.998633639s" podCreationTimestamp="2026-02-19 21:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:32:45.994910299 +0000 UTC m=+266.266352799" watchObservedRunningTime="2026-02-19 21:32:45.998633639 +0000 UTC m=+266.270076149" Feb 19 21:32:46 crc kubenswrapper[4771]: I0219 21:32:46.385084 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b59895879-lwmgt" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.402781 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzfd7"] Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.404674 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.432349 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzfd7"] Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.586210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddk9b\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-kube-api-access-ddk9b\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.586280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c860d8ce-626b-4858-a7c9-32dba1dba1b2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.586454 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c860d8ce-626b-4858-a7c9-32dba1dba1b2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.586503 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-registry-tls\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.586583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c860d8ce-626b-4858-a7c9-32dba1dba1b2-registry-certificates\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.586717 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.586906 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-bound-sa-token\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.587010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c860d8ce-626b-4858-a7c9-32dba1dba1b2-trusted-ca\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.678442 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.688084 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddk9b\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-kube-api-access-ddk9b\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.688410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c860d8ce-626b-4858-a7c9-32dba1dba1b2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.688472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-registry-tls\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.688505 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c860d8ce-626b-4858-a7c9-32dba1dba1b2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 
21:32:56.688556 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c860d8ce-626b-4858-a7c9-32dba1dba1b2-registry-certificates\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.688673 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-bound-sa-token\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.688737 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c860d8ce-626b-4858-a7c9-32dba1dba1b2-trusted-ca\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.689840 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c860d8ce-626b-4858-a7c9-32dba1dba1b2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.690465 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c860d8ce-626b-4858-a7c9-32dba1dba1b2-registry-certificates\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.694587 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-registry-tls\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.695464 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c860d8ce-626b-4858-a7c9-32dba1dba1b2-trusted-ca\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.696051 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c860d8ce-626b-4858-a7c9-32dba1dba1b2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.723313 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddk9b\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-kube-api-access-ddk9b\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: \"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.724004 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c860d8ce-626b-4858-a7c9-32dba1dba1b2-bound-sa-token\") pod \"image-registry-66df7c8f76-tzfd7\" (UID: 
\"c860d8ce-626b-4858-a7c9-32dba1dba1b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.733608 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:56 crc kubenswrapper[4771]: I0219 21:32:56.955355 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tzfd7"] Feb 19 21:32:57 crc kubenswrapper[4771]: I0219 21:32:57.035757 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" event={"ID":"c860d8ce-626b-4858-a7c9-32dba1dba1b2","Type":"ContainerStarted","Data":"00f2115ccb1aab4fb35d611d051210374736aacd273dc9b447720ad9e99beda4"} Feb 19 21:32:58 crc kubenswrapper[4771]: I0219 21:32:58.043712 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" event={"ID":"c860d8ce-626b-4858-a7c9-32dba1dba1b2","Type":"ContainerStarted","Data":"23d80b07050073bb7a6c7e9f1171f1d90beb51531f9ed37ed908d5b17041d9dc"} Feb 19 21:32:58 crc kubenswrapper[4771]: I0219 21:32:58.045302 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:32:58 crc kubenswrapper[4771]: I0219 21:32:58.080785 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" podStartSLOduration=2.080756843 podStartE2EDuration="2.080756843s" podCreationTimestamp="2026-02-19 21:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:32:58.076848488 +0000 UTC m=+278.348291028" watchObservedRunningTime="2026-02-19 21:32:58.080756843 +0000 UTC m=+278.352199353" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 
21:33:02.180673 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" podUID="bd6cd164-e4e0-4e2a-b08e-271e84a62b52" containerName="oauth-openshift" containerID="cri-o://3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0" gracePeriod=15 Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.620352 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.686427 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-868cdf4b87-nhbp8"] Feb 19 21:33:02 crc kubenswrapper[4771]: E0219 21:33:02.686714 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6cd164-e4e0-4e2a-b08e-271e84a62b52" containerName="oauth-openshift" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.686741 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6cd164-e4e0-4e2a-b08e-271e84a62b52" containerName="oauth-openshift" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.686919 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6cd164-e4e0-4e2a-b08e-271e84a62b52" containerName="oauth-openshift" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.687511 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.704574 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-868cdf4b87-nhbp8"] Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.787849 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-trusted-ca-bundle\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.787912 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-ocp-branding-template\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.787935 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-login\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.788191 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-policies\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.788211 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-dir\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789034 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-error\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789073 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-session\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.788394 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789098 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-router-certs\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789141 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5rch\" (UniqueName: \"kubernetes.io/projected/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-kube-api-access-v5rch\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789188 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-provider-selection\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789248 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-idp-0-file-data\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789289 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-service-ca\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 
21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789391 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-cliconfig\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789695 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-serving-cert\") pod \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\" (UID: \"bd6cd164-e4e0-4e2a-b08e-271e84a62b52\") " Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-service-ca\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr2s9\" (UniqueName: 
\"kubernetes.io/projected/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-kube-api-access-qr2s9\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.789997 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-audit-policies\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790049 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-router-certs\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790069 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-session\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790104 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: 
\"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790130 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-login\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790151 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790170 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790214 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 
crc kubenswrapper[4771]: I0219 21:33:02.790235 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-error\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790262 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-audit-dir\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790294 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790335 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790389 4771 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790403 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.790429 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.791080 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.791317 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.793411 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-kube-api-access-v5rch" (OuterVolumeSpecName: "kube-api-access-v5rch") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "kube-api-access-v5rch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.793454 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.793618 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.794648 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.796611 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.797157 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.797607 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.797846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.798748 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "bd6cd164-e4e0-4e2a-b08e-271e84a62b52" (UID: "bd6cd164-e4e0-4e2a-b08e-271e84a62b52"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891657 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-service-ca\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891755 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr2s9\" (UniqueName: \"kubernetes.io/projected/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-kube-api-access-qr2s9\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891778 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-audit-policies\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891802 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-router-certs\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891819 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-session\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-login\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " 
pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891954 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.891985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-error\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892040 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-audit-dir\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892073 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892138 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5rch\" (UniqueName: \"kubernetes.io/projected/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-kube-api-access-v5rch\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892153 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892164 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892174 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892186 4771 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892195 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892204 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892213 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892224 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892233 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892243 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" 
Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892251 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd6cd164-e4e0-4e2a-b08e-271e84a62b52-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.892703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-audit-dir\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.893073 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-service-ca\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.896290 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.897060 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " 
pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.897829 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.897942 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-router-certs\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.898343 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-audit-policies\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.898705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.898804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-session\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.899984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.900598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-login\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.901834 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.902422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-v4-0-config-user-template-error\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 
21:33:02 crc kubenswrapper[4771]: I0219 21:33:02.916756 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr2s9\" (UniqueName: \"kubernetes.io/projected/4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2-kube-api-access-qr2s9\") pod \"oauth-openshift-868cdf4b87-nhbp8\" (UID: \"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2\") " pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.020384 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.021169 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk"] Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.021469 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" podUID="bc131b03-351f-4002-aaa4-e77ea4d9d7cb" containerName="controller-manager" containerID="cri-o://3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa" gracePeriod=30 Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.077636 4771 generic.go:334] "Generic (PLEG): container finished" podID="bd6cd164-e4e0-4e2a-b08e-271e84a62b52" containerID="3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0" exitCode=0 Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.077733 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.077763 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" event={"ID":"bd6cd164-e4e0-4e2a-b08e-271e84a62b52","Type":"ContainerDied","Data":"3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0"} Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.078659 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h" event={"ID":"bd6cd164-e4e0-4e2a-b08e-271e84a62b52","Type":"ContainerDied","Data":"be24286a0061deb29b694e4b16fee4442654b0b2f6eed09aa42a13bcaabeecdc"} Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.078695 4771 scope.go:117] "RemoveContainer" containerID="3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.152339 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h"] Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.157651 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-hqx2h"] Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.162108 4771 scope.go:117] "RemoveContainer" containerID="3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0" Feb 19 21:33:03 crc kubenswrapper[4771]: E0219 21:33:03.162696 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0\": container with ID starting with 3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0 not found: ID does not exist" containerID="3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 
21:33:03.162742 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0"} err="failed to get container status \"3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0\": rpc error: code = NotFound desc = could not find container \"3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0\": container with ID starting with 3899933145e1ea6b451330fa98080caa7001eadeda362fe8532d63057dab58b0 not found: ID does not exist" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.455697 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.490382 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-868cdf4b87-nhbp8"] Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.606132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-proxy-ca-bundles\") pod \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.606268 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-config\") pod \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.606351 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-client-ca\") pod \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " Feb 19 
21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.607323 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-client-ca" (OuterVolumeSpecName: "client-ca") pod "bc131b03-351f-4002-aaa4-e77ea4d9d7cb" (UID: "bc131b03-351f-4002-aaa4-e77ea4d9d7cb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.607398 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-config" (OuterVolumeSpecName: "config") pod "bc131b03-351f-4002-aaa4-e77ea4d9d7cb" (UID: "bc131b03-351f-4002-aaa4-e77ea4d9d7cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.607539 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bc131b03-351f-4002-aaa4-e77ea4d9d7cb" (UID: "bc131b03-351f-4002-aaa4-e77ea4d9d7cb"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.608052 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-serving-cert\") pod \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.608145 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwl4b\" (UniqueName: \"kubernetes.io/projected/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-kube-api-access-fwl4b\") pod \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\" (UID: \"bc131b03-351f-4002-aaa4-e77ea4d9d7cb\") " Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.609520 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.609801 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.609814 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.611646 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc131b03-351f-4002-aaa4-e77ea4d9d7cb" (UID: "bc131b03-351f-4002-aaa4-e77ea4d9d7cb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.611734 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-kube-api-access-fwl4b" (OuterVolumeSpecName: "kube-api-access-fwl4b") pod "bc131b03-351f-4002-aaa4-e77ea4d9d7cb" (UID: "bc131b03-351f-4002-aaa4-e77ea4d9d7cb"). InnerVolumeSpecName "kube-api-access-fwl4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.710963 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4771]: I0219 21:33:03.710999 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwl4b\" (UniqueName: \"kubernetes.io/projected/bc131b03-351f-4002-aaa4-e77ea4d9d7cb-kube-api-access-fwl4b\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.091426 4771 generic.go:334] "Generic (PLEG): container finished" podID="bc131b03-351f-4002-aaa4-e77ea4d9d7cb" containerID="3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa" exitCode=0 Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.091516 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" event={"ID":"bc131b03-351f-4002-aaa4-e77ea4d9d7cb","Type":"ContainerDied","Data":"3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa"} Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.091596 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" event={"ID":"bc131b03-351f-4002-aaa4-e77ea4d9d7cb","Type":"ContainerDied","Data":"3636a557a5c28224a03b0df0e6eea065037f32039286d43d9157d3e54052b87c"} Feb 19 
21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.091626 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.091643 4771 scope.go:117] "RemoveContainer" containerID="3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.094970 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" event={"ID":"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2","Type":"ContainerStarted","Data":"e91fc1d3cb6a392fd283dde4e4befee7e82b8ace267e8c7d7a1f8be7d3da3f95"} Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.095058 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" event={"ID":"4e0f31e4-dbd4-42eb-83a2-7a7e1a5953c2","Type":"ContainerStarted","Data":"91c8ad1db4f079b0af8b29bc1c8f0544feba90f0fef38d339ce79d2ae3fd0c9a"} Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.096681 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.121759 4771 scope.go:117] "RemoveContainer" containerID="3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa" Feb 19 21:33:04 crc kubenswrapper[4771]: E0219 21:33:04.122273 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa\": container with ID starting with 3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa not found: ID does not exist" containerID="3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.122329 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa"} err="failed to get container status \"3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa\": rpc error: code = NotFound desc = could not find container \"3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa\": container with ID starting with 3162f9594e86c289870cb1e18079f4ab80cf92d5195bf0163e3a41606bc04aaa not found: ID does not exist" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.140987 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" podStartSLOduration=27.140955799 podStartE2EDuration="27.140955799s" podCreationTimestamp="2026-02-19 21:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:04.135538708 +0000 UTC m=+284.406981208" watchObservedRunningTime="2026-02-19 21:33:04.140955799 +0000 UTC m=+284.412398309" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.159458 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk"] Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.163244 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-vfhgk"] Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.325267 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-868cdf4b87-nhbp8" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.443627 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc131b03-351f-4002-aaa4-e77ea4d9d7cb" path="/var/lib/kubelet/pods/bc131b03-351f-4002-aaa4-e77ea4d9d7cb/volumes" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.444341 4771 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6cd164-e4e0-4e2a-b08e-271e84a62b52" path="/var/lib/kubelet/pods/bd6cd164-e4e0-4e2a-b08e-271e84a62b52/volumes" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.485752 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-sr7jv"] Feb 19 21:33:04 crc kubenswrapper[4771]: E0219 21:33:04.486010 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc131b03-351f-4002-aaa4-e77ea4d9d7cb" containerName="controller-manager" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.486047 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc131b03-351f-4002-aaa4-e77ea4d9d7cb" containerName="controller-manager" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.486159 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc131b03-351f-4002-aaa4-e77ea4d9d7cb" containerName="controller-manager" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.486617 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.489687 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.489846 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.489867 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.490769 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.491071 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.491654 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.499157 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.504919 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-sr7jv"] Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.624415 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vshlr\" (UniqueName: \"kubernetes.io/projected/669663d6-502e-4e96-bb00-6adafc9ee085-kube-api-access-vshlr\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " 
pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.624486 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-client-ca\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.624574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669663d6-502e-4e96-bb00-6adafc9ee085-serving-cert\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.624609 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-proxy-ca-bundles\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.624631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-config\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.725468 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-client-ca\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.725775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669663d6-502e-4e96-bb00-6adafc9ee085-serving-cert\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.725805 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-proxy-ca-bundles\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.725826 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-config\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.725859 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vshlr\" (UniqueName: \"kubernetes.io/projected/669663d6-502e-4e96-bb00-6adafc9ee085-kube-api-access-vshlr\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.726843 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-proxy-ca-bundles\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.727172 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-config\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.727693 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/669663d6-502e-4e96-bb00-6adafc9ee085-client-ca\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.729698 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669663d6-502e-4e96-bb00-6adafc9ee085-serving-cert\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc kubenswrapper[4771]: I0219 21:33:04.743993 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vshlr\" (UniqueName: \"kubernetes.io/projected/669663d6-502e-4e96-bb00-6adafc9ee085-kube-api-access-vshlr\") pod \"controller-manager-86855dfb7-sr7jv\" (UID: \"669663d6-502e-4e96-bb00-6adafc9ee085\") " pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:04 crc 
kubenswrapper[4771]: I0219 21:33:04.798979 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:05 crc kubenswrapper[4771]: I0219 21:33:05.041737 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86855dfb7-sr7jv"] Feb 19 21:33:05 crc kubenswrapper[4771]: W0219 21:33:05.051914 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod669663d6_502e_4e96_bb00_6adafc9ee085.slice/crio-d7646c02b86bfb307f74111905ca7590efe7901139acbc3f53f457ee212ff55a WatchSource:0}: Error finding container d7646c02b86bfb307f74111905ca7590efe7901139acbc3f53f457ee212ff55a: Status 404 returned error can't find the container with id d7646c02b86bfb307f74111905ca7590efe7901139acbc3f53f457ee212ff55a Feb 19 21:33:05 crc kubenswrapper[4771]: I0219 21:33:05.103411 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" event={"ID":"669663d6-502e-4e96-bb00-6adafc9ee085","Type":"ContainerStarted","Data":"d7646c02b86bfb307f74111905ca7590efe7901139acbc3f53f457ee212ff55a"} Feb 19 21:33:06 crc kubenswrapper[4771]: I0219 21:33:06.120348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" event={"ID":"669663d6-502e-4e96-bb00-6adafc9ee085","Type":"ContainerStarted","Data":"f006a21c80a508d395c8c63268ac07cf4a2efd7ad4f9d81aeb7e0122bd9c7799"} Feb 19 21:33:06 crc kubenswrapper[4771]: I0219 21:33:06.154081 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" podStartSLOduration=3.154010801 podStartE2EDuration="3.154010801s" podCreationTimestamp="2026-02-19 21:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:06.14980199 +0000 UTC m=+286.421244550" watchObservedRunningTime="2026-02-19 21:33:06.154010801 +0000 UTC m=+286.425453311" Feb 19 21:33:07 crc kubenswrapper[4771]: I0219 21:33:07.124969 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:07 crc kubenswrapper[4771]: I0219 21:33:07.132440 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86855dfb7-sr7jv" Feb 19 21:33:16 crc kubenswrapper[4771]: I0219 21:33:16.742733 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tzfd7" Feb 19 21:33:16 crc kubenswrapper[4771]: I0219 21:33:16.817644 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b4lf5"] Feb 19 21:33:20 crc kubenswrapper[4771]: I0219 21:33:20.209780 4771 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 19 21:33:41 crc kubenswrapper[4771]: I0219 21:33:41.879545 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" podUID="bb0d9dfc-00bf-48fb-b62f-9267f59db52f" containerName="registry" containerID="cri-o://f8bb52494c87f5501671a411bb1f77da07f326741145af9bdd8d4cbd68d3dc4d" gracePeriod=30 Feb 19 21:33:42 crc kubenswrapper[4771]: I0219 21:33:42.377197 4771 generic.go:334] "Generic (PLEG): container finished" podID="bb0d9dfc-00bf-48fb-b62f-9267f59db52f" containerID="f8bb52494c87f5501671a411bb1f77da07f326741145af9bdd8d4cbd68d3dc4d" exitCode=0 Feb 19 21:33:42 crc kubenswrapper[4771]: I0219 21:33:42.377331 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" 
event={"ID":"bb0d9dfc-00bf-48fb-b62f-9267f59db52f","Type":"ContainerDied","Data":"f8bb52494c87f5501671a411bb1f77da07f326741145af9bdd8d4cbd68d3dc4d"} Feb 19 21:33:42 crc kubenswrapper[4771]: I0219 21:33:42.923826 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.031233 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-bound-sa-token\") pod \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.031637 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-installation-pull-secrets\") pod \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.031696 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-tls\") pod \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.031883 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.031937 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-trusted-ca\") pod \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.031993 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-ca-trust-extracted\") pod \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.032114 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gndc8\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-kube-api-access-gndc8\") pod \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.032159 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-certificates\") pod \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\" (UID: \"bb0d9dfc-00bf-48fb-b62f-9267f59db52f\") " Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.032967 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bb0d9dfc-00bf-48fb-b62f-9267f59db52f" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.033713 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bb0d9dfc-00bf-48fb-b62f-9267f59db52f" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.039122 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bb0d9dfc-00bf-48fb-b62f-9267f59db52f" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.039627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-kube-api-access-gndc8" (OuterVolumeSpecName: "kube-api-access-gndc8") pod "bb0d9dfc-00bf-48fb-b62f-9267f59db52f" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f"). InnerVolumeSpecName "kube-api-access-gndc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.040449 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bb0d9dfc-00bf-48fb-b62f-9267f59db52f" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.041356 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bb0d9dfc-00bf-48fb-b62f-9267f59db52f" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.044870 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "bb0d9dfc-00bf-48fb-b62f-9267f59db52f" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.052320 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bb0d9dfc-00bf-48fb-b62f-9267f59db52f" (UID: "bb0d9dfc-00bf-48fb-b62f-9267f59db52f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.134347 4771 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.134383 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gndc8\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-kube-api-access-gndc8\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.134398 4771 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.134412 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.134426 4771 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.134438 4771 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.134448 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb0d9dfc-00bf-48fb-b62f-9267f59db52f-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:43 crc 
kubenswrapper[4771]: I0219 21:33:43.388363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" event={"ID":"bb0d9dfc-00bf-48fb-b62f-9267f59db52f","Type":"ContainerDied","Data":"1ecd8ef9d63684c2f91451b565ea6b66ed9087d0c539194ee7b148c4580a0341"} Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.388447 4771 scope.go:117] "RemoveContainer" containerID="f8bb52494c87f5501671a411bb1f77da07f326741145af9bdd8d4cbd68d3dc4d" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.388452 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b4lf5" Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.432708 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b4lf5"] Feb 19 21:33:43 crc kubenswrapper[4771]: I0219 21:33:43.441858 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b4lf5"] Feb 19 21:33:44 crc kubenswrapper[4771]: I0219 21:33:44.449592 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0d9dfc-00bf-48fb-b62f-9267f59db52f" path="/var/lib/kubelet/pods/bb0d9dfc-00bf-48fb-b62f-9267f59db52f/volumes" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.154381 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ctvtj"] Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.155106 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ctvtj" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerName="registry-server" containerID="cri-o://5a15357fe4a2c2d6153bf88b17e8dcd0ee422a52edc941dd0eaf55a94ad3f181" gracePeriod=30 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.205639 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-k6bkf"] Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.205954 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k6bkf" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerName="registry-server" containerID="cri-o://91b9fb574745b688c1ae4c8f1f9edb890dfe2aaefdcf1bf06a3488bb8e5e000f" gracePeriod=30 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.216067 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbd5d"] Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.216241 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" podUID="0c7897d6-b26c-4f91-ab7f-64da32791052" containerName="marketplace-operator" containerID="cri-o://648ecd3015733aecc2c84ee53556799b6b4f07a4cba6cf9534cd2b8369a3be7a" gracePeriod=30 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.224932 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkhxs"] Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.225216 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bkhxs" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="registry-server" containerID="cri-o://fe7bdd1d995d51dad5949e2c9e7a2f7974d18ac6752c394aa8871538bd276ed9" gracePeriod=30 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.229940 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4pg9"] Feb 19 21:34:36 crc kubenswrapper[4771]: E0219 21:34:36.230181 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0d9dfc-00bf-48fb-b62f-9267f59db52f" containerName="registry" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.230195 4771 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0d9dfc-00bf-48fb-b62f-9267f59db52f" containerName="registry" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.230282 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0d9dfc-00bf-48fb-b62f-9267f59db52f" containerName="registry" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.230707 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.244197 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hp8qv"] Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.244500 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hp8qv" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="registry-server" containerID="cri-o://b0826642160b1327710fb042b86300f4cbb4f67f1cec1d32a569dce430d7b4c1" gracePeriod=30 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.254139 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4pg9"] Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.336190 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75pph\" (UniqueName: \"kubernetes.io/projected/426c7c73-e3a3-4430-9610-eb935a493fa8-kube-api-access-75pph\") pod \"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.336238 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/426c7c73-e3a3-4430-9610-eb935a493fa8-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.336261 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/426c7c73-e3a3-4430-9610-eb935a493fa8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.436884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75pph\" (UniqueName: \"kubernetes.io/projected/426c7c73-e3a3-4430-9610-eb935a493fa8-kube-api-access-75pph\") pod \"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.437243 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/426c7c73-e3a3-4430-9610-eb935a493fa8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.437276 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/426c7c73-e3a3-4430-9610-eb935a493fa8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.438581 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/426c7c73-e3a3-4430-9610-eb935a493fa8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.469186 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75pph\" (UniqueName: \"kubernetes.io/projected/426c7c73-e3a3-4430-9610-eb935a493fa8-kube-api-access-75pph\") pod \"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.469190 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/426c7c73-e3a3-4430-9610-eb935a493fa8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4pg9\" (UID: \"426c7c73-e3a3-4430-9610-eb935a493fa8\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.873242 4771 generic.go:334] "Generic (PLEG): container finished" podID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerID="5a15357fe4a2c2d6153bf88b17e8dcd0ee422a52edc941dd0eaf55a94ad3f181" exitCode=0 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.873358 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctvtj" event={"ID":"7b38fbe5-f497-4deb-aaff-6d95b89a1783","Type":"ContainerDied","Data":"5a15357fe4a2c2d6153bf88b17e8dcd0ee422a52edc941dd0eaf55a94ad3f181"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.873454 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctvtj" 
event={"ID":"7b38fbe5-f497-4deb-aaff-6d95b89a1783","Type":"ContainerDied","Data":"7776d49f10461512ce74807265c66ac7d1b81b687a52cb29c5a31bf832056f2e"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.873513 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7776d49f10461512ce74807265c66ac7d1b81b687a52cb29c5a31bf832056f2e" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.876222 4771 generic.go:334] "Generic (PLEG): container finished" podID="52f892c4-4660-4b08-8102-ff370236d676" containerID="fe7bdd1d995d51dad5949e2c9e7a2f7974d18ac6752c394aa8871538bd276ed9" exitCode=0 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.876296 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkhxs" event={"ID":"52f892c4-4660-4b08-8102-ff370236d676","Type":"ContainerDied","Data":"fe7bdd1d995d51dad5949e2c9e7a2f7974d18ac6752c394aa8871538bd276ed9"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.876328 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkhxs" event={"ID":"52f892c4-4660-4b08-8102-ff370236d676","Type":"ContainerDied","Data":"ddd14b953115769a1b7e1bcf32bce3399cfe66dcb53760169bbb312d5c6ef8fa"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.876342 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd14b953115769a1b7e1bcf32bce3399cfe66dcb53760169bbb312d5c6ef8fa" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.877557 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.878309 4771 generic.go:334] "Generic (PLEG): container finished" podID="0c7897d6-b26c-4f91-ab7f-64da32791052" containerID="648ecd3015733aecc2c84ee53556799b6b4f07a4cba6cf9534cd2b8369a3be7a" exitCode=0 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.878394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" event={"ID":"0c7897d6-b26c-4f91-ab7f-64da32791052","Type":"ContainerDied","Data":"648ecd3015733aecc2c84ee53556799b6b4f07a4cba6cf9534cd2b8369a3be7a"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.878424 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" event={"ID":"0c7897d6-b26c-4f91-ab7f-64da32791052","Type":"ContainerDied","Data":"8742a43f66ee0872b490a165c8349f346f3526eacaced774ecf922d0e3feba2e"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.878440 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8742a43f66ee0872b490a165c8349f346f3526eacaced774ecf922d0e3feba2e" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.881481 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.882251 4771 generic.go:334] "Generic (PLEG): container finished" podID="18293c18-339b-4740-9828-5b4d07850142" containerID="b0826642160b1327710fb042b86300f4cbb4f67f1cec1d32a569dce430d7b4c1" exitCode=0 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.882309 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8qv" event={"ID":"18293c18-339b-4740-9828-5b4d07850142","Type":"ContainerDied","Data":"b0826642160b1327710fb042b86300f4cbb4f67f1cec1d32a569dce430d7b4c1"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.882336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8qv" event={"ID":"18293c18-339b-4740-9828-5b4d07850142","Type":"ContainerDied","Data":"95e55da8bc839af8a10f36b8019413d29da76febc3c8dd48e20d562ba5568431"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.882349 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95e55da8bc839af8a10f36b8019413d29da76febc3c8dd48e20d562ba5568431" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.884114 4771 generic.go:334] "Generic (PLEG): container finished" podID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerID="91b9fb574745b688c1ae4c8f1f9edb890dfe2aaefdcf1bf06a3488bb8e5e000f" exitCode=0 Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.884172 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6bkf" event={"ID":"a21efe71-f4d3-4155-af2a-06ebc4646f14","Type":"ContainerDied","Data":"91b9fb574745b688c1ae4c8f1f9edb890dfe2aaefdcf1bf06a3488bb8e5e000f"} Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.892265 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.912625 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp8qv" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.921185 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.955931 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-trusted-ca\") pod \"0c7897d6-b26c-4f91-ab7f-64da32791052\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.955996 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvvvv\" (UniqueName: \"kubernetes.io/projected/7b38fbe5-f497-4deb-aaff-6d95b89a1783-kube-api-access-zvvvv\") pod \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.956087 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rzv9\" (UniqueName: \"kubernetes.io/projected/18293c18-339b-4740-9828-5b4d07850142-kube-api-access-8rzv9\") pod \"18293c18-339b-4740-9828-5b4d07850142\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.956890 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-utilities\") pod \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.956931 
4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmsb5\" (UniqueName: \"kubernetes.io/projected/52f892c4-4660-4b08-8102-ff370236d676-kube-api-access-fmsb5\") pod \"52f892c4-4660-4b08-8102-ff370236d676\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.956992 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-utilities\") pod \"18293c18-339b-4740-9828-5b4d07850142\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.957030 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-catalog-content\") pod \"52f892c4-4660-4b08-8102-ff370236d676\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.957084 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rplkf\" (UniqueName: \"kubernetes.io/projected/0c7897d6-b26c-4f91-ab7f-64da32791052-kube-api-access-rplkf\") pod \"0c7897d6-b26c-4f91-ab7f-64da32791052\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.957133 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-catalog-content\") pod \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\" (UID: \"7b38fbe5-f497-4deb-aaff-6d95b89a1783\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.957169 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-utilities\") pod 
\"52f892c4-4660-4b08-8102-ff370236d676\" (UID: \"52f892c4-4660-4b08-8102-ff370236d676\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.957198 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-catalog-content\") pod \"18293c18-339b-4740-9828-5b4d07850142\" (UID: \"18293c18-339b-4740-9828-5b4d07850142\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.957226 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-operator-metrics\") pod \"0c7897d6-b26c-4f91-ab7f-64da32791052\" (UID: \"0c7897d6-b26c-4f91-ab7f-64da32791052\") " Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.960216 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-utilities" (OuterVolumeSpecName: "utilities") pod "18293c18-339b-4740-9828-5b4d07850142" (UID: "18293c18-339b-4740-9828-5b4d07850142"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.960922 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-utilities" (OuterVolumeSpecName: "utilities") pod "7b38fbe5-f497-4deb-aaff-6d95b89a1783" (UID: "7b38fbe5-f497-4deb-aaff-6d95b89a1783"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.961572 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0c7897d6-b26c-4f91-ab7f-64da32791052" (UID: "0c7897d6-b26c-4f91-ab7f-64da32791052"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.962493 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0c7897d6-b26c-4f91-ab7f-64da32791052" (UID: "0c7897d6-b26c-4f91-ab7f-64da32791052"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.963305 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-utilities" (OuterVolumeSpecName: "utilities") pod "52f892c4-4660-4b08-8102-ff370236d676" (UID: "52f892c4-4660-4b08-8102-ff370236d676"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.964412 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f892c4-4660-4b08-8102-ff370236d676-kube-api-access-fmsb5" (OuterVolumeSpecName: "kube-api-access-fmsb5") pod "52f892c4-4660-4b08-8102-ff370236d676" (UID: "52f892c4-4660-4b08-8102-ff370236d676"). InnerVolumeSpecName "kube-api-access-fmsb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.967153 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:34:36 crc kubenswrapper[4771]: I0219 21:34:36.979483 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7897d6-b26c-4f91-ab7f-64da32791052-kube-api-access-rplkf" (OuterVolumeSpecName: "kube-api-access-rplkf") pod "0c7897d6-b26c-4f91-ab7f-64da32791052" (UID: "0c7897d6-b26c-4f91-ab7f-64da32791052"). InnerVolumeSpecName "kube-api-access-rplkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.003867 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b38fbe5-f497-4deb-aaff-6d95b89a1783-kube-api-access-zvvvv" (OuterVolumeSpecName: "kube-api-access-zvvvv") pod "7b38fbe5-f497-4deb-aaff-6d95b89a1783" (UID: "7b38fbe5-f497-4deb-aaff-6d95b89a1783"). InnerVolumeSpecName "kube-api-access-zvvvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.007304 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18293c18-339b-4740-9828-5b4d07850142-kube-api-access-8rzv9" (OuterVolumeSpecName: "kube-api-access-8rzv9") pod "18293c18-339b-4740-9828-5b4d07850142" (UID: "18293c18-339b-4740-9828-5b4d07850142"). InnerVolumeSpecName "kube-api-access-8rzv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.011054 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52f892c4-4660-4b08-8102-ff370236d676" (UID: "52f892c4-4660-4b08-8102-ff370236d676"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.020791 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b38fbe5-f497-4deb-aaff-6d95b89a1783" (UID: "7b38fbe5-f497-4deb-aaff-6d95b89a1783"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.058561 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-catalog-content\") pod \"a21efe71-f4d3-4155-af2a-06ebc4646f14\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.058718 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxcm9\" (UniqueName: \"kubernetes.io/projected/a21efe71-f4d3-4155-af2a-06ebc4646f14-kube-api-access-sxcm9\") pod \"a21efe71-f4d3-4155-af2a-06ebc4646f14\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.058757 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-utilities\") pod \"a21efe71-f4d3-4155-af2a-06ebc4646f14\" (UID: \"a21efe71-f4d3-4155-af2a-06ebc4646f14\") " Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.058964 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmsb5\" (UniqueName: \"kubernetes.io/projected/52f892c4-4660-4b08-8102-ff370236d676-kube-api-access-fmsb5\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.058976 4771 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.058985 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.058995 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rplkf\" (UniqueName: \"kubernetes.io/projected/0c7897d6-b26c-4f91-ab7f-64da32791052-kube-api-access-rplkf\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.059003 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.059011 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52f892c4-4660-4b08-8102-ff370236d676-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.059033 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.059042 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7897d6-b26c-4f91-ab7f-64da32791052-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.059050 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvvvv\" (UniqueName: 
\"kubernetes.io/projected/7b38fbe5-f497-4deb-aaff-6d95b89a1783-kube-api-access-zvvvv\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.059058 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rzv9\" (UniqueName: \"kubernetes.io/projected/18293c18-339b-4740-9828-5b4d07850142-kube-api-access-8rzv9\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.059067 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b38fbe5-f497-4deb-aaff-6d95b89a1783-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.059459 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-utilities" (OuterVolumeSpecName: "utilities") pod "a21efe71-f4d3-4155-af2a-06ebc4646f14" (UID: "a21efe71-f4d3-4155-af2a-06ebc4646f14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.062455 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21efe71-f4d3-4155-af2a-06ebc4646f14-kube-api-access-sxcm9" (OuterVolumeSpecName: "kube-api-access-sxcm9") pod "a21efe71-f4d3-4155-af2a-06ebc4646f14" (UID: "a21efe71-f4d3-4155-af2a-06ebc4646f14"). InnerVolumeSpecName "kube-api-access-sxcm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.108812 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a21efe71-f4d3-4155-af2a-06ebc4646f14" (UID: "a21efe71-f4d3-4155-af2a-06ebc4646f14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.119433 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4pg9"] Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.141220 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18293c18-339b-4740-9828-5b4d07850142" (UID: "18293c18-339b-4740-9828-5b4d07850142"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.159711 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.159739 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18293c18-339b-4740-9828-5b4d07850142-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.159749 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxcm9\" (UniqueName: \"kubernetes.io/projected/a21efe71-f4d3-4155-af2a-06ebc4646f14-kube-api-access-sxcm9\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.159781 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21efe71-f4d3-4155-af2a-06ebc4646f14-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.893049 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" 
event={"ID":"426c7c73-e3a3-4430-9610-eb935a493fa8","Type":"ContainerStarted","Data":"1cf6e60e202ae91d9e00387fa0f67998c48f75de023d216de5bafd8efef61652"} Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.893284 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" event={"ID":"426c7c73-e3a3-4430-9610-eb935a493fa8","Type":"ContainerStarted","Data":"940e4d00496811e9a69ea76702fc2fe722233cf5fd5e3fa430e39d6e2da438c8"} Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.893304 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.896210 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp8qv" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.896263 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkhxs" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.896272 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vbd5d" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.896323 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6bkf" event={"ID":"a21efe71-f4d3-4155-af2a-06ebc4646f14","Type":"ContainerDied","Data":"b6816599a945c21fa18578fc2365d74e9b8af2a5152e30d0c230e9f3cd23b05b"} Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.896369 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6bkf" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.896379 4771 scope.go:117] "RemoveContainer" containerID="91b9fb574745b688c1ae4c8f1f9edb890dfe2aaefdcf1bf06a3488bb8e5e000f" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.896461 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ctvtj" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.901961 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.934810 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-r4pg9" podStartSLOduration=1.9347751949999998 podStartE2EDuration="1.934775195s" podCreationTimestamp="2026-02-19 21:34:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:34:37.920320783 +0000 UTC m=+378.191763263" watchObservedRunningTime="2026-02-19 21:34:37.934775195 +0000 UTC m=+378.206217745" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.936997 4771 scope.go:117] "RemoveContainer" containerID="fff4dad894c6a38d885e4c46da7242e1e63ec9f50304814a4ade86f0487f2bd4" Feb 19 21:34:37 crc kubenswrapper[4771]: I0219 21:34:37.977639 4771 scope.go:117] "RemoveContainer" containerID="f1c7e3616234a57fcf8d54c7bb1f54ae9ce707cfb01d5377290494ed1c6ac38c" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.011714 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6bkf"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.020130 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k6bkf"] Feb 19 21:34:38 crc kubenswrapper[4771]: 
I0219 21:34:38.024045 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ctvtj"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.031191 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ctvtj"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.034180 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkhxs"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.040114 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkhxs"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.066542 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbd5d"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.068791 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vbd5d"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.071312 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hp8qv"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.073798 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hp8qv"] Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.446975 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7897d6-b26c-4f91-ab7f-64da32791052" path="/var/lib/kubelet/pods/0c7897d6-b26c-4f91-ab7f-64da32791052/volumes" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.448310 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18293c18-339b-4740-9828-5b4d07850142" path="/var/lib/kubelet/pods/18293c18-339b-4740-9828-5b4d07850142/volumes" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.449758 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="52f892c4-4660-4b08-8102-ff370236d676" path="/var/lib/kubelet/pods/52f892c4-4660-4b08-8102-ff370236d676/volumes" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.451795 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" path="/var/lib/kubelet/pods/7b38fbe5-f497-4deb-aaff-6d95b89a1783/volumes" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.452920 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" path="/var/lib/kubelet/pods/a21efe71-f4d3-4155-af2a-06ebc4646f14/volumes" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.575118 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vltjv"] Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.575517 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="extract-content" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.575540 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="extract-content" Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.575557 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7897d6-b26c-4f91-ab7f-64da32791052" containerName="marketplace-operator" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.575569 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7897d6-b26c-4f91-ab7f-64da32791052" containerName="marketplace-operator" Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.575988 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerName="registry-server" Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576041 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerName="registry-server" Feb 19 
21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576064 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="extract-utilities"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576076 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="extract-utilities"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576096 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576108 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576126 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerName="extract-utilities"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576138 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerName="extract-utilities"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576151 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="extract-content"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576166 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="extract-content"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576182 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerName="extract-content"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576194 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerName="extract-content"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576211 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576223 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576237 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerName="extract-content"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576249 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerName="extract-content"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576265 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="extract-utilities"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576277 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="extract-utilities"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576298 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerName="extract-utilities"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576311 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerName="extract-utilities"
Feb 19 21:34:38 crc kubenswrapper[4771]: E0219 21:34:38.576325 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576337 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576497 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="18293c18-339b-4740-9828-5b4d07850142" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.576525 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f892c4-4660-4b08-8102-ff370236d676" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.577171 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21efe71-f4d3-4155-af2a-06ebc4646f14" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.577191 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7897d6-b26c-4f91-ab7f-64da32791052" containerName="marketplace-operator"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.577207 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b38fbe5-f497-4deb-aaff-6d95b89a1783" containerName="registry-server"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.578328 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.581423 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.591358 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vltjv"]
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.688820 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/807daddd-f92f-47b4-8d12-fdbeab38b6f1-catalog-content\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.688874 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs5dr\" (UniqueName: \"kubernetes.io/projected/807daddd-f92f-47b4-8d12-fdbeab38b6f1-kube-api-access-cs5dr\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.689134 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/807daddd-f92f-47b4-8d12-fdbeab38b6f1-utilities\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.790226 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/807daddd-f92f-47b4-8d12-fdbeab38b6f1-catalog-content\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.790295 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs5dr\" (UniqueName: \"kubernetes.io/projected/807daddd-f92f-47b4-8d12-fdbeab38b6f1-kube-api-access-cs5dr\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.790378 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/807daddd-f92f-47b4-8d12-fdbeab38b6f1-utilities\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.791282 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/807daddd-f92f-47b4-8d12-fdbeab38b6f1-utilities\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.791413 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/807daddd-f92f-47b4-8d12-fdbeab38b6f1-catalog-content\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.808473 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs5dr\" (UniqueName: \"kubernetes.io/projected/807daddd-f92f-47b4-8d12-fdbeab38b6f1-kube-api-access-cs5dr\") pod \"redhat-operators-vltjv\" (UID: \"807daddd-f92f-47b4-8d12-fdbeab38b6f1\") " pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:38 crc kubenswrapper[4771]: I0219 21:34:38.897495 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:39 crc kubenswrapper[4771]: I0219 21:34:39.199187 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vltjv"]
Feb 19 21:34:39 crc kubenswrapper[4771]: W0219 21:34:39.207253 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod807daddd_f92f_47b4_8d12_fdbeab38b6f1.slice/crio-b804fe0254323e6470956380b145462f9114214d7fb98f20393bb77d50bbf754 WatchSource:0}: Error finding container b804fe0254323e6470956380b145462f9114214d7fb98f20393bb77d50bbf754: Status 404 returned error can't find the container with id b804fe0254323e6470956380b145462f9114214d7fb98f20393bb77d50bbf754
Feb 19 21:34:39 crc kubenswrapper[4771]: I0219 21:34:39.914148 4771 generic.go:334] "Generic (PLEG): container finished" podID="807daddd-f92f-47b4-8d12-fdbeab38b6f1" containerID="bd10d27b77067cf9a6ae4861c7535359d5f827412899c2e82bcc1107a8298542" exitCode=0
Feb 19 21:34:39 crc kubenswrapper[4771]: I0219 21:34:39.914241 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vltjv" event={"ID":"807daddd-f92f-47b4-8d12-fdbeab38b6f1","Type":"ContainerDied","Data":"bd10d27b77067cf9a6ae4861c7535359d5f827412899c2e82bcc1107a8298542"}
Feb 19 21:34:39 crc kubenswrapper[4771]: I0219 21:34:39.915061 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vltjv" event={"ID":"807daddd-f92f-47b4-8d12-fdbeab38b6f1","Type":"ContainerStarted","Data":"b804fe0254323e6470956380b145462f9114214d7fb98f20393bb77d50bbf754"}
Feb 19 21:34:39 crc kubenswrapper[4771]: I0219 21:34:39.919713 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.369654 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vfcqq"]
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.370738 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.375197 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.380712 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfcqq"]
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.409393 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-utilities\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.409437 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzqqk\" (UniqueName: \"kubernetes.io/projected/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-kube-api-access-lzqqk\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.409470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-catalog-content\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.510350 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-catalog-content\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.510491 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-utilities\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.510557 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzqqk\" (UniqueName: \"kubernetes.io/projected/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-kube-api-access-lzqqk\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.511160 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-catalog-content\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.511465 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-utilities\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.556543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzqqk\" (UniqueName: \"kubernetes.io/projected/5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd-kube-api-access-lzqqk\") pod \"certified-operators-vfcqq\" (UID: \"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd\") " pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.727958 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.921752 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vltjv" event={"ID":"807daddd-f92f-47b4-8d12-fdbeab38b6f1","Type":"ContainerStarted","Data":"71355d57efdbdcdc45e6255e6db1c54cf6b5ab2e9f27f9f6f46381fd7be2c20e"}
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.974436 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rvrdf"]
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.975354 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.977408 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 19 21:34:40 crc kubenswrapper[4771]: I0219 21:34:40.986450 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvrdf"]
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.015784 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv8gb\" (UniqueName: \"kubernetes.io/projected/628cca0e-2b72-4434-8fb2-03474a586f19-kube-api-access-kv8gb\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.015852 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628cca0e-2b72-4434-8fb2-03474a586f19-catalog-content\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.015908 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628cca0e-2b72-4434-8fb2-03474a586f19-utilities\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.117546 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628cca0e-2b72-4434-8fb2-03474a586f19-catalog-content\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.117597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628cca0e-2b72-4434-8fb2-03474a586f19-utilities\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.117673 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv8gb\" (UniqueName: \"kubernetes.io/projected/628cca0e-2b72-4434-8fb2-03474a586f19-kube-api-access-kv8gb\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.118367 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628cca0e-2b72-4434-8fb2-03474a586f19-catalog-content\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.118700 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628cca0e-2b72-4434-8fb2-03474a586f19-utilities\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.135709 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv8gb\" (UniqueName: \"kubernetes.io/projected/628cca0e-2b72-4434-8fb2-03474a586f19-kube-api-access-kv8gb\") pod \"community-operators-rvrdf\" (UID: \"628cca0e-2b72-4434-8fb2-03474a586f19\") " pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.152712 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfcqq"]
Feb 19 21:34:41 crc kubenswrapper[4771]: W0219 21:34:41.163498 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5158244b_cdfd_48fa_bbb6_ba1f5dbaf6bd.slice/crio-d8d3974adf5bd821ad26cd15a1f97467f96c65928c2af81e7bc09e9773ddcc06 WatchSource:0}: Error finding container d8d3974adf5bd821ad26cd15a1f97467f96c65928c2af81e7bc09e9773ddcc06: Status 404 returned error can't find the container with id d8d3974adf5bd821ad26cd15a1f97467f96c65928c2af81e7bc09e9773ddcc06
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.287876 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.691579 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvrdf"]
Feb 19 21:34:41 crc kubenswrapper[4771]: W0219 21:34:41.694374 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod628cca0e_2b72_4434_8fb2_03474a586f19.slice/crio-5beeca9eebc1ba511548d8fd5f084af74373e6132ca067885eeb10d35457db8c WatchSource:0}: Error finding container 5beeca9eebc1ba511548d8fd5f084af74373e6132ca067885eeb10d35457db8c: Status 404 returned error can't find the container with id 5beeca9eebc1ba511548d8fd5f084af74373e6132ca067885eeb10d35457db8c
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.928850 4771 generic.go:334] "Generic (PLEG): container finished" podID="807daddd-f92f-47b4-8d12-fdbeab38b6f1" containerID="71355d57efdbdcdc45e6255e6db1c54cf6b5ab2e9f27f9f6f46381fd7be2c20e" exitCode=0
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.928920 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vltjv" event={"ID":"807daddd-f92f-47b4-8d12-fdbeab38b6f1","Type":"ContainerDied","Data":"71355d57efdbdcdc45e6255e6db1c54cf6b5ab2e9f27f9f6f46381fd7be2c20e"}
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.932795 4771 generic.go:334] "Generic (PLEG): container finished" podID="5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd" containerID="b24ae90a81ff162db3c28162da79e8c87103674e72ca53e71923e86be17ba42a" exitCode=0
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.932849 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfcqq" event={"ID":"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd","Type":"ContainerDied","Data":"b24ae90a81ff162db3c28162da79e8c87103674e72ca53e71923e86be17ba42a"}
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.932873 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfcqq" event={"ID":"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd","Type":"ContainerStarted","Data":"d8d3974adf5bd821ad26cd15a1f97467f96c65928c2af81e7bc09e9773ddcc06"}
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.935485 4771 generic.go:334] "Generic (PLEG): container finished" podID="628cca0e-2b72-4434-8fb2-03474a586f19" containerID="a5bbf61c35cff4d4d5bac88152d0950899dc3f33309b8f88a72cef567958c1ee" exitCode=0
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.935531 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvrdf" event={"ID":"628cca0e-2b72-4434-8fb2-03474a586f19","Type":"ContainerDied","Data":"a5bbf61c35cff4d4d5bac88152d0950899dc3f33309b8f88a72cef567958c1ee"}
Feb 19 21:34:41 crc kubenswrapper[4771]: I0219 21:34:41.935556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvrdf" event={"ID":"628cca0e-2b72-4434-8fb2-03474a586f19","Type":"ContainerStarted","Data":"5beeca9eebc1ba511548d8fd5f084af74373e6132ca067885eeb10d35457db8c"}
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.772109 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z56lm"]
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.774317 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.776637 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.787911 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z56lm"]
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.838576 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c45470de-22dd-4455-8fd6-dbaf60fddece-catalog-content\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.838623 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99bpn\" (UniqueName: \"kubernetes.io/projected/c45470de-22dd-4455-8fd6-dbaf60fddece-kube-api-access-99bpn\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.838639 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c45470de-22dd-4455-8fd6-dbaf60fddece-utilities\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.940295 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c45470de-22dd-4455-8fd6-dbaf60fddece-catalog-content\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.940410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99bpn\" (UniqueName: \"kubernetes.io/projected/c45470de-22dd-4455-8fd6-dbaf60fddece-kube-api-access-99bpn\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.940457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c45470de-22dd-4455-8fd6-dbaf60fddece-utilities\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.940794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c45470de-22dd-4455-8fd6-dbaf60fddece-catalog-content\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.941136 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c45470de-22dd-4455-8fd6-dbaf60fddece-utilities\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.943907 4771 generic.go:334] "Generic (PLEG): container finished" podID="628cca0e-2b72-4434-8fb2-03474a586f19" containerID="e52b3153aa34e1317fbdc308e7f1b423ed340738f2adbd7aae5663204f155b19" exitCode=0
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.943991 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvrdf" event={"ID":"628cca0e-2b72-4434-8fb2-03474a586f19","Type":"ContainerDied","Data":"e52b3153aa34e1317fbdc308e7f1b423ed340738f2adbd7aae5663204f155b19"}
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.948423 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vltjv" event={"ID":"807daddd-f92f-47b4-8d12-fdbeab38b6f1","Type":"ContainerStarted","Data":"cd4c93011a06ef4c8b7b5a218678a7b0ffdd949037036b6b86c7e562e59b1338"}
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.950618 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfcqq" event={"ID":"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd","Type":"ContainerStarted","Data":"fad370b963a93ac4853be948c8178c3e6927594eaa9a2d8da9df1a0e6edb6506"}
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.956754 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.956809 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.974774 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99bpn\" (UniqueName: \"kubernetes.io/projected/c45470de-22dd-4455-8fd6-dbaf60fddece-kube-api-access-99bpn\") pod \"redhat-marketplace-z56lm\" (UID: \"c45470de-22dd-4455-8fd6-dbaf60fddece\") " pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:42 crc kubenswrapper[4771]: I0219 21:34:42.997849 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vltjv" podStartSLOduration=2.577687264 podStartE2EDuration="4.997830409s" podCreationTimestamp="2026-02-19 21:34:38 +0000 UTC" firstStartedPulling="2026-02-19 21:34:39.919230162 +0000 UTC m=+380.190672672" lastFinishedPulling="2026-02-19 21:34:42.339373347 +0000 UTC m=+382.610815817" observedRunningTime="2026-02-19 21:34:42.994633265 +0000 UTC m=+383.266075745" watchObservedRunningTime="2026-02-19 21:34:42.997830409 +0000 UTC m=+383.269272889"
Feb 19 21:34:43 crc kubenswrapper[4771]: I0219 21:34:43.127976 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:43 crc kubenswrapper[4771]: I0219 21:34:43.520681 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z56lm"]
Feb 19 21:34:43 crc kubenswrapper[4771]: W0219 21:34:43.527702 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc45470de_22dd_4455_8fd6_dbaf60fddece.slice/crio-608da41cfc1f52e60a8351665eed401c1656b06bfb10c06a63e7767054047e92 WatchSource:0}: Error finding container 608da41cfc1f52e60a8351665eed401c1656b06bfb10c06a63e7767054047e92: Status 404 returned error can't find the container with id 608da41cfc1f52e60a8351665eed401c1656b06bfb10c06a63e7767054047e92
Feb 19 21:34:43 crc kubenswrapper[4771]: I0219 21:34:43.957426 4771 generic.go:334] "Generic (PLEG): container finished" podID="5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd" containerID="fad370b963a93ac4853be948c8178c3e6927594eaa9a2d8da9df1a0e6edb6506" exitCode=0
Feb 19 21:34:43 crc kubenswrapper[4771]: I0219 21:34:43.957783 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfcqq" event={"ID":"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd","Type":"ContainerDied","Data":"fad370b963a93ac4853be948c8178c3e6927594eaa9a2d8da9df1a0e6edb6506"}
Feb 19 21:34:43 crc kubenswrapper[4771]: I0219 21:34:43.959994 4771 generic.go:334] "Generic (PLEG): container finished" podID="c45470de-22dd-4455-8fd6-dbaf60fddece" containerID="d26b6081d0c108e89c71087b7584eadf23c6759043e941a89e3845e47029041f" exitCode=0
Feb 19 21:34:43 crc kubenswrapper[4771]: I0219 21:34:43.960082 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z56lm" event={"ID":"c45470de-22dd-4455-8fd6-dbaf60fddece","Type":"ContainerDied","Data":"d26b6081d0c108e89c71087b7584eadf23c6759043e941a89e3845e47029041f"}
Feb 19 21:34:43 crc kubenswrapper[4771]: I0219 21:34:43.960108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z56lm" event={"ID":"c45470de-22dd-4455-8fd6-dbaf60fddece","Type":"ContainerStarted","Data":"608da41cfc1f52e60a8351665eed401c1656b06bfb10c06a63e7767054047e92"}
Feb 19 21:34:43 crc kubenswrapper[4771]: I0219 21:34:43.963073 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvrdf" event={"ID":"628cca0e-2b72-4434-8fb2-03474a586f19","Type":"ContainerStarted","Data":"657321850d4e603d4d6bd096c897840408cd1e7ca64fced408124c73ac48e441"}
Feb 19 21:34:44 crc kubenswrapper[4771]: I0219 21:34:44.968605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z56lm" event={"ID":"c45470de-22dd-4455-8fd6-dbaf60fddece","Type":"ContainerStarted","Data":"a1da1a88a8640d929b5dd0d99b339ca1bc11a0d45ee3563066bdf42576e195d4"}
Feb 19 21:34:44 crc kubenswrapper[4771]: I0219 21:34:44.971822 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfcqq" event={"ID":"5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd","Type":"ContainerStarted","Data":"30420f03dc27d4f11562a16304c4560972c20cc894c658ccc293172e02e76e34"}
Feb 19 21:34:44 crc kubenswrapper[4771]: I0219 21:34:44.992934 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rvrdf" podStartSLOduration=3.597848696 podStartE2EDuration="4.992919308s" podCreationTimestamp="2026-02-19 21:34:40 +0000 UTC" firstStartedPulling="2026-02-19 21:34:41.937812062 +0000 UTC m=+382.209254532" lastFinishedPulling="2026-02-19 21:34:43.332882674 +0000 UTC m=+383.604325144" observedRunningTime="2026-02-19 21:34:44.027715989 +0000 UTC m=+384.299158479" watchObservedRunningTime="2026-02-19 21:34:44.992919308 +0000 UTC m=+385.264361768"
Feb 19 21:34:45 crc kubenswrapper[4771]: I0219 21:34:45.006592 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vfcqq" podStartSLOduration=2.574094928 podStartE2EDuration="5.006573789s" podCreationTimestamp="2026-02-19 21:34:40 +0000 UTC" firstStartedPulling="2026-02-19 21:34:41.934466014 +0000 UTC m=+382.205908494" lastFinishedPulling="2026-02-19 21:34:44.366944885 +0000 UTC m=+384.638387355" observedRunningTime="2026-02-19 21:34:45.002967263 +0000 UTC m=+385.274409743" watchObservedRunningTime="2026-02-19 21:34:45.006573789 +0000 UTC m=+385.278016259"
Feb 19 21:34:45 crc kubenswrapper[4771]: I0219 21:34:45.977843 4771 generic.go:334] "Generic (PLEG): container finished" podID="c45470de-22dd-4455-8fd6-dbaf60fddece" containerID="a1da1a88a8640d929b5dd0d99b339ca1bc11a0d45ee3563066bdf42576e195d4" exitCode=0
Feb 19 21:34:45 crc kubenswrapper[4771]: I0219 21:34:45.977901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z56lm" event={"ID":"c45470de-22dd-4455-8fd6-dbaf60fddece","Type":"ContainerDied","Data":"a1da1a88a8640d929b5dd0d99b339ca1bc11a0d45ee3563066bdf42576e195d4"}
Feb 19 21:34:46 crc kubenswrapper[4771]: I0219 21:34:46.986739 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z56lm" event={"ID":"c45470de-22dd-4455-8fd6-dbaf60fddece","Type":"ContainerStarted","Data":"94a48f9649c3b9f89f2a2eb5005ed259c144e6f6d826ff7fe5508074dd109cb2"}
Feb 19 21:34:47 crc kubenswrapper[4771]: I0219 21:34:47.007228 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z56lm" podStartSLOduration=2.6004608129999998 podStartE2EDuration="5.007212124s" podCreationTimestamp="2026-02-19 21:34:42 +0000 UTC" firstStartedPulling="2026-02-19 21:34:43.961062305 +0000 UTC m=+384.232504775" lastFinishedPulling="2026-02-19 21:34:46.367813616 +0000 UTC m=+386.639256086" observedRunningTime="2026-02-19 21:34:47.006466904 +0000 UTC m=+387.277909394" watchObservedRunningTime="2026-02-19 21:34:47.007212124 +0000 UTC m=+387.278654594"
Feb 19 21:34:48 crc kubenswrapper[4771]: I0219 21:34:48.897941 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:48 crc kubenswrapper[4771]: I0219 21:34:48.898561 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:48 crc kubenswrapper[4771]: I0219 21:34:48.962380 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:49 crc kubenswrapper[4771]: I0219 21:34:49.071924 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vltjv"
Feb 19 21:34:50 crc kubenswrapper[4771]: I0219 21:34:50.728654 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:50 crc kubenswrapper[4771]: I0219 21:34:50.729128 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:50 crc kubenswrapper[4771]: I0219 21:34:50.784186 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:51 crc kubenswrapper[4771]: I0219 21:34:51.053461 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vfcqq"
Feb 19 21:34:51 crc kubenswrapper[4771]: I0219 21:34:51.288342 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:51 crc kubenswrapper[4771]: I0219 21:34:51.288800 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:51 crc kubenswrapper[4771]: I0219 21:34:51.356350 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:52 crc kubenswrapper[4771]: I0219 21:34:52.065013 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rvrdf"
Feb 19 21:34:53 crc kubenswrapper[4771]: I0219 21:34:53.131433 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:53 crc kubenswrapper[4771]: I0219 21:34:53.131616 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:53 crc kubenswrapper[4771]: I0219 21:34:53.183139 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:34:54 crc kubenswrapper[4771]: I0219 21:34:54.067939 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z56lm"
Feb 19 21:35:12 crc kubenswrapper[4771]: I0219 21:35:12.957882 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:35:12 crc kubenswrapper[4771]: I0219 21:35:12.958626 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:35:20 crc kubenswrapper[4771]: I0219 21:35:20.718704 4771 scope.go:117] "RemoveContainer"
containerID="82caee848aaceeec512c04dac76fe3f45edd3e1a38db3618ea7e4db3596c66b0" Feb 19 21:35:20 crc kubenswrapper[4771]: I0219 21:35:20.748288 4771 scope.go:117] "RemoveContainer" containerID="9a93e820793880ba2f3b9e87a134060100400fecc8493dfe72809d83b70e38d0" Feb 19 21:35:20 crc kubenswrapper[4771]: I0219 21:35:20.781938 4771 scope.go:117] "RemoveContainer" containerID="0fecba4366d662ffb5ad6ca71a29decfd330d48a017718253fdd10d90e908d91" Feb 19 21:35:20 crc kubenswrapper[4771]: I0219 21:35:20.813488 4771 scope.go:117] "RemoveContainer" containerID="648ecd3015733aecc2c84ee53556799b6b4f07a4cba6cf9534cd2b8369a3be7a" Feb 19 21:35:42 crc kubenswrapper[4771]: I0219 21:35:42.956624 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:35:42 crc kubenswrapper[4771]: I0219 21:35:42.957330 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:35:42 crc kubenswrapper[4771]: I0219 21:35:42.957402 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:35:42 crc kubenswrapper[4771]: I0219 21:35:42.958408 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b73cf65a14c2c7047315612825a835da1e94d5ce583905ac6452c7b7611f53f2"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Feb 19 21:35:42 crc kubenswrapper[4771]: I0219 21:35:42.958566 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://b73cf65a14c2c7047315612825a835da1e94d5ce583905ac6452c7b7611f53f2" gracePeriod=600 Feb 19 21:35:43 crc kubenswrapper[4771]: I0219 21:35:43.398728 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="b73cf65a14c2c7047315612825a835da1e94d5ce583905ac6452c7b7611f53f2" exitCode=0 Feb 19 21:35:43 crc kubenswrapper[4771]: I0219 21:35:43.398794 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"b73cf65a14c2c7047315612825a835da1e94d5ce583905ac6452c7b7611f53f2"} Feb 19 21:35:43 crc kubenswrapper[4771]: I0219 21:35:43.399282 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"00064ebea687c2f43446b8863a30cdb029580ca9dabd28ae56781a84a609019f"} Feb 19 21:35:43 crc kubenswrapper[4771]: I0219 21:35:43.399325 4771 scope.go:117] "RemoveContainer" containerID="1590b840d8c6c1ba4c22f3451cfd2f8e784c9476a6e201bdb87904113a629a04" Feb 19 21:36:20 crc kubenswrapper[4771]: I0219 21:36:20.882405 4771 scope.go:117] "RemoveContainer" containerID="796e47747a3f9d9af6a0576da790fafdc118fe9840d25624636b9d5f506caa05" Feb 19 21:36:20 crc kubenswrapper[4771]: I0219 21:36:20.914263 4771 scope.go:117] "RemoveContainer" containerID="378b167719243a2cc38cf6f1e593889205115c2d2577ab71bc0cbe2b0201acac" Feb 19 21:36:20 crc kubenswrapper[4771]: I0219 21:36:20.947813 4771 scope.go:117] "RemoveContainer" 
containerID="b0826642160b1327710fb042b86300f4cbb4f67f1cec1d32a569dce430d7b4c1" Feb 19 21:36:20 crc kubenswrapper[4771]: I0219 21:36:20.982370 4771 scope.go:117] "RemoveContainer" containerID="4ff5a45f9c34255904272abb688c2adb12be0f4089ff761093886e196d2e5661" Feb 19 21:36:21 crc kubenswrapper[4771]: I0219 21:36:21.007603 4771 scope.go:117] "RemoveContainer" containerID="fe7bdd1d995d51dad5949e2c9e7a2f7974d18ac6752c394aa8871538bd276ed9" Feb 19 21:36:21 crc kubenswrapper[4771]: I0219 21:36:21.030540 4771 scope.go:117] "RemoveContainer" containerID="5a15357fe4a2c2d6153bf88b17e8dcd0ee422a52edc941dd0eaf55a94ad3f181" Feb 19 21:38:12 crc kubenswrapper[4771]: I0219 21:38:12.957606 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:38:12 crc kubenswrapper[4771]: I0219 21:38:12.960450 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:38:42 crc kubenswrapper[4771]: I0219 21:38:42.957533 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:38:42 crc kubenswrapper[4771]: I0219 21:38:42.958319 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:39:12 crc kubenswrapper[4771]: I0219 21:39:12.956929 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:39:12 crc kubenswrapper[4771]: I0219 21:39:12.957746 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:39:12 crc kubenswrapper[4771]: I0219 21:39:12.957830 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:39:12 crc kubenswrapper[4771]: I0219 21:39:12.959142 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00064ebea687c2f43446b8863a30cdb029580ca9dabd28ae56781a84a609019f"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:39:12 crc kubenswrapper[4771]: I0219 21:39:12.959295 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://00064ebea687c2f43446b8863a30cdb029580ca9dabd28ae56781a84a609019f" gracePeriod=600 Feb 19 21:39:13 crc kubenswrapper[4771]: I0219 21:39:13.148547 4771 generic.go:334] "Generic 
(PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="00064ebea687c2f43446b8863a30cdb029580ca9dabd28ae56781a84a609019f" exitCode=0 Feb 19 21:39:13 crc kubenswrapper[4771]: I0219 21:39:13.148631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"00064ebea687c2f43446b8863a30cdb029580ca9dabd28ae56781a84a609019f"} Feb 19 21:39:13 crc kubenswrapper[4771]: I0219 21:39:13.148700 4771 scope.go:117] "RemoveContainer" containerID="b73cf65a14c2c7047315612825a835da1e94d5ce583905ac6452c7b7611f53f2" Feb 19 21:39:14 crc kubenswrapper[4771]: I0219 21:39:14.163429 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"0b551f7ad6e0a61c42a9d657911c51c24f934a3d33eec6d667ab79647cf50477"} Feb 19 21:39:15 crc kubenswrapper[4771]: I0219 21:39:15.480570 4771 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.188231 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hl4r5"] Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.189545 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovn-controller" containerID="cri-o://9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b" gracePeriod=30 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.189673 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" 
containerName="northd" containerID="cri-o://e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584" gracePeriod=30 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.189620 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="nbdb" containerID="cri-o://fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4" gracePeriod=30 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.189726 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b" gracePeriod=30 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.189773 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kube-rbac-proxy-node" containerID="cri-o://43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f" gracePeriod=30 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.189747 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="sbdb" containerID="cri-o://ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc" gracePeriod=30 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.189819 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovn-acl-logging" containerID="cri-o://361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9" gracePeriod=30 Feb 19 21:41:03 crc 
kubenswrapper[4771]: I0219 21:41:03.264977 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovnkube-controller" containerID="cri-o://934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0" gracePeriod=30 Feb 19 21:41:03 crc kubenswrapper[4771]: E0219 21:41:03.565198 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4 is running failed: container process not found" containerID="fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 19 21:41:03 crc kubenswrapper[4771]: E0219 21:41:03.565242 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc is running failed: container process not found" containerID="ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 19 21:41:03 crc kubenswrapper[4771]: E0219 21:41:03.566202 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc is running failed: container process not found" containerID="ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 19 21:41:03 crc kubenswrapper[4771]: E0219 21:41:03.566325 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4 is running failed: container process not found" containerID="fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 19 21:41:03 crc kubenswrapper[4771]: E0219 21:41:03.566658 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4 is running failed: container process not found" containerID="fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Feb 19 21:41:03 crc kubenswrapper[4771]: E0219 21:41:03.566694 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="nbdb" Feb 19 21:41:03 crc kubenswrapper[4771]: E0219 21:41:03.566777 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc is running failed: container process not found" containerID="ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 19 21:41:03 crc kubenswrapper[4771]: E0219 21:41:03.566797 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="sbdb" Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.867508 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hl4r5_1ae8f212-03b1-4a09-8a89-22a30241d9a8/ovn-acl-logging/0.log" Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.868523 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hl4r5_1ae8f212-03b1-4a09-8a89-22a30241d9a8/ovn-controller/0.log" Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 
21:41:03.869117 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0" exitCode=0 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869279 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869218 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc" exitCode=0 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869349 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4" exitCode=0 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869374 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584" exitCode=0 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869394 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b" exitCode=0 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869412 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f" exitCode=0 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869429 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9" exitCode=143 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869446 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerID="9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b" exitCode=143 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869469 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869552 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869574 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869592 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869610 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.869628 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.877803 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jtrr_c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1/kube-multus/0.log" Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.877866 4771 generic.go:334] "Generic (PLEG): container finished" podID="c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1" containerID="44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1" exitCode=2 Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.877896 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jtrr" event={"ID":"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1","Type":"ContainerDied","Data":"44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1"} Feb 19 21:41:03 crc kubenswrapper[4771]: I0219 21:41:03.878870 4771 scope.go:117] "RemoveContainer" containerID="44d821e1f8c80adbe778f27c3cd6a3aa8984a0774155ea792d06d33bdf5c3bb1" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.392947 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hl4r5_1ae8f212-03b1-4a09-8a89-22a30241d9a8/ovn-acl-logging/0.log" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.395014 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hl4r5_1ae8f212-03b1-4a09-8a89-22a30241d9a8/ovn-controller/0.log" Feb 19 21:41:04 crc 
kubenswrapper[4771]: I0219 21:41:04.395723 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464476 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k9svn"] Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464773 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovn-controller" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464791 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovn-controller" Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464807 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464815 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464824 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="northd" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464831 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="northd" Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464840 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kubecfg-setup" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464847 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kubecfg-setup" Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464857 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kube-rbac-proxy-node"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464864 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kube-rbac-proxy-node"
Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464877 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovn-acl-logging"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464885 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovn-acl-logging"
Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464898 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovnkube-controller"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464909 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovnkube-controller"
Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464920 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="nbdb"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464928 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="nbdb"
Feb 19 21:41:04 crc kubenswrapper[4771]: E0219 21:41:04.464939 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="sbdb"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.464946 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="sbdb"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.465074 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kube-rbac-proxy-ovn-metrics"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.465091 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovnkube-controller"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.465102 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="northd"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.465112 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="sbdb"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.465123 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="kube-rbac-proxy-node"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.465132 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovn-controller"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.465145 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="ovn-acl-logging"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.465154 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" containerName="nbdb"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.467205 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470145 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470208 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-systemd-units\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470270 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-slash\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470300 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-systemd\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470333 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-env-overrides\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470366 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-run-netns\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-var-lib-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470483 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovn-node-metrics-cert\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470566 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470614 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-cni-bin\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470645 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-kubelet\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470678 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-ovn\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470742 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470785 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-cni-netd\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470826 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovnkube-config\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-log-socket\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.470954 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovnkube-script-lib\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.471092 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcpp4\" (UniqueName: \"kubernetes.io/projected/3344ba59-b57e-47cd-9ba0-84b9dd68590f-kube-api-access-wcpp4\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.471137 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-etc-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.471188 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-node-log\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572013 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-etc-openvswitch\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572081 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-bin\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572105 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg8wz\" (UniqueName: \"kubernetes.io/projected/1ae8f212-03b1-4a09-8a89-22a30241d9a8-kube-api-access-hg8wz\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572124 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-slash\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572139 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-ovn\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572191 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-netns\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572204 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-var-lib-openvswitch\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572223 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-config\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572238 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-ovn-kubernetes\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-netd\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572276 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-env-overrides\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572319 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-systemd\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572336 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-node-log\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572353 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-kubelet\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572373 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-openvswitch\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572400 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-systemd-units\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572423 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovn-node-metrics-cert\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572451 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-script-lib\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572469 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-log-socket\") pod \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\" (UID: \"1ae8f212-03b1-4a09-8a89-22a30241d9a8\") "
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572511 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572556 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572603 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcpp4\" (UniqueName: \"kubernetes.io/projected/3344ba59-b57e-47cd-9ba0-84b9dd68590f-kube-api-access-wcpp4\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572627 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-etc-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-node-log\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572678 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-systemd-units\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572718 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-slash\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572733 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-systemd\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572749 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-env-overrides\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572767 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-run-netns\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572783 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-var-lib-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572798 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovn-node-metrics-cert\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572820 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572836 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-cni-bin\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572851 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-kubelet\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572869 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-ovn\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572907 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-cni-netd\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572924 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovnkube-config\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572942 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-log-socket\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572957 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovnkube-script-lib\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.572995 4771 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573005 4771 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573029 4771 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573070 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573113 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573142 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-slash" (OuterVolumeSpecName: "host-slash") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573393 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573405 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573478 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573578 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573622 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573653 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-node-log" (OuterVolumeSpecName: "node-log") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573700 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovnkube-script-lib\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-cni-bin\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573764 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-ovn\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573748 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-kubelet\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573783 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-var-lib-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573876 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-run-netns\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.573962 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574049 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-run-systemd\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574122 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-cni-netd\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574084 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-etc-openvswitch\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574136 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-node-log\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574104 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574240 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-run-ovn-kubernetes\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574085 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-env-overrides\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574324 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-log-socket" (OuterVolumeSpecName: "log-socket") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-systemd-units\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574455 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-host-slash\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574588 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovnkube-config\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574385 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3344ba59-b57e-47cd-9ba0-84b9dd68590f-log-socket\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn"
Feb
19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574495 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.574496 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.581627 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3344ba59-b57e-47cd-9ba0-84b9dd68590f-ovn-node-metrics-cert\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.587177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae8f212-03b1-4a09-8a89-22a30241d9a8-kube-api-access-hg8wz" (OuterVolumeSpecName: "kube-api-access-hg8wz") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "kube-api-access-hg8wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.588640 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.591125 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1ae8f212-03b1-4a09-8a89-22a30241d9a8" (UID: "1ae8f212-03b1-4a09-8a89-22a30241d9a8"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.594765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcpp4\" (UniqueName: \"kubernetes.io/projected/3344ba59-b57e-47cd-9ba0-84b9dd68590f-kube-api-access-wcpp4\") pod \"ovnkube-node-k9svn\" (UID: \"3344ba59-b57e-47cd-9ba0-84b9dd68590f\") " pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674229 4771 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-log-socket\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674265 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg8wz\" (UniqueName: \"kubernetes.io/projected/1ae8f212-03b1-4a09-8a89-22a30241d9a8-kube-api-access-hg8wz\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674279 4771 reconciler_common.go:293] "Volume detached for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-slash\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674289 4771 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674301 4771 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674311 4771 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674321 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674332 4771 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674343 4771 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674352 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674361 4771 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674370 4771 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-node-log\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674379 4771 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674388 4771 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674397 4771 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1ae8f212-03b1-4a09-8a89-22a30241d9a8-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674408 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.674417 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1ae8f212-03b1-4a09-8a89-22a30241d9a8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 
21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.838799 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.901875 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hl4r5_1ae8f212-03b1-4a09-8a89-22a30241d9a8/ovn-acl-logging/0.log" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.903249 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hl4r5_1ae8f212-03b1-4a09-8a89-22a30241d9a8/ovn-controller/0.log" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.904264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" event={"ID":"1ae8f212-03b1-4a09-8a89-22a30241d9a8","Type":"ContainerDied","Data":"9c98926fe726dcf9f9a723b0f10f3e59f8e520a5b1c56ebe240a8b893cfe6eae"} Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.904637 4771 scope.go:117] "RemoveContainer" containerID="934ae611309d4053f800b37315595d18da6907948cde46b16c656ccd2fa009e0" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.904855 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hl4r5" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.907304 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jtrr_c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1/kube-multus/0.log" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.907440 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jtrr" event={"ID":"c28b5b75-d85c-4b96-98d5-2fc60ef5f8e1","Type":"ContainerStarted","Data":"f696724f1fad31fcb1a980a21bb17282e7cd16b4bd68f5a347e3c43eb8a4f48f"} Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.911221 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"7adaa9979e62f70bdb3b5ce5f07b27d33aa3af378dae51bb26f5410367d0c9bc"} Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.931263 4771 scope.go:117] "RemoveContainer" containerID="ef8c801b13dcd856032e8288d16ef385a37edfd49611d2202a43dd5282f4c7fc" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.955995 4771 scope.go:117] "RemoveContainer" containerID="fa523b7259a8084401520a99244bc8b04715aece77a9262701274029b6967ee4" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.974886 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hl4r5"] Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.978465 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hl4r5"] Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.978636 4771 scope.go:117] "RemoveContainer" containerID="e8150ccd0ac2165f6d5bea9c01d49532cd22550d31ac3c0247f3f684aff81584" Feb 19 21:41:04 crc kubenswrapper[4771]: I0219 21:41:04.991210 4771 scope.go:117] "RemoveContainer" containerID="1ab67b4e22ab9e419da191e1817fcf7c541c06367055250ae8a86808979c561b" Feb 19 21:41:05 crc 
kubenswrapper[4771]: I0219 21:41:05.002435 4771 scope.go:117] "RemoveContainer" containerID="43c68b156bc7ef63894f65f72dae96fb97822e7c5d5c75ee8c0abf5920551a6f" Feb 19 21:41:05 crc kubenswrapper[4771]: I0219 21:41:05.014101 4771 scope.go:117] "RemoveContainer" containerID="361db7d081020a24ea9fa221a79709b9260a2a984017f99e2d7bc9af7e6bbfb9" Feb 19 21:41:05 crc kubenswrapper[4771]: I0219 21:41:05.034749 4771 scope.go:117] "RemoveContainer" containerID="9558ac587489983408a595fd7c21988da4cb7c03a309da3d0fcb61baed077c8b" Feb 19 21:41:05 crc kubenswrapper[4771]: I0219 21:41:05.049544 4771 scope.go:117] "RemoveContainer" containerID="c1ea39a6092bbead8833a6321480d8b0a6e593a657b19f770192c34b936d23e0" Feb 19 21:41:05 crc kubenswrapper[4771]: I0219 21:41:05.919182 4771 generic.go:334] "Generic (PLEG): container finished" podID="3344ba59-b57e-47cd-9ba0-84b9dd68590f" containerID="808da42d83278c2404883ad9dc0905bfb6476cf5bced3b20062e94fe69dd1ccb" exitCode=0 Feb 19 21:41:05 crc kubenswrapper[4771]: I0219 21:41:05.919824 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerDied","Data":"808da42d83278c2404883ad9dc0905bfb6476cf5bced3b20062e94fe69dd1ccb"} Feb 19 21:41:06 crc kubenswrapper[4771]: I0219 21:41:06.445655 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae8f212-03b1-4a09-8a89-22a30241d9a8" path="/var/lib/kubelet/pods/1ae8f212-03b1-4a09-8a89-22a30241d9a8/volumes" Feb 19 21:41:06 crc kubenswrapper[4771]: I0219 21:41:06.930479 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"d9217eeba97461e420147e30c8f4b276e48b34fd1eb9a8d6d3cad9eef9f6b3fe"} Feb 19 21:41:07 crc kubenswrapper[4771]: I0219 21:41:07.948477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"99db5c15f9cdd5b1236260a75055d3e4055c2fd5f824596274f5065dc73e92b3"} Feb 19 21:41:07 crc kubenswrapper[4771]: I0219 21:41:07.948546 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"dd6889734a9b3970e8b83285d6b8e4ef85c4acaca2f3b8d6e1ab8aad06687fb2"} Feb 19 21:41:07 crc kubenswrapper[4771]: I0219 21:41:07.948571 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"d0a36cec567692798f71a3e208229cd19d687e38fa5300216c93d862b46e34b9"} Feb 19 21:41:08 crc kubenswrapper[4771]: I0219 21:41:08.959460 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"92ff87edcfc7c9c4af86c6e205e005a7a48c091199751a76430ba2a03db28412"} Feb 19 21:41:08 crc kubenswrapper[4771]: I0219 21:41:08.959518 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"132638d3b2facdda8ea65b8c59a82219a683855f5acf6e3cb184f928434d9fea"} Feb 19 21:41:11 crc kubenswrapper[4771]: I0219 21:41:11.989769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"dcb52d79994e33cf8388bd455dddc8dcabed9e0b7847b4926bbfa6b1af470f3a"} Feb 19 21:41:12 crc kubenswrapper[4771]: I0219 21:41:12.824175 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-w4qz4"] Feb 19 21:41:12 crc kubenswrapper[4771]: 
I0219 21:41:12.825489 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:12 crc kubenswrapper[4771]: I0219 21:41:12.828541 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 19 21:41:12 crc kubenswrapper[4771]: I0219 21:41:12.828566 4771 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-whtfj" Feb 19 21:41:12 crc kubenswrapper[4771]: I0219 21:41:12.829270 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 19 21:41:12 crc kubenswrapper[4771]: I0219 21:41:12.828559 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 19 21:41:12 crc kubenswrapper[4771]: I0219 21:41:12.961159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnqpk\" (UniqueName: \"kubernetes.io/projected/2a344000-2cb1-4aa3-932e-d26a540f079f-kube-api-access-vnqpk\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:12 crc kubenswrapper[4771]: I0219 21:41:12.961250 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2a344000-2cb1-4aa3-932e-d26a540f079f-node-mnt\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:12 crc kubenswrapper[4771]: I0219 21:41:12.961311 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2a344000-2cb1-4aa3-932e-d26a540f079f-crc-storage\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" 
Feb 19 21:41:13 crc kubenswrapper[4771]: I0219 21:41:13.062688 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2a344000-2cb1-4aa3-932e-d26a540f079f-crc-storage\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: I0219 21:41:13.062852 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnqpk\" (UniqueName: \"kubernetes.io/projected/2a344000-2cb1-4aa3-932e-d26a540f079f-kube-api-access-vnqpk\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: I0219 21:41:13.062894 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2a344000-2cb1-4aa3-932e-d26a540f079f-node-mnt\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: I0219 21:41:13.063247 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2a344000-2cb1-4aa3-932e-d26a540f079f-node-mnt\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: I0219 21:41:13.063473 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2a344000-2cb1-4aa3-932e-d26a540f079f-crc-storage\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: I0219 21:41:13.087345 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vnqpk\" (UniqueName: \"kubernetes.io/projected/2a344000-2cb1-4aa3-932e-d26a540f079f-kube-api-access-vnqpk\") pod \"crc-storage-crc-w4qz4\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: I0219 21:41:13.147964 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: E0219 21:41:13.192857 4771 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-w4qz4_crc-storage_2a344000-2cb1-4aa3-932e-d26a540f079f_0(d8de8a233d4af87bba4f8c7cfb815d4663621ae2cc8cb846b6595452e8acbabf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 21:41:13 crc kubenswrapper[4771]: E0219 21:41:13.193497 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-w4qz4_crc-storage_2a344000-2cb1-4aa3-932e-d26a540f079f_0(d8de8a233d4af87bba4f8c7cfb815d4663621ae2cc8cb846b6595452e8acbabf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: E0219 21:41:13.193544 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-w4qz4_crc-storage_2a344000-2cb1-4aa3-932e-d26a540f079f_0(d8de8a233d4af87bba4f8c7cfb815d4663621ae2cc8cb846b6595452e8acbabf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:13 crc kubenswrapper[4771]: E0219 21:41:13.193618 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-w4qz4_crc-storage(2a344000-2cb1-4aa3-932e-d26a540f079f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-w4qz4_crc-storage(2a344000-2cb1-4aa3-932e-d26a540f079f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-w4qz4_crc-storage_2a344000-2cb1-4aa3-932e-d26a540f079f_0(d8de8a233d4af87bba4f8c7cfb815d4663621ae2cc8cb846b6595452e8acbabf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-w4qz4" podUID="2a344000-2cb1-4aa3-932e-d26a540f079f" Feb 19 21:41:14 crc kubenswrapper[4771]: I0219 21:41:14.754929 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-w4qz4"] Feb 19 21:41:14 crc kubenswrapper[4771]: I0219 21:41:14.755470 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:14 crc kubenswrapper[4771]: I0219 21:41:14.756115 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:14 crc kubenswrapper[4771]: E0219 21:41:14.792770 4771 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-w4qz4_crc-storage_2a344000-2cb1-4aa3-932e-d26a540f079f_0(957f70fe83dc9ee0a34c0f32279e4626cc6ee46d777d02db127a52fb8652e7cc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 19 21:41:14 crc kubenswrapper[4771]: E0219 21:41:14.792858 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-w4qz4_crc-storage_2a344000-2cb1-4aa3-932e-d26a540f079f_0(957f70fe83dc9ee0a34c0f32279e4626cc6ee46d777d02db127a52fb8652e7cc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:14 crc kubenswrapper[4771]: E0219 21:41:14.792893 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-w4qz4_crc-storage_2a344000-2cb1-4aa3-932e-d26a540f079f_0(957f70fe83dc9ee0a34c0f32279e4626cc6ee46d777d02db127a52fb8652e7cc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:14 crc kubenswrapper[4771]: E0219 21:41:14.792967 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-w4qz4_crc-storage(2a344000-2cb1-4aa3-932e-d26a540f079f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-w4qz4_crc-storage(2a344000-2cb1-4aa3-932e-d26a540f079f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-w4qz4_crc-storage_2a344000-2cb1-4aa3-932e-d26a540f079f_0(957f70fe83dc9ee0a34c0f32279e4626cc6ee46d777d02db127a52fb8652e7cc): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-w4qz4" podUID="2a344000-2cb1-4aa3-932e-d26a540f079f" Feb 19 21:41:15 crc kubenswrapper[4771]: I0219 21:41:15.011291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" event={"ID":"3344ba59-b57e-47cd-9ba0-84b9dd68590f","Type":"ContainerStarted","Data":"c6300cd58482ba47d11a1214beb20e5281c19b993a685d3ce7541424376e2a66"} Feb 19 21:41:15 crc kubenswrapper[4771]: I0219 21:41:15.011744 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:15 crc kubenswrapper[4771]: I0219 21:41:15.069044 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:15 crc kubenswrapper[4771]: I0219 21:41:15.079168 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" podStartSLOduration=11.079137486 podStartE2EDuration="11.079137486s" podCreationTimestamp="2026-02-19 21:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:41:15.071216037 +0000 UTC m=+775.342658527" watchObservedRunningTime="2026-02-19 21:41:15.079137486 +0000 UTC m=+775.350579996" Feb 19 21:41:16 crc kubenswrapper[4771]: I0219 21:41:16.021088 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:16 crc kubenswrapper[4771]: I0219 21:41:16.021717 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:16 crc kubenswrapper[4771]: I0219 21:41:16.098531 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:18 crc kubenswrapper[4771]: I0219 
21:41:18.791690 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jxrgs"] Feb 19 21:41:18 crc kubenswrapper[4771]: I0219 21:41:18.793135 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:18 crc kubenswrapper[4771]: I0219 21:41:18.815044 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxrgs"] Feb 19 21:41:18 crc kubenswrapper[4771]: I0219 21:41:18.958324 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbb6\" (UniqueName: \"kubernetes.io/projected/17d432f5-f56b-413d-8aa4-bc275369dc84-kube-api-access-plbb6\") pod \"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:18 crc kubenswrapper[4771]: I0219 21:41:18.958816 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-catalog-content\") pod \"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:18 crc kubenswrapper[4771]: I0219 21:41:18.958957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-utilities\") pod \"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:19 crc kubenswrapper[4771]: I0219 21:41:19.059934 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-utilities\") pod 
\"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:19 crc kubenswrapper[4771]: I0219 21:41:19.059999 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plbb6\" (UniqueName: \"kubernetes.io/projected/17d432f5-f56b-413d-8aa4-bc275369dc84-kube-api-access-plbb6\") pod \"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:19 crc kubenswrapper[4771]: I0219 21:41:19.060104 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-catalog-content\") pod \"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:19 crc kubenswrapper[4771]: I0219 21:41:19.060563 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-catalog-content\") pod \"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:19 crc kubenswrapper[4771]: I0219 21:41:19.060717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-utilities\") pod \"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:19 crc kubenswrapper[4771]: I0219 21:41:19.091814 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbb6\" (UniqueName: \"kubernetes.io/projected/17d432f5-f56b-413d-8aa4-bc275369dc84-kube-api-access-plbb6\") pod 
\"community-operators-jxrgs\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:19 crc kubenswrapper[4771]: I0219 21:41:19.144005 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:19 crc kubenswrapper[4771]: I0219 21:41:19.403332 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxrgs"] Feb 19 21:41:20 crc kubenswrapper[4771]: I0219 21:41:20.062630 4771 generic.go:334] "Generic (PLEG): container finished" podID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerID="d87716c88dcdeede9a051aa807f017241c42c0a31d9d2b9ddf7ea7aa73cfcb88" exitCode=0 Feb 19 21:41:20 crc kubenswrapper[4771]: I0219 21:41:20.062787 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrgs" event={"ID":"17d432f5-f56b-413d-8aa4-bc275369dc84","Type":"ContainerDied","Data":"d87716c88dcdeede9a051aa807f017241c42c0a31d9d2b9ddf7ea7aa73cfcb88"} Feb 19 21:41:20 crc kubenswrapper[4771]: I0219 21:41:20.062995 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrgs" event={"ID":"17d432f5-f56b-413d-8aa4-bc275369dc84","Type":"ContainerStarted","Data":"75f3e1c264247658cd46fc2e8d344a16f94ad51c673b4d1247ba774bade8e034"} Feb 19 21:41:20 crc kubenswrapper[4771]: I0219 21:41:20.065296 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:41:21 crc kubenswrapper[4771]: I0219 21:41:21.072821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrgs" event={"ID":"17d432f5-f56b-413d-8aa4-bc275369dc84","Type":"ContainerStarted","Data":"1f390104b4fa62335a005e1704848782d82c08a1a2f509ffb66120edcff62ea2"} Feb 19 21:41:22 crc kubenswrapper[4771]: I0219 21:41:22.084389 4771 generic.go:334] "Generic (PLEG): container 
finished" podID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerID="1f390104b4fa62335a005e1704848782d82c08a1a2f509ffb66120edcff62ea2" exitCode=0 Feb 19 21:41:22 crc kubenswrapper[4771]: I0219 21:41:22.084453 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrgs" event={"ID":"17d432f5-f56b-413d-8aa4-bc275369dc84","Type":"ContainerDied","Data":"1f390104b4fa62335a005e1704848782d82c08a1a2f509ffb66120edcff62ea2"} Feb 19 21:41:23 crc kubenswrapper[4771]: I0219 21:41:23.094062 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrgs" event={"ID":"17d432f5-f56b-413d-8aa4-bc275369dc84","Type":"ContainerStarted","Data":"913605c1d42bc18b9bfed84c78c66578e67cb11e598b6007727b243d0b4d3675"} Feb 19 21:41:28 crc kubenswrapper[4771]: I0219 21:41:28.437301 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:28 crc kubenswrapper[4771]: I0219 21:41:28.438360 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:28 crc kubenswrapper[4771]: I0219 21:41:28.678345 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jxrgs" podStartSLOduration=8.218449385 podStartE2EDuration="10.678327201s" podCreationTimestamp="2026-02-19 21:41:18 +0000 UTC" firstStartedPulling="2026-02-19 21:41:20.064808125 +0000 UTC m=+780.336250625" lastFinishedPulling="2026-02-19 21:41:22.524685931 +0000 UTC m=+782.796128441" observedRunningTime="2026-02-19 21:41:23.121756655 +0000 UTC m=+783.393199145" watchObservedRunningTime="2026-02-19 21:41:28.678327201 +0000 UTC m=+788.949769671" Feb 19 21:41:28 crc kubenswrapper[4771]: I0219 21:41:28.678753 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-w4qz4"] Feb 19 21:41:28 crc kubenswrapper[4771]: W0219 21:41:28.688263 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a344000_2cb1_4aa3_932e_d26a540f079f.slice/crio-f9020e8553bc1a64c4f1b2587528f274522bdfb873a4161eedfb7d91e7fda6bc WatchSource:0}: Error finding container f9020e8553bc1a64c4f1b2587528f274522bdfb873a4161eedfb7d91e7fda6bc: Status 404 returned error can't find the container with id f9020e8553bc1a64c4f1b2587528f274522bdfb873a4161eedfb7d91e7fda6bc Feb 19 21:41:29 crc kubenswrapper[4771]: I0219 21:41:29.132409 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-w4qz4" event={"ID":"2a344000-2cb1-4aa3-932e-d26a540f079f","Type":"ContainerStarted","Data":"f9020e8553bc1a64c4f1b2587528f274522bdfb873a4161eedfb7d91e7fda6bc"} Feb 19 21:41:29 crc kubenswrapper[4771]: I0219 21:41:29.144800 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:29 crc kubenswrapper[4771]: I0219 21:41:29.144990 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:29 crc kubenswrapper[4771]: I0219 21:41:29.195768 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:30 crc kubenswrapper[4771]: I0219 21:41:30.187145 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:30 crc kubenswrapper[4771]: I0219 21:41:30.233966 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jxrgs"] Feb 19 21:41:31 crc kubenswrapper[4771]: I0219 21:41:31.149726 4771 generic.go:334] "Generic (PLEG): container finished" podID="2a344000-2cb1-4aa3-932e-d26a540f079f" containerID="9c1e890673a69091be9378ffde69f13c4d3aa011167a46b5ac75277d30b54801" exitCode=0 Feb 19 21:41:31 crc kubenswrapper[4771]: I0219 21:41:31.149817 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-w4qz4" event={"ID":"2a344000-2cb1-4aa3-932e-d26a540f079f","Type":"ContainerDied","Data":"9c1e890673a69091be9378ffde69f13c4d3aa011167a46b5ac75277d30b54801"} Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.166322 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jxrgs" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerName="registry-server" containerID="cri-o://913605c1d42bc18b9bfed84c78c66578e67cb11e598b6007727b243d0b4d3675" gracePeriod=2 Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.443856 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.548234 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2a344000-2cb1-4aa3-932e-d26a540f079f-crc-storage\") pod \"2a344000-2cb1-4aa3-932e-d26a540f079f\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.548372 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnqpk\" (UniqueName: \"kubernetes.io/projected/2a344000-2cb1-4aa3-932e-d26a540f079f-kube-api-access-vnqpk\") pod \"2a344000-2cb1-4aa3-932e-d26a540f079f\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.548492 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2a344000-2cb1-4aa3-932e-d26a540f079f-node-mnt\") pod \"2a344000-2cb1-4aa3-932e-d26a540f079f\" (UID: \"2a344000-2cb1-4aa3-932e-d26a540f079f\") " Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.548732 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a344000-2cb1-4aa3-932e-d26a540f079f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "2a344000-2cb1-4aa3-932e-d26a540f079f" (UID: "2a344000-2cb1-4aa3-932e-d26a540f079f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.556326 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a344000-2cb1-4aa3-932e-d26a540f079f-kube-api-access-vnqpk" (OuterVolumeSpecName: "kube-api-access-vnqpk") pod "2a344000-2cb1-4aa3-932e-d26a540f079f" (UID: "2a344000-2cb1-4aa3-932e-d26a540f079f"). InnerVolumeSpecName "kube-api-access-vnqpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.573394 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a344000-2cb1-4aa3-932e-d26a540f079f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "2a344000-2cb1-4aa3-932e-d26a540f079f" (UID: "2a344000-2cb1-4aa3-932e-d26a540f079f"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.650424 4771 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2a344000-2cb1-4aa3-932e-d26a540f079f-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.650459 4771 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2a344000-2cb1-4aa3-932e-d26a540f079f-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:32 crc kubenswrapper[4771]: I0219 21:41:32.650472 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnqpk\" (UniqueName: \"kubernetes.io/projected/2a344000-2cb1-4aa3-932e-d26a540f079f-kube-api-access-vnqpk\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.176115 4771 generic.go:334] "Generic (PLEG): container finished" podID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerID="913605c1d42bc18b9bfed84c78c66578e67cb11e598b6007727b243d0b4d3675" exitCode=0 Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.176218 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrgs" event={"ID":"17d432f5-f56b-413d-8aa4-bc275369dc84","Type":"ContainerDied","Data":"913605c1d42bc18b9bfed84c78c66578e67cb11e598b6007727b243d0b4d3675"} Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.177841 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="crc-storage/crc-storage-crc-w4qz4" event={"ID":"2a344000-2cb1-4aa3-932e-d26a540f079f","Type":"ContainerDied","Data":"f9020e8553bc1a64c4f1b2587528f274522bdfb873a4161eedfb7d91e7fda6bc"} Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.177880 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9020e8553bc1a64c4f1b2587528f274522bdfb873a4161eedfb7d91e7fda6bc" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.177915 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w4qz4" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.682732 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.868149 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-catalog-content\") pod \"17d432f5-f56b-413d-8aa4-bc275369dc84\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.868253 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-utilities\") pod \"17d432f5-f56b-413d-8aa4-bc275369dc84\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.868301 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plbb6\" (UniqueName: \"kubernetes.io/projected/17d432f5-f56b-413d-8aa4-bc275369dc84-kube-api-access-plbb6\") pod \"17d432f5-f56b-413d-8aa4-bc275369dc84\" (UID: \"17d432f5-f56b-413d-8aa4-bc275369dc84\") " Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.870005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-utilities" (OuterVolumeSpecName: "utilities") pod "17d432f5-f56b-413d-8aa4-bc275369dc84" (UID: "17d432f5-f56b-413d-8aa4-bc275369dc84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.876159 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17d432f5-f56b-413d-8aa4-bc275369dc84-kube-api-access-plbb6" (OuterVolumeSpecName: "kube-api-access-plbb6") pod "17d432f5-f56b-413d-8aa4-bc275369dc84" (UID: "17d432f5-f56b-413d-8aa4-bc275369dc84"). InnerVolumeSpecName "kube-api-access-plbb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.960934 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17d432f5-f56b-413d-8aa4-bc275369dc84" (UID: "17d432f5-f56b-413d-8aa4-bc275369dc84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.969624 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plbb6\" (UniqueName: \"kubernetes.io/projected/17d432f5-f56b-413d-8aa4-bc275369dc84-kube-api-access-plbb6\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.969752 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:33 crc kubenswrapper[4771]: I0219 21:41:33.969854 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17d432f5-f56b-413d-8aa4-bc275369dc84-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.189439 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrgs" event={"ID":"17d432f5-f56b-413d-8aa4-bc275369dc84","Type":"ContainerDied","Data":"75f3e1c264247658cd46fc2e8d344a16f94ad51c673b4d1247ba774bade8e034"} Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.189504 4771 scope.go:117] "RemoveContainer" containerID="913605c1d42bc18b9bfed84c78c66578e67cb11e598b6007727b243d0b4d3675" Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.189537 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jxrgs" Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.217324 4771 scope.go:117] "RemoveContainer" containerID="1f390104b4fa62335a005e1704848782d82c08a1a2f509ffb66120edcff62ea2" Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.251127 4771 scope.go:117] "RemoveContainer" containerID="d87716c88dcdeede9a051aa807f017241c42c0a31d9d2b9ddf7ea7aa73cfcb88" Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.255266 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jxrgs"] Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.258780 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jxrgs"] Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.449595 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" path="/var/lib/kubelet/pods/17d432f5-f56b-413d-8aa4-bc275369dc84/volumes" Feb 19 21:41:34 crc kubenswrapper[4771]: I0219 21:41:34.876077 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k9svn" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.815413 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td"] Feb 19 21:41:40 crc kubenswrapper[4771]: E0219 21:41:40.815990 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a344000-2cb1-4aa3-932e-d26a540f079f" containerName="storage" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.816054 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a344000-2cb1-4aa3-932e-d26a540f079f" containerName="storage" Feb 19 21:41:40 crc kubenswrapper[4771]: E0219 21:41:40.816079 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" 
containerName="extract-content" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.816092 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerName="extract-content" Feb 19 21:41:40 crc kubenswrapper[4771]: E0219 21:41:40.816109 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerName="registry-server" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.816121 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerName="registry-server" Feb 19 21:41:40 crc kubenswrapper[4771]: E0219 21:41:40.816140 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerName="extract-utilities" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.816178 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerName="extract-utilities" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.816371 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="17d432f5-f56b-413d-8aa4-bc275369dc84" containerName="registry-server" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.816397 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a344000-2cb1-4aa3-932e-d26a540f079f" containerName="storage" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.818236 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.820874 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.829991 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td"] Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.974882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.975070 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:40 crc kubenswrapper[4771]: I0219 21:41:40.975146 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzpk9\" (UniqueName: \"kubernetes.io/projected/113dbb1c-c7c0-45ce-99d8-86156cbce45c-kube-api-access-tzpk9\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:41 crc kubenswrapper[4771]: 
I0219 21:41:41.076745 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:41 crc kubenswrapper[4771]: I0219 21:41:41.076888 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:41 crc kubenswrapper[4771]: I0219 21:41:41.076963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzpk9\" (UniqueName: \"kubernetes.io/projected/113dbb1c-c7c0-45ce-99d8-86156cbce45c-kube-api-access-tzpk9\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:41 crc kubenswrapper[4771]: I0219 21:41:41.077805 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:41 crc kubenswrapper[4771]: I0219 21:41:41.077933 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:41 crc kubenswrapper[4771]: I0219 21:41:41.102745 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzpk9\" (UniqueName: \"kubernetes.io/projected/113dbb1c-c7c0-45ce-99d8-86156cbce45c-kube-api-access-tzpk9\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:41 crc kubenswrapper[4771]: I0219 21:41:41.135108 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:41 crc kubenswrapper[4771]: I0219 21:41:41.394556 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td"] Feb 19 21:41:42 crc kubenswrapper[4771]: I0219 21:41:42.246836 4771 generic.go:334] "Generic (PLEG): container finished" podID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerID="ef6bc14b4a81b3e283b7c1b63e14dedbbd91cc6d29ec0218eb295e7813caf14f" exitCode=0 Feb 19 21:41:42 crc kubenswrapper[4771]: I0219 21:41:42.246946 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" event={"ID":"113dbb1c-c7c0-45ce-99d8-86156cbce45c","Type":"ContainerDied","Data":"ef6bc14b4a81b3e283b7c1b63e14dedbbd91cc6d29ec0218eb295e7813caf14f"} Feb 19 21:41:42 crc kubenswrapper[4771]: I0219 21:41:42.247333 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" event={"ID":"113dbb1c-c7c0-45ce-99d8-86156cbce45c","Type":"ContainerStarted","Data":"f99777a3bdb679c69a37c9ece76f8c478a3ab7008d7ec489cb1b5a6a6ae284f3"} Feb 19 21:41:42 crc kubenswrapper[4771]: I0219 21:41:42.957584 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:41:42 crc kubenswrapper[4771]: I0219 21:41:42.957644 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.572330 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n685c"] Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.574514 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.581861 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n685c"] Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.722303 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-utilities\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.722483 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhs58\" (UniqueName: \"kubernetes.io/projected/86c9413f-fb4e-4d10-807e-e3da51cc96d0-kube-api-access-nhs58\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.722637 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-catalog-content\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.823372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-catalog-content\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.823491 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-utilities\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.823525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhs58\" (UniqueName: \"kubernetes.io/projected/86c9413f-fb4e-4d10-807e-e3da51cc96d0-kube-api-access-nhs58\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.824329 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-utilities\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.824745 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-catalog-content\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.856611 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhs58\" (UniqueName: \"kubernetes.io/projected/86c9413f-fb4e-4d10-807e-e3da51cc96d0-kube-api-access-nhs58\") pod \"redhat-operators-n685c\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:43 crc kubenswrapper[4771]: I0219 21:41:43.917802 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:44 crc kubenswrapper[4771]: I0219 21:41:44.101938 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n685c"] Feb 19 21:41:44 crc kubenswrapper[4771]: W0219 21:41:44.118989 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86c9413f_fb4e_4d10_807e_e3da51cc96d0.slice/crio-ae64d75211063a43e8abc2d03da2735ec748d44e3bd069fd86bd096e34e08612 WatchSource:0}: Error finding container ae64d75211063a43e8abc2d03da2735ec748d44e3bd069fd86bd096e34e08612: Status 404 returned error can't find the container with id ae64d75211063a43e8abc2d03da2735ec748d44e3bd069fd86bd096e34e08612 Feb 19 21:41:44 crc kubenswrapper[4771]: I0219 21:41:44.259051 4771 generic.go:334] "Generic (PLEG): container finished" podID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerID="f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb" exitCode=0 Feb 19 21:41:44 crc kubenswrapper[4771]: I0219 21:41:44.259108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n685c" event={"ID":"86c9413f-fb4e-4d10-807e-e3da51cc96d0","Type":"ContainerDied","Data":"f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb"} Feb 19 21:41:44 crc kubenswrapper[4771]: I0219 21:41:44.259165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n685c" event={"ID":"86c9413f-fb4e-4d10-807e-e3da51cc96d0","Type":"ContainerStarted","Data":"ae64d75211063a43e8abc2d03da2735ec748d44e3bd069fd86bd096e34e08612"} Feb 19 21:41:44 crc kubenswrapper[4771]: I0219 21:41:44.260810 4771 generic.go:334] "Generic (PLEG): container finished" podID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerID="bb45ffa9ef14af540f0eed65d5cbf2f4b06cfecd2fcb79f9785b03b343bdf319" exitCode=0 Feb 19 21:41:44 crc kubenswrapper[4771]: I0219 21:41:44.260844 
4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" event={"ID":"113dbb1c-c7c0-45ce-99d8-86156cbce45c","Type":"ContainerDied","Data":"bb45ffa9ef14af540f0eed65d5cbf2f4b06cfecd2fcb79f9785b03b343bdf319"} Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.270970 4771 generic.go:334] "Generic (PLEG): container finished" podID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerID="1015ba4dfed0112842f6118d739612019ed2a635d52fce2ea48ed852757dd6ee" exitCode=0 Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.271123 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" event={"ID":"113dbb1c-c7c0-45ce-99d8-86156cbce45c","Type":"ContainerDied","Data":"1015ba4dfed0112842f6118d739612019ed2a635d52fce2ea48ed852757dd6ee"} Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.561272 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6r5v5"] Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.562902 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.577485 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r5v5"] Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.647920 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-catalog-content\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.647993 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-utilities\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.648041 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p64xv\" (UniqueName: \"kubernetes.io/projected/88fc6650-97c5-4168-b6fd-c8fc10819dfa-kube-api-access-p64xv\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.748708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p64xv\" (UniqueName: \"kubernetes.io/projected/88fc6650-97c5-4168-b6fd-c8fc10819dfa-kube-api-access-p64xv\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.748776 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-catalog-content\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.748847 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-utilities\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.749474 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-catalog-content\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.749494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-utilities\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.773381 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p64xv\" (UniqueName: \"kubernetes.io/projected/88fc6650-97c5-4168-b6fd-c8fc10819dfa-kube-api-access-p64xv\") pod \"redhat-marketplace-6r5v5\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:45 crc kubenswrapper[4771]: I0219 21:41:45.923014 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.242899 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r5v5"] Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.280528 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n685c" event={"ID":"86c9413f-fb4e-4d10-807e-e3da51cc96d0","Type":"ContainerStarted","Data":"55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072"} Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.283005 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r5v5" event={"ID":"88fc6650-97c5-4168-b6fd-c8fc10819dfa","Type":"ContainerStarted","Data":"56473f7022de7b786fa5c6716f3a386128a85d1711a3f131ef93ab4e02409800"} Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.517647 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.663993 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzpk9\" (UniqueName: \"kubernetes.io/projected/113dbb1c-c7c0-45ce-99d8-86156cbce45c-kube-api-access-tzpk9\") pod \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.664117 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-bundle\") pod \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.664146 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-util\") pod \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\" (UID: \"113dbb1c-c7c0-45ce-99d8-86156cbce45c\") " Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.665629 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-bundle" (OuterVolumeSpecName: "bundle") pod "113dbb1c-c7c0-45ce-99d8-86156cbce45c" (UID: "113dbb1c-c7c0-45ce-99d8-86156cbce45c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.672367 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113dbb1c-c7c0-45ce-99d8-86156cbce45c-kube-api-access-tzpk9" (OuterVolumeSpecName: "kube-api-access-tzpk9") pod "113dbb1c-c7c0-45ce-99d8-86156cbce45c" (UID: "113dbb1c-c7c0-45ce-99d8-86156cbce45c"). InnerVolumeSpecName "kube-api-access-tzpk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.677250 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-util" (OuterVolumeSpecName: "util") pod "113dbb1c-c7c0-45ce-99d8-86156cbce45c" (UID: "113dbb1c-c7c0-45ce-99d8-86156cbce45c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.766486 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.766532 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/113dbb1c-c7c0-45ce-99d8-86156cbce45c-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:46 crc kubenswrapper[4771]: I0219 21:41:46.766550 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzpk9\" (UniqueName: \"kubernetes.io/projected/113dbb1c-c7c0-45ce-99d8-86156cbce45c-kube-api-access-tzpk9\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:47 crc kubenswrapper[4771]: I0219 21:41:47.291871 4771 generic.go:334] "Generic (PLEG): container finished" podID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerID="6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a" exitCode=0 Feb 19 21:41:47 crc kubenswrapper[4771]: I0219 21:41:47.292038 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r5v5" event={"ID":"88fc6650-97c5-4168-b6fd-c8fc10819dfa","Type":"ContainerDied","Data":"6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a"} Feb 19 21:41:47 crc kubenswrapper[4771]: I0219 21:41:47.295173 4771 generic.go:334] "Generic (PLEG): container finished" podID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" 
containerID="55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072" exitCode=0 Feb 19 21:41:47 crc kubenswrapper[4771]: I0219 21:41:47.295232 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n685c" event={"ID":"86c9413f-fb4e-4d10-807e-e3da51cc96d0","Type":"ContainerDied","Data":"55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072"} Feb 19 21:41:47 crc kubenswrapper[4771]: I0219 21:41:47.303619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" event={"ID":"113dbb1c-c7c0-45ce-99d8-86156cbce45c","Type":"ContainerDied","Data":"f99777a3bdb679c69a37c9ece76f8c478a3ab7008d7ec489cb1b5a6a6ae284f3"} Feb 19 21:41:47 crc kubenswrapper[4771]: I0219 21:41:47.303708 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td" Feb 19 21:41:47 crc kubenswrapper[4771]: I0219 21:41:47.303765 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f99777a3bdb679c69a37c9ece76f8c478a3ab7008d7ec489cb1b5a6a6ae284f3" Feb 19 21:41:48 crc kubenswrapper[4771]: I0219 21:41:48.314199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r5v5" event={"ID":"88fc6650-97c5-4168-b6fd-c8fc10819dfa","Type":"ContainerStarted","Data":"1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281"} Feb 19 21:41:48 crc kubenswrapper[4771]: I0219 21:41:48.321401 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n685c" event={"ID":"86c9413f-fb4e-4d10-807e-e3da51cc96d0","Type":"ContainerStarted","Data":"4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12"} Feb 19 21:41:48 crc kubenswrapper[4771]: I0219 21:41:48.375221 4771 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-n685c" podStartSLOduration=1.803419377 podStartE2EDuration="5.375190074s" podCreationTimestamp="2026-02-19 21:41:43 +0000 UTC" firstStartedPulling="2026-02-19 21:41:44.260340872 +0000 UTC m=+804.531783352" lastFinishedPulling="2026-02-19 21:41:47.832111539 +0000 UTC m=+808.103554049" observedRunningTime="2026-02-19 21:41:48.367784379 +0000 UTC m=+808.639226939" watchObservedRunningTime="2026-02-19 21:41:48.375190074 +0000 UTC m=+808.646632594" Feb 19 21:41:49 crc kubenswrapper[4771]: I0219 21:41:49.331981 4771 generic.go:334] "Generic (PLEG): container finished" podID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerID="1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281" exitCode=0 Feb 19 21:41:49 crc kubenswrapper[4771]: I0219 21:41:49.332060 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r5v5" event={"ID":"88fc6650-97c5-4168-b6fd-c8fc10819dfa","Type":"ContainerDied","Data":"1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281"} Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.269619 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4hbnh"] Feb 19 21:41:50 crc kubenswrapper[4771]: E0219 21:41:50.270166 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerName="util" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.270182 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerName="util" Feb 19 21:41:50 crc kubenswrapper[4771]: E0219 21:41:50.270197 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerName="pull" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.270204 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" 
containerName="pull" Feb 19 21:41:50 crc kubenswrapper[4771]: E0219 21:41:50.270221 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerName="extract" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.270227 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerName="extract" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.270319 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="113dbb1c-c7c0-45ce-99d8-86156cbce45c" containerName="extract" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.276845 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-4hbnh" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.286287 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.286593 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.286859 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-pz74g" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.289107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbmpv\" (UniqueName: \"kubernetes.io/projected/0e3f0120-0714-42a2-b7ff-d4c25acd93c4-kube-api-access-wbmpv\") pod \"nmstate-operator-694c9596b7-4hbnh\" (UID: \"0e3f0120-0714-42a2-b7ff-d4c25acd93c4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4hbnh" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.297553 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4hbnh"] Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 
21:41:50.344636 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r5v5" event={"ID":"88fc6650-97c5-4168-b6fd-c8fc10819dfa","Type":"ContainerStarted","Data":"a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319"} Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.367747 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6r5v5" podStartSLOduration=2.942685674 podStartE2EDuration="5.367722319s" podCreationTimestamp="2026-02-19 21:41:45 +0000 UTC" firstStartedPulling="2026-02-19 21:41:47.293232885 +0000 UTC m=+807.564675355" lastFinishedPulling="2026-02-19 21:41:49.71826953 +0000 UTC m=+809.989712000" observedRunningTime="2026-02-19 21:41:50.363487997 +0000 UTC m=+810.634930507" watchObservedRunningTime="2026-02-19 21:41:50.367722319 +0000 UTC m=+810.639164829" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.390216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbmpv\" (UniqueName: \"kubernetes.io/projected/0e3f0120-0714-42a2-b7ff-d4c25acd93c4-kube-api-access-wbmpv\") pod \"nmstate-operator-694c9596b7-4hbnh\" (UID: \"0e3f0120-0714-42a2-b7ff-d4c25acd93c4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4hbnh" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.412431 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbmpv\" (UniqueName: \"kubernetes.io/projected/0e3f0120-0714-42a2-b7ff-d4c25acd93c4-kube-api-access-wbmpv\") pod \"nmstate-operator-694c9596b7-4hbnh\" (UID: \"0e3f0120-0714-42a2-b7ff-d4c25acd93c4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4hbnh" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.636552 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-4hbnh" Feb 19 21:41:50 crc kubenswrapper[4771]: I0219 21:41:50.883305 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4hbnh"] Feb 19 21:41:51 crc kubenswrapper[4771]: I0219 21:41:51.360179 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-4hbnh" event={"ID":"0e3f0120-0714-42a2-b7ff-d4c25acd93c4","Type":"ContainerStarted","Data":"47e320f4dd35353de3c3dad3521f224cc7ac5ba3caa17609312be132d4da97ef"} Feb 19 21:41:53 crc kubenswrapper[4771]: I0219 21:41:53.918774 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:53 crc kubenswrapper[4771]: I0219 21:41:53.919094 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:41:54 crc kubenswrapper[4771]: I0219 21:41:54.380060 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-4hbnh" event={"ID":"0e3f0120-0714-42a2-b7ff-d4c25acd93c4","Type":"ContainerStarted","Data":"4d907f06f002323955d767edf4bda900ddae311b2f6ff15add30c8b083d6c675"} Feb 19 21:41:54 crc kubenswrapper[4771]: I0219 21:41:54.406859 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-4hbnh" podStartSLOduration=1.477976377 podStartE2EDuration="4.406832038s" podCreationTimestamp="2026-02-19 21:41:50 +0000 UTC" firstStartedPulling="2026-02-19 21:41:50.89627255 +0000 UTC m=+811.167715030" lastFinishedPulling="2026-02-19 21:41:53.825128211 +0000 UTC m=+814.096570691" observedRunningTime="2026-02-19 21:41:54.40122996 +0000 UTC m=+814.672672470" watchObservedRunningTime="2026-02-19 21:41:54.406832038 +0000 UTC m=+814.678274548" Feb 19 21:41:54 crc kubenswrapper[4771]: I0219 21:41:54.973184 4771 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n685c" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="registry-server" probeResult="failure" output=< Feb 19 21:41:54 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 21:41:54 crc kubenswrapper[4771]: > Feb 19 21:41:55 crc kubenswrapper[4771]: I0219 21:41:55.923302 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:55 crc kubenswrapper[4771]: I0219 21:41:55.923388 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:55 crc kubenswrapper[4771]: I0219 21:41:55.996566 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.357138 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rj8v2"] Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.358751 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.380298 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rj8v2"] Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.446341 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.469661 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-catalog-content\") pod \"certified-operators-rj8v2\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.469779 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-utilities\") pod \"certified-operators-rj8v2\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.469979 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltztr\" (UniqueName: \"kubernetes.io/projected/91882b82-9663-470b-a4cc-1db296c09408-kube-api-access-ltztr\") pod \"certified-operators-rj8v2\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.571757 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltztr\" (UniqueName: \"kubernetes.io/projected/91882b82-9663-470b-a4cc-1db296c09408-kube-api-access-ltztr\") pod \"certified-operators-rj8v2\" 
(UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.571912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-catalog-content\") pod \"certified-operators-rj8v2\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.572175 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-utilities\") pod \"certified-operators-rj8v2\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.572945 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-utilities\") pod \"certified-operators-rj8v2\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.573259 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-catalog-content\") pod \"certified-operators-rj8v2\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.611422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltztr\" (UniqueName: \"kubernetes.io/projected/91882b82-9663-470b-a4cc-1db296c09408-kube-api-access-ltztr\") pod \"certified-operators-rj8v2\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " 
pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.691387 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:41:56 crc kubenswrapper[4771]: I0219 21:41:56.942592 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rj8v2"] Feb 19 21:41:56 crc kubenswrapper[4771]: W0219 21:41:56.947761 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91882b82_9663_470b_a4cc_1db296c09408.slice/crio-154d28cd681abffe73da779818f114f2b9fbedc195b028c81e22f6c5e056944e WatchSource:0}: Error finding container 154d28cd681abffe73da779818f114f2b9fbedc195b028c81e22f6c5e056944e: Status 404 returned error can't find the container with id 154d28cd681abffe73da779818f114f2b9fbedc195b028c81e22f6c5e056944e Feb 19 21:41:57 crc kubenswrapper[4771]: I0219 21:41:57.396263 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rj8v2" event={"ID":"91882b82-9663-470b-a4cc-1db296c09408","Type":"ContainerStarted","Data":"154d28cd681abffe73da779818f114f2b9fbedc195b028c81e22f6c5e056944e"} Feb 19 21:41:58 crc kubenswrapper[4771]: I0219 21:41:58.351166 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r5v5"] Feb 19 21:41:58 crc kubenswrapper[4771]: I0219 21:41:58.427788 4771 generic.go:334] "Generic (PLEG): container finished" podID="91882b82-9663-470b-a4cc-1db296c09408" containerID="4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174" exitCode=0 Feb 19 21:41:58 crc kubenswrapper[4771]: I0219 21:41:58.428091 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rj8v2" 
event={"ID":"91882b82-9663-470b-a4cc-1db296c09408","Type":"ContainerDied","Data":"4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174"} Feb 19 21:41:58 crc kubenswrapper[4771]: I0219 21:41:58.428500 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6r5v5" podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerName="registry-server" containerID="cri-o://a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319" gracePeriod=2 Feb 19 21:41:58 crc kubenswrapper[4771]: I0219 21:41:58.914381 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.109627 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-utilities\") pod \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.109753 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p64xv\" (UniqueName: \"kubernetes.io/projected/88fc6650-97c5-4168-b6fd-c8fc10819dfa-kube-api-access-p64xv\") pod \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.109811 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-catalog-content\") pod \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\" (UID: \"88fc6650-97c5-4168-b6fd-c8fc10819dfa\") " Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.111614 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-utilities" 
(OuterVolumeSpecName: "utilities") pod "88fc6650-97c5-4168-b6fd-c8fc10819dfa" (UID: "88fc6650-97c5-4168-b6fd-c8fc10819dfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.117649 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88fc6650-97c5-4168-b6fd-c8fc10819dfa-kube-api-access-p64xv" (OuterVolumeSpecName: "kube-api-access-p64xv") pod "88fc6650-97c5-4168-b6fd-c8fc10819dfa" (UID: "88fc6650-97c5-4168-b6fd-c8fc10819dfa"). InnerVolumeSpecName "kube-api-access-p64xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.163983 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88fc6650-97c5-4168-b6fd-c8fc10819dfa" (UID: "88fc6650-97c5-4168-b6fd-c8fc10819dfa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.216284 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.216351 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p64xv\" (UniqueName: \"kubernetes.io/projected/88fc6650-97c5-4168-b6fd-c8fc10819dfa-kube-api-access-p64xv\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.216381 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88fc6650-97c5-4168-b6fd-c8fc10819dfa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.441121 4771 generic.go:334] "Generic (PLEG): container finished" podID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerID="a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319" exitCode=0 Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.441196 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6r5v5" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.441219 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r5v5" event={"ID":"88fc6650-97c5-4168-b6fd-c8fc10819dfa","Type":"ContainerDied","Data":"a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319"} Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.441903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r5v5" event={"ID":"88fc6650-97c5-4168-b6fd-c8fc10819dfa","Type":"ContainerDied","Data":"56473f7022de7b786fa5c6716f3a386128a85d1711a3f131ef93ab4e02409800"} Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.441940 4771 scope.go:117] "RemoveContainer" containerID="a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.445148 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rj8v2" event={"ID":"91882b82-9663-470b-a4cc-1db296c09408","Type":"ContainerStarted","Data":"794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d"} Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.467375 4771 scope.go:117] "RemoveContainer" containerID="1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.496213 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r5v5"] Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.500424 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r5v5"] Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.521201 4771 scope.go:117] "RemoveContainer" containerID="6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.558129 4771 
scope.go:117] "RemoveContainer" containerID="a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319" Feb 19 21:41:59 crc kubenswrapper[4771]: E0219 21:41:59.558450 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319\": container with ID starting with a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319 not found: ID does not exist" containerID="a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.558489 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319"} err="failed to get container status \"a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319\": rpc error: code = NotFound desc = could not find container \"a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319\": container with ID starting with a53c5357130d6e3a1ea1ec7904202ade4aca9adba316f43bee9d9b70a8e35319 not found: ID does not exist" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.558526 4771 scope.go:117] "RemoveContainer" containerID="1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281" Feb 19 21:41:59 crc kubenswrapper[4771]: E0219 21:41:59.558885 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281\": container with ID starting with 1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281 not found: ID does not exist" containerID="1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.558912 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281"} err="failed to get container status \"1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281\": rpc error: code = NotFound desc = could not find container \"1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281\": container with ID starting with 1a1690e44f533166a5ea74bef0f9b507ef6ea4929ad3223343e0f189ec003281 not found: ID does not exist" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.558929 4771 scope.go:117] "RemoveContainer" containerID="6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a" Feb 19 21:41:59 crc kubenswrapper[4771]: E0219 21:41:59.559260 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a\": container with ID starting with 6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a not found: ID does not exist" containerID="6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a" Feb 19 21:41:59 crc kubenswrapper[4771]: I0219 21:41:59.559290 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a"} err="failed to get container status \"6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a\": rpc error: code = NotFound desc = could not find container \"6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a\": container with ID starting with 6cf2977572c010a5ce51b2122ebfde99c46b5f00b9461276a570ccb8a5b03b4a not found: ID does not exist" Feb 19 21:42:00 crc kubenswrapper[4771]: I0219 21:42:00.449252 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" path="/var/lib/kubelet/pods/88fc6650-97c5-4168-b6fd-c8fc10819dfa/volumes" Feb 19 21:42:00 crc kubenswrapper[4771]: I0219 
21:42:00.456094 4771 generic.go:334] "Generic (PLEG): container finished" podID="91882b82-9663-470b-a4cc-1db296c09408" containerID="794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d" exitCode=0 Feb 19 21:42:00 crc kubenswrapper[4771]: I0219 21:42:00.456217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rj8v2" event={"ID":"91882b82-9663-470b-a4cc-1db296c09408","Type":"ContainerDied","Data":"794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d"} Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.462400 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rj8v2" event={"ID":"91882b82-9663-470b-a4cc-1db296c09408","Type":"ContainerStarted","Data":"e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4"} Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.484135 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rj8v2" podStartSLOduration=2.960701413 podStartE2EDuration="5.48412108s" podCreationTimestamp="2026-02-19 21:41:56 +0000 UTC" firstStartedPulling="2026-02-19 21:41:58.433527628 +0000 UTC m=+818.704970128" lastFinishedPulling="2026-02-19 21:42:00.956947315 +0000 UTC m=+821.228389795" observedRunningTime="2026-02-19 21:42:01.482181868 +0000 UTC m=+821.753624348" watchObservedRunningTime="2026-02-19 21:42:01.48412108 +0000 UTC m=+821.755563550" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.626451 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx"] Feb 19 21:42:01 crc kubenswrapper[4771]: E0219 21:42:01.626723 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerName="extract-utilities" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.626750 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerName="extract-utilities" Feb 19 21:42:01 crc kubenswrapper[4771]: E0219 21:42:01.626760 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerName="registry-server" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.626769 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerName="registry-server" Feb 19 21:42:01 crc kubenswrapper[4771]: E0219 21:42:01.626797 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerName="extract-content" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.626805 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerName="extract-content" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.626944 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="88fc6650-97c5-4168-b6fd-c8fc10819dfa" containerName="registry-server" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.627374 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.630125 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-gkwsf" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.630355 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.641114 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx"] Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.659221 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-hcxck"] Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.659851 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.680221 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb"] Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.680932 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.736966 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb"] Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.745953 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-dbus-socket\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.745991 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-ovs-socket\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.746012 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/676b32b6-a0a8-4f1a-9c8f-242842320451-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-zgzhx\" (UID: \"676b32b6-a0a8-4f1a-9c8f-242842320451\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.746069 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbmgv\" (UniqueName: \"kubernetes.io/projected/7fe6cc7b-ec96-4fcd-8685-0583ce9d6538-kube-api-access-gbmgv\") pod \"nmstate-metrics-58c85c668d-f5vsb\" (UID: \"7fe6cc7b-ec96-4fcd-8685-0583ce9d6538\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.746091 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-nmstate-lock\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.746105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrnnt\" (UniqueName: \"kubernetes.io/projected/e61d563d-a115-4346-b98a-ffa4d0ec4393-kube-api-access-vrnnt\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.746133 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66w47\" (UniqueName: \"kubernetes.io/projected/676b32b6-a0a8-4f1a-9c8f-242842320451-kube-api-access-66w47\") pod \"nmstate-webhook-866bcb46dc-zgzhx\" (UID: \"676b32b6-a0a8-4f1a-9c8f-242842320451\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.818116 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc"] Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.818892 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.822918 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.822955 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.823285 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rmkwv" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.839546 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc"] Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.846869 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-nmstate-lock\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.846901 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrnnt\" (UniqueName: \"kubernetes.io/projected/e61d563d-a115-4346-b98a-ffa4d0ec4393-kube-api-access-vrnnt\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.846927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6mcb\" (UniqueName: \"kubernetes.io/projected/63c374a8-819b-4a41-8836-1a23409d0a12-kube-api-access-f6mcb\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.846954 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66w47\" (UniqueName: \"kubernetes.io/projected/676b32b6-a0a8-4f1a-9c8f-242842320451-kube-api-access-66w47\") pod \"nmstate-webhook-866bcb46dc-zgzhx\" (UID: \"676b32b6-a0a8-4f1a-9c8f-242842320451\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.846977 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-dbus-socket\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.846991 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-ovs-socket\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.847010 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/676b32b6-a0a8-4f1a-9c8f-242842320451-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-zgzhx\" (UID: \"676b32b6-a0a8-4f1a-9c8f-242842320451\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.847030 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-nmstate-lock\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " 
pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.847044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/63c374a8-819b-4a41-8836-1a23409d0a12-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.847189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/63c374a8-819b-4a41-8836-1a23409d0a12-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.847260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbmgv\" (UniqueName: \"kubernetes.io/projected/7fe6cc7b-ec96-4fcd-8685-0583ce9d6538-kube-api-access-gbmgv\") pod \"nmstate-metrics-58c85c668d-f5vsb\" (UID: \"7fe6cc7b-ec96-4fcd-8685-0583ce9d6538\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.847309 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-ovs-socket\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.847487 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e61d563d-a115-4346-b98a-ffa4d0ec4393-dbus-socket\") pod \"nmstate-handler-hcxck\" (UID: 
\"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: E0219 21:42:01.847503 4771 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 19 21:42:01 crc kubenswrapper[4771]: E0219 21:42:01.847615 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/676b32b6-a0a8-4f1a-9c8f-242842320451-tls-key-pair podName:676b32b6-a0a8-4f1a-9c8f-242842320451 nodeName:}" failed. No retries permitted until 2026-02-19 21:42:02.347586533 +0000 UTC m=+822.619029013 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/676b32b6-a0a8-4f1a-9c8f-242842320451-tls-key-pair") pod "nmstate-webhook-866bcb46dc-zgzhx" (UID: "676b32b6-a0a8-4f1a-9c8f-242842320451") : secret "openshift-nmstate-webhook" not found Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.867726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrnnt\" (UniqueName: \"kubernetes.io/projected/e61d563d-a115-4346-b98a-ffa4d0ec4393-kube-api-access-vrnnt\") pod \"nmstate-handler-hcxck\" (UID: \"e61d563d-a115-4346-b98a-ffa4d0ec4393\") " pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.867821 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66w47\" (UniqueName: \"kubernetes.io/projected/676b32b6-a0a8-4f1a-9c8f-242842320451-kube-api-access-66w47\") pod \"nmstate-webhook-866bcb46dc-zgzhx\" (UID: \"676b32b6-a0a8-4f1a-9c8f-242842320451\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.867933 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbmgv\" (UniqueName: \"kubernetes.io/projected/7fe6cc7b-ec96-4fcd-8685-0583ce9d6538-kube-api-access-gbmgv\") 
pod \"nmstate-metrics-58c85c668d-f5vsb\" (UID: \"7fe6cc7b-ec96-4fcd-8685-0583ce9d6538\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.947974 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/63c374a8-819b-4a41-8836-1a23409d0a12-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.948077 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6mcb\" (UniqueName: \"kubernetes.io/projected/63c374a8-819b-4a41-8836-1a23409d0a12-kube-api-access-f6mcb\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.948147 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/63c374a8-819b-4a41-8836-1a23409d0a12-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: E0219 21:42:01.948200 4771 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 19 21:42:01 crc kubenswrapper[4771]: E0219 21:42:01.948325 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63c374a8-819b-4a41-8836-1a23409d0a12-plugin-serving-cert podName:63c374a8-819b-4a41-8836-1a23409d0a12 nodeName:}" failed. No retries permitted until 2026-02-19 21:42:02.448299817 +0000 UTC m=+822.719742297 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/63c374a8-819b-4a41-8836-1a23409d0a12-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-6hqlc" (UID: "63c374a8-819b-4a41-8836-1a23409d0a12") : secret "plugin-serving-cert" not found Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.949421 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/63c374a8-819b-4a41-8836-1a23409d0a12-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.976277 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.976871 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6mcb\" (UniqueName: \"kubernetes.io/projected/63c374a8-819b-4a41-8836-1a23409d0a12-kube-api-access-f6mcb\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:01 crc kubenswrapper[4771]: I0219 21:42:01.992993 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" Feb 19 21:42:02 crc kubenswrapper[4771]: W0219 21:42:02.003274 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61d563d_a115_4346_b98a_ffa4d0ec4393.slice/crio-8d30278e4a1269e6aa68d798d8f7aced1bf694aa623ec757745fe1551608950a WatchSource:0}: Error finding container 8d30278e4a1269e6aa68d798d8f7aced1bf694aa623ec757745fe1551608950a: Status 404 returned error can't find the container with id 8d30278e4a1269e6aa68d798d8f7aced1bf694aa623ec757745fe1551608950a Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.007242 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-557dc9b6b6-z259k"] Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.007875 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.020836 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557dc9b6b6-z259k"] Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.048793 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-oauth-serving-cert\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.049389 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-trusted-ca-bundle\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc 
kubenswrapper[4771]: I0219 21:42:02.049501 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-serving-cert\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.049617 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv64b\" (UniqueName: \"kubernetes.io/projected/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-kube-api-access-lv64b\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.049751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-service-ca\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.049856 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-config\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.049954 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-oauth-config\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " 
pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.152643 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-service-ca\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.152687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-config\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.152706 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-oauth-config\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.152747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-oauth-serving-cert\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.152782 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-trusted-ca-bundle\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " 
pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.152801 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-serving-cert\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.152816 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv64b\" (UniqueName: \"kubernetes.io/projected/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-kube-api-access-lv64b\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.153758 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-service-ca\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.154251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-oauth-serving-cert\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.154808 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-config\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 
21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.154962 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-trusted-ca-bundle\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.157808 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-serving-cert\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.167830 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-console-oauth-config\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.176449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv64b\" (UniqueName: \"kubernetes.io/projected/8d7a3bca-afdc-4491-a074-bd7abbbc46b7-kube-api-access-lv64b\") pod \"console-557dc9b6b6-z259k\" (UID: \"8d7a3bca-afdc-4491-a074-bd7abbbc46b7\") " pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.255699 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb"] Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.358396 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/676b32b6-a0a8-4f1a-9c8f-242842320451-tls-key-pair\") pod 
\"nmstate-webhook-866bcb46dc-zgzhx\" (UID: \"676b32b6-a0a8-4f1a-9c8f-242842320451\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.358958 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.361984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/676b32b6-a0a8-4f1a-9c8f-242842320451-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-zgzhx\" (UID: \"676b32b6-a0a8-4f1a-9c8f-242842320451\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.459676 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/63c374a8-819b-4a41-8836-1a23409d0a12-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.464124 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/63c374a8-819b-4a41-8836-1a23409d0a12-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-6hqlc\" (UID: \"63c374a8-819b-4a41-8836-1a23409d0a12\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.471612 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" event={"ID":"7fe6cc7b-ec96-4fcd-8685-0583ce9d6538","Type":"ContainerStarted","Data":"a3a4a19830eac345be7352d896d5c88f090cad0ceb133f122f4cfc86b472d76a"} Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.473209 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-nmstate/nmstate-handler-hcxck" event={"ID":"e61d563d-a115-4346-b98a-ffa4d0ec4393","Type":"ContainerStarted","Data":"8d30278e4a1269e6aa68d798d8f7aced1bf694aa623ec757745fe1551608950a"} Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.547961 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:02 crc kubenswrapper[4771]: W0219 21:42:02.576451 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d7a3bca_afdc_4491_a074_bd7abbbc46b7.slice/crio-152d46e6647a9a8fb5cb20aaac56986b9bae29529302d1488fffeb70a96a4951 WatchSource:0}: Error finding container 152d46e6647a9a8fb5cb20aaac56986b9bae29529302d1488fffeb70a96a4951: Status 404 returned error can't find the container with id 152d46e6647a9a8fb5cb20aaac56986b9bae29529302d1488fffeb70a96a4951 Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.576642 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557dc9b6b6-z259k"] Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.735856 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.786319 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx"] Feb 19 21:42:02 crc kubenswrapper[4771]: W0219 21:42:02.793499 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676b32b6_a0a8_4f1a_9c8f_242842320451.slice/crio-34ebe98efdbe28172508eb84ce5fc43faddef60d4eb504233cf970d35ceeaa4d WatchSource:0}: Error finding container 34ebe98efdbe28172508eb84ce5fc43faddef60d4eb504233cf970d35ceeaa4d: Status 404 returned error can't find the container with id 34ebe98efdbe28172508eb84ce5fc43faddef60d4eb504233cf970d35ceeaa4d Feb 19 21:42:02 crc kubenswrapper[4771]: I0219 21:42:02.934816 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc"] Feb 19 21:42:03 crc kubenswrapper[4771]: I0219 21:42:03.480122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557dc9b6b6-z259k" event={"ID":"8d7a3bca-afdc-4491-a074-bd7abbbc46b7","Type":"ContainerStarted","Data":"99ffd16aab5618177b0633507c65c7d1bdcaf86af12e31c9be4e22ae38138bf9"} Feb 19 21:42:03 crc kubenswrapper[4771]: I0219 21:42:03.480160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557dc9b6b6-z259k" event={"ID":"8d7a3bca-afdc-4491-a074-bd7abbbc46b7","Type":"ContainerStarted","Data":"152d46e6647a9a8fb5cb20aaac56986b9bae29529302d1488fffeb70a96a4951"} Feb 19 21:42:03 crc kubenswrapper[4771]: I0219 21:42:03.481674 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" event={"ID":"676b32b6-a0a8-4f1a-9c8f-242842320451","Type":"ContainerStarted","Data":"34ebe98efdbe28172508eb84ce5fc43faddef60d4eb504233cf970d35ceeaa4d"} Feb 19 21:42:03 crc kubenswrapper[4771]: I0219 
21:42:03.483182 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" event={"ID":"63c374a8-819b-4a41-8836-1a23409d0a12","Type":"ContainerStarted","Data":"b6e3fb56d79f0f0f3524931facdba1aff0b5fc08f16d9bf79c8f8b239c4ebf43"} Feb 19 21:42:03 crc kubenswrapper[4771]: I0219 21:42:03.495955 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-557dc9b6b6-z259k" podStartSLOduration=2.495939784 podStartE2EDuration="2.495939784s" podCreationTimestamp="2026-02-19 21:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:42:03.493686415 +0000 UTC m=+823.765128895" watchObservedRunningTime="2026-02-19 21:42:03.495939784 +0000 UTC m=+823.767382254" Feb 19 21:42:03 crc kubenswrapper[4771]: I0219 21:42:03.981848 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:42:04 crc kubenswrapper[4771]: I0219 21:42:04.033194 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:42:05 crc kubenswrapper[4771]: I0219 21:42:05.496894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hcxck" event={"ID":"e61d563d-a115-4346-b98a-ffa4d0ec4393","Type":"ContainerStarted","Data":"9fbcad64eb353d54331f67eafdb1adcd88ad0c14236ecd61b65605ed57c0407c"} Feb 19 21:42:05 crc kubenswrapper[4771]: I0219 21:42:05.499903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" event={"ID":"676b32b6-a0a8-4f1a-9c8f-242842320451","Type":"ContainerStarted","Data":"62cdf4a0bedf363a977bec832e6ea74427e76c43b5f9931df382827f46ca4c0b"} Feb 19 21:42:05 crc kubenswrapper[4771]: I0219 21:42:05.501333 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:05 crc kubenswrapper[4771]: I0219 21:42:05.502366 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" event={"ID":"7fe6cc7b-ec96-4fcd-8685-0583ce9d6538","Type":"ContainerStarted","Data":"d599e4870e3756d825504e8f1bec4d121d36713a210778a24304091b05784168"} Feb 19 21:42:05 crc kubenswrapper[4771]: I0219 21:42:05.515391 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-hcxck" podStartSLOduration=1.640768494 podStartE2EDuration="4.51536848s" podCreationTimestamp="2026-02-19 21:42:01 +0000 UTC" firstStartedPulling="2026-02-19 21:42:02.005780788 +0000 UTC m=+822.277223278" lastFinishedPulling="2026-02-19 21:42:04.880380794 +0000 UTC m=+825.151823264" observedRunningTime="2026-02-19 21:42:05.515288178 +0000 UTC m=+825.786730668" watchObservedRunningTime="2026-02-19 21:42:05.51536848 +0000 UTC m=+825.786810980" Feb 19 21:42:05 crc kubenswrapper[4771]: I0219 21:42:05.948321 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" podStartSLOduration=2.7981273079999998 podStartE2EDuration="4.948300522s" podCreationTimestamp="2026-02-19 21:42:01 +0000 UTC" firstStartedPulling="2026-02-19 21:42:02.796114573 +0000 UTC m=+823.067557043" lastFinishedPulling="2026-02-19 21:42:04.946287747 +0000 UTC m=+825.217730257" observedRunningTime="2026-02-19 21:42:05.53046828 +0000 UTC m=+825.801910790" watchObservedRunningTime="2026-02-19 21:42:05.948300522 +0000 UTC m=+826.219743002" Feb 19 21:42:05 crc kubenswrapper[4771]: I0219 21:42:05.951300 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n685c"] Feb 19 21:42:05 crc kubenswrapper[4771]: I0219 21:42:05.951549 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-n685c" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="registry-server" containerID="cri-o://4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12" gracePeriod=2 Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.377313 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.417337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-utilities\") pod \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.417650 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-catalog-content\") pod \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.425101 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhs58\" (UniqueName: \"kubernetes.io/projected/86c9413f-fb4e-4d10-807e-e3da51cc96d0-kube-api-access-nhs58\") pod \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\" (UID: \"86c9413f-fb4e-4d10-807e-e3da51cc96d0\") " Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.418560 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-utilities" (OuterVolumeSpecName: "utilities") pod "86c9413f-fb4e-4d10-807e-e3da51cc96d0" (UID: "86c9413f-fb4e-4d10-807e-e3da51cc96d0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.425629 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.429878 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c9413f-fb4e-4d10-807e-e3da51cc96d0-kube-api-access-nhs58" (OuterVolumeSpecName: "kube-api-access-nhs58") pod "86c9413f-fb4e-4d10-807e-e3da51cc96d0" (UID: "86c9413f-fb4e-4d10-807e-e3da51cc96d0"). InnerVolumeSpecName "kube-api-access-nhs58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.527262 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhs58\" (UniqueName: \"kubernetes.io/projected/86c9413f-fb4e-4d10-807e-e3da51cc96d0-kube-api-access-nhs58\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.528981 4771 generic.go:334] "Generic (PLEG): container finished" podID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerID="4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12" exitCode=0 Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.529047 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n685c" event={"ID":"86c9413f-fb4e-4d10-807e-e3da51cc96d0","Type":"ContainerDied","Data":"4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12"} Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.529107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n685c" event={"ID":"86c9413f-fb4e-4d10-807e-e3da51cc96d0","Type":"ContainerDied","Data":"ae64d75211063a43e8abc2d03da2735ec748d44e3bd069fd86bd096e34e08612"} Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 
21:42:06.529130 4771 scope.go:117] "RemoveContainer" containerID="4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.529135 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n685c" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.531619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" event={"ID":"63c374a8-819b-4a41-8836-1a23409d0a12","Type":"ContainerStarted","Data":"503f9e315a38c00a3391b9d74ddf9409c1d1737fe8a420ab7c1e9cca9b0664a1"} Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.531841 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.545222 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6hqlc" podStartSLOduration=2.559231558 podStartE2EDuration="5.54520129s" podCreationTimestamp="2026-02-19 21:42:01 +0000 UTC" firstStartedPulling="2026-02-19 21:42:02.944716874 +0000 UTC m=+823.216159344" lastFinishedPulling="2026-02-19 21:42:05.930686605 +0000 UTC m=+826.202129076" observedRunningTime="2026-02-19 21:42:06.543650329 +0000 UTC m=+826.815092809" watchObservedRunningTime="2026-02-19 21:42:06.54520129 +0000 UTC m=+826.816643770" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.547927 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86c9413f-fb4e-4d10-807e-e3da51cc96d0" (UID: "86c9413f-fb4e-4d10-807e-e3da51cc96d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.559794 4771 scope.go:117] "RemoveContainer" containerID="55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.586649 4771 scope.go:117] "RemoveContainer" containerID="f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.603292 4771 scope.go:117] "RemoveContainer" containerID="4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12" Feb 19 21:42:06 crc kubenswrapper[4771]: E0219 21:42:06.603825 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12\": container with ID starting with 4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12 not found: ID does not exist" containerID="4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.603871 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12"} err="failed to get container status \"4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12\": rpc error: code = NotFound desc = could not find container \"4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12\": container with ID starting with 4f88a44733b510f7119a118ce0a9bce8adb930232b01a160a3a7d73ebabe2c12 not found: ID does not exist" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.603898 4771 scope.go:117] "RemoveContainer" containerID="55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072" Feb 19 21:42:06 crc kubenswrapper[4771]: E0219 21:42:06.604573 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072\": container with ID starting with 55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072 not found: ID does not exist" containerID="55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.604597 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072"} err="failed to get container status \"55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072\": rpc error: code = NotFound desc = could not find container \"55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072\": container with ID starting with 55b3c92a80538ef0ade649a27a90038fddcc3c91abeae7954defc3edf048e072 not found: ID does not exist" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.604613 4771 scope.go:117] "RemoveContainer" containerID="f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb" Feb 19 21:42:06 crc kubenswrapper[4771]: E0219 21:42:06.604895 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb\": container with ID starting with f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb not found: ID does not exist" containerID="f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.604975 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb"} err="failed to get container status \"f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb\": rpc error: code = NotFound desc = could not find container \"f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb\": 
container with ID starting with f20d2f42821a11205d650c36470500e0b5fb1c7a4bf07803623c1f2bcf6c47eb not found: ID does not exist" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.628676 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c9413f-fb4e-4d10-807e-e3da51cc96d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.691800 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.691857 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.743741 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.869776 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n685c"] Feb 19 21:42:06 crc kubenswrapper[4771]: I0219 21:42:06.874660 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n685c"] Feb 19 21:42:07 crc kubenswrapper[4771]: I0219 21:42:07.543791 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" event={"ID":"7fe6cc7b-ec96-4fcd-8685-0583ce9d6538","Type":"ContainerStarted","Data":"22317a3477ac4fb1452e0ebb867bf25c52f769073c6d378427d800def34587e3"} Feb 19 21:42:07 crc kubenswrapper[4771]: I0219 21:42:07.567913 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f5vsb" podStartSLOduration=1.526755538 podStartE2EDuration="6.567890222s" podCreationTimestamp="2026-02-19 21:42:01 +0000 UTC" firstStartedPulling="2026-02-19 21:42:02.265081306 
+0000 UTC m=+822.536523776" lastFinishedPulling="2026-02-19 21:42:07.30621599 +0000 UTC m=+827.577658460" observedRunningTime="2026-02-19 21:42:07.561471332 +0000 UTC m=+827.832913872" watchObservedRunningTime="2026-02-19 21:42:07.567890222 +0000 UTC m=+827.839332732" Feb 19 21:42:07 crc kubenswrapper[4771]: I0219 21:42:07.639312 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:42:08 crc kubenswrapper[4771]: I0219 21:42:08.443886 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" path="/var/lib/kubelet/pods/86c9413f-fb4e-4d10-807e-e3da51cc96d0/volumes" Feb 19 21:42:09 crc kubenswrapper[4771]: I0219 21:42:09.153854 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rj8v2"] Feb 19 21:42:09 crc kubenswrapper[4771]: I0219 21:42:09.561608 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rj8v2" podUID="91882b82-9663-470b-a4cc-1db296c09408" containerName="registry-server" containerID="cri-o://e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4" gracePeriod=2 Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.028284 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.074126 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-utilities\") pod \"91882b82-9663-470b-a4cc-1db296c09408\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.074333 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltztr\" (UniqueName: \"kubernetes.io/projected/91882b82-9663-470b-a4cc-1db296c09408-kube-api-access-ltztr\") pod \"91882b82-9663-470b-a4cc-1db296c09408\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.074394 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-catalog-content\") pod \"91882b82-9663-470b-a4cc-1db296c09408\" (UID: \"91882b82-9663-470b-a4cc-1db296c09408\") " Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.077126 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-utilities" (OuterVolumeSpecName: "utilities") pod "91882b82-9663-470b-a4cc-1db296c09408" (UID: "91882b82-9663-470b-a4cc-1db296c09408"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.082247 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91882b82-9663-470b-a4cc-1db296c09408-kube-api-access-ltztr" (OuterVolumeSpecName: "kube-api-access-ltztr") pod "91882b82-9663-470b-a4cc-1db296c09408" (UID: "91882b82-9663-470b-a4cc-1db296c09408"). InnerVolumeSpecName "kube-api-access-ltztr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.175644 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltztr\" (UniqueName: \"kubernetes.io/projected/91882b82-9663-470b-a4cc-1db296c09408-kube-api-access-ltztr\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.175678 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.573172 4771 generic.go:334] "Generic (PLEG): container finished" podID="91882b82-9663-470b-a4cc-1db296c09408" containerID="e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4" exitCode=0 Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.573241 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rj8v2" event={"ID":"91882b82-9663-470b-a4cc-1db296c09408","Type":"ContainerDied","Data":"e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4"} Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.573297 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rj8v2" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.573330 4771 scope.go:117] "RemoveContainer" containerID="e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.573312 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rj8v2" event={"ID":"91882b82-9663-470b-a4cc-1db296c09408","Type":"ContainerDied","Data":"154d28cd681abffe73da779818f114f2b9fbedc195b028c81e22f6c5e056944e"} Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.604935 4771 scope.go:117] "RemoveContainer" containerID="794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.636479 4771 scope.go:117] "RemoveContainer" containerID="4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.658518 4771 scope.go:117] "RemoveContainer" containerID="e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4" Feb 19 21:42:10 crc kubenswrapper[4771]: E0219 21:42:10.659094 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4\": container with ID starting with e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4 not found: ID does not exist" containerID="e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.659161 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4"} err="failed to get container status \"e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4\": rpc error: code = NotFound desc = could not find container 
\"e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4\": container with ID starting with e0613a82b9ae89115b39b4997aaad5c9ad0957bbee0ff444907e26cc78d6f9d4 not found: ID does not exist" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.659201 4771 scope.go:117] "RemoveContainer" containerID="794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d" Feb 19 21:42:10 crc kubenswrapper[4771]: E0219 21:42:10.659626 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d\": container with ID starting with 794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d not found: ID does not exist" containerID="794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.659684 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d"} err="failed to get container status \"794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d\": rpc error: code = NotFound desc = could not find container \"794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d\": container with ID starting with 794275361ab3810d2d76838c97447dc6f8d522f196198a864df8fb3cd0f2b82d not found: ID does not exist" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.659725 4771 scope.go:117] "RemoveContainer" containerID="4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174" Feb 19 21:42:10 crc kubenswrapper[4771]: E0219 21:42:10.660410 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174\": container with ID starting with 4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174 not found: ID does not exist" 
containerID="4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.660481 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174"} err="failed to get container status \"4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174\": rpc error: code = NotFound desc = could not find container \"4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174\": container with ID starting with 4aee3af207c1fc5b6681d0333e6cb5fbcfd65bb6220c5eabfe425601d79d8174 not found: ID does not exist" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.938642 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91882b82-9663-470b-a4cc-1db296c09408" (UID: "91882b82-9663-470b-a4cc-1db296c09408"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:10 crc kubenswrapper[4771]: I0219 21:42:10.986681 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91882b82-9663-470b-a4cc-1db296c09408-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:11 crc kubenswrapper[4771]: I0219 21:42:11.240957 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rj8v2"] Feb 19 21:42:11 crc kubenswrapper[4771]: I0219 21:42:11.248296 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rj8v2"] Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.011330 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-hcxck" Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.359983 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.360561 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.368205 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.456779 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91882b82-9663-470b-a4cc-1db296c09408" path="/var/lib/kubelet/pods/91882b82-9663-470b-a4cc-1db296c09408/volumes" Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.595355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-557dc9b6b6-z259k" Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.680431 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-h7w47"] Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.956840 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:42:12 crc kubenswrapper[4771]: I0219 21:42:12.957607 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:42:22 crc kubenswrapper[4771]: I0219 21:42:22.556469 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zgzhx" Feb 19 21:42:37 crc kubenswrapper[4771]: I0219 21:42:37.725514 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-h7w47" podUID="67d4896e-f092-419b-971d-aac9dbaed6d0" containerName="console" containerID="cri-o://41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da" gracePeriod=15 Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.154875 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-h7w47_67d4896e-f092-419b-971d-aac9dbaed6d0/console/0.log" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.155339 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.201591 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-console-config\") pod \"67d4896e-f092-419b-971d-aac9dbaed6d0\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.201654 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-oauth-config\") pod \"67d4896e-f092-419b-971d-aac9dbaed6d0\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.201716 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-serving-cert\") pod \"67d4896e-f092-419b-971d-aac9dbaed6d0\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.201756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-oauth-serving-cert\") pod \"67d4896e-f092-419b-971d-aac9dbaed6d0\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.201848 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-trusted-ca-bundle\") pod \"67d4896e-f092-419b-971d-aac9dbaed6d0\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.201894 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-service-ca\") pod \"67d4896e-f092-419b-971d-aac9dbaed6d0\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.201956 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmnfk\" (UniqueName: \"kubernetes.io/projected/67d4896e-f092-419b-971d-aac9dbaed6d0-kube-api-access-zmnfk\") pod \"67d4896e-f092-419b-971d-aac9dbaed6d0\" (UID: \"67d4896e-f092-419b-971d-aac9dbaed6d0\") " Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.202435 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "67d4896e-f092-419b-971d-aac9dbaed6d0" (UID: "67d4896e-f092-419b-971d-aac9dbaed6d0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.202442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "67d4896e-f092-419b-971d-aac9dbaed6d0" (UID: "67d4896e-f092-419b-971d-aac9dbaed6d0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.202581 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "67d4896e-f092-419b-971d-aac9dbaed6d0" (UID: "67d4896e-f092-419b-971d-aac9dbaed6d0"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.202925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-console-config" (OuterVolumeSpecName: "console-config") pod "67d4896e-f092-419b-971d-aac9dbaed6d0" (UID: "67d4896e-f092-419b-971d-aac9dbaed6d0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.209566 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d4896e-f092-419b-971d-aac9dbaed6d0-kube-api-access-zmnfk" (OuterVolumeSpecName: "kube-api-access-zmnfk") pod "67d4896e-f092-419b-971d-aac9dbaed6d0" (UID: "67d4896e-f092-419b-971d-aac9dbaed6d0"). InnerVolumeSpecName "kube-api-access-zmnfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.209579 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "67d4896e-f092-419b-971d-aac9dbaed6d0" (UID: "67d4896e-f092-419b-971d-aac9dbaed6d0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.209715 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "67d4896e-f092-419b-971d-aac9dbaed6d0" (UID: "67d4896e-f092-419b-971d-aac9dbaed6d0"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.310308 4771 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.310338 4771 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.310347 4771 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4896e-f092-419b-971d-aac9dbaed6d0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.310357 4771 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.310365 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.310374 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67d4896e-f092-419b-971d-aac9dbaed6d0-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.310382 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmnfk\" (UniqueName: \"kubernetes.io/projected/67d4896e-f092-419b-971d-aac9dbaed6d0-kube-api-access-zmnfk\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:38 crc 
kubenswrapper[4771]: I0219 21:42:38.770289 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-h7w47_67d4896e-f092-419b-971d-aac9dbaed6d0/console/0.log" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.770348 4771 generic.go:334] "Generic (PLEG): container finished" podID="67d4896e-f092-419b-971d-aac9dbaed6d0" containerID="41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da" exitCode=2 Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.770380 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h7w47" event={"ID":"67d4896e-f092-419b-971d-aac9dbaed6d0","Type":"ContainerDied","Data":"41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da"} Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.770420 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h7w47" event={"ID":"67d4896e-f092-419b-971d-aac9dbaed6d0","Type":"ContainerDied","Data":"7f5fa88f4cd5d53df5eaff523aee4dc468f77994a1837f4d0d593aa58771ef6b"} Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.770442 4771 scope.go:117] "RemoveContainer" containerID="41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.770456 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-h7w47" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.791041 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-h7w47"] Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.791304 4771 scope.go:117] "RemoveContainer" containerID="41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da" Feb 19 21:42:38 crc kubenswrapper[4771]: E0219 21:42:38.791782 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da\": container with ID starting with 41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da not found: ID does not exist" containerID="41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.791821 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da"} err="failed to get container status \"41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da\": rpc error: code = NotFound desc = could not find container \"41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da\": container with ID starting with 41d452b7763bac22e7d6344466be6896af31af87d6ccbd1431822d1633e718da not found: ID does not exist" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.798461 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-h7w47"] Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.934327 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r"] Feb 19 21:42:38 crc kubenswrapper[4771]: E0219 21:42:38.934638 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="extract-utilities" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.934668 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="extract-utilities" Feb 19 21:42:38 crc kubenswrapper[4771]: E0219 21:42:38.934690 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="extract-content" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.934705 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="extract-content" Feb 19 21:42:38 crc kubenswrapper[4771]: E0219 21:42:38.934735 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91882b82-9663-470b-a4cc-1db296c09408" containerName="extract-content" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.934749 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="91882b82-9663-470b-a4cc-1db296c09408" containerName="extract-content" Feb 19 21:42:38 crc kubenswrapper[4771]: E0219 21:42:38.934768 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d4896e-f092-419b-971d-aac9dbaed6d0" containerName="console" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.934781 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d4896e-f092-419b-971d-aac9dbaed6d0" containerName="console" Feb 19 21:42:38 crc kubenswrapper[4771]: E0219 21:42:38.934798 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="registry-server" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.934811 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="registry-server" Feb 19 21:42:38 crc kubenswrapper[4771]: E0219 21:42:38.934835 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="91882b82-9663-470b-a4cc-1db296c09408" containerName="extract-utilities" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.934848 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="91882b82-9663-470b-a4cc-1db296c09408" containerName="extract-utilities" Feb 19 21:42:38 crc kubenswrapper[4771]: E0219 21:42:38.934872 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91882b82-9663-470b-a4cc-1db296c09408" containerName="registry-server" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.934885 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="91882b82-9663-470b-a4cc-1db296c09408" containerName="registry-server" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.935114 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d4896e-f092-419b-971d-aac9dbaed6d0" containerName="console" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.935144 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="91882b82-9663-470b-a4cc-1db296c09408" containerName="registry-server" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.935183 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c9413f-fb4e-4d10-807e-e3da51cc96d0" containerName="registry-server" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.936438 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.938719 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 21:42:38 crc kubenswrapper[4771]: I0219 21:42:38.941663 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r"] Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.019416 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.019521 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.019604 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-689x6\" (UniqueName: \"kubernetes.io/projected/a609387a-0d3f-4374-90c3-23b37506661c-kube-api-access-689x6\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: 
I0219 21:42:39.120947 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-689x6\" (UniqueName: \"kubernetes.io/projected/a609387a-0d3f-4374-90c3-23b37506661c-kube-api-access-689x6\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.121078 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.121128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.121767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.121895 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.149917 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-689x6\" (UniqueName: \"kubernetes.io/projected/a609387a-0d3f-4374-90c3-23b37506661c-kube-api-access-689x6\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.253545 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.745957 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r"] Feb 19 21:42:39 crc kubenswrapper[4771]: I0219 21:42:39.778547 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" event={"ID":"a609387a-0d3f-4374-90c3-23b37506661c","Type":"ContainerStarted","Data":"ea936bc080ca5ec15c2240490b96a015a7e89d4fa0fed1c1256b6834942bd6e0"} Feb 19 21:42:40 crc kubenswrapper[4771]: I0219 21:42:40.451385 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d4896e-f092-419b-971d-aac9dbaed6d0" path="/var/lib/kubelet/pods/67d4896e-f092-419b-971d-aac9dbaed6d0/volumes" Feb 19 21:42:40 crc kubenswrapper[4771]: I0219 21:42:40.787860 4771 generic.go:334] "Generic (PLEG): container finished" podID="a609387a-0d3f-4374-90c3-23b37506661c" 
containerID="e907bb0c4577c86c36f5150e772f8d2690d53dbfbf3453eadf8e6048bf648f17" exitCode=0 Feb 19 21:42:40 crc kubenswrapper[4771]: I0219 21:42:40.787997 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" event={"ID":"a609387a-0d3f-4374-90c3-23b37506661c","Type":"ContainerDied","Data":"e907bb0c4577c86c36f5150e772f8d2690d53dbfbf3453eadf8e6048bf648f17"} Feb 19 21:42:42 crc kubenswrapper[4771]: I0219 21:42:42.807099 4771 generic.go:334] "Generic (PLEG): container finished" podID="a609387a-0d3f-4374-90c3-23b37506661c" containerID="91f785341486b00fb7a11577b1480f37e0baf1be6ddb8088b9da20d47b065979" exitCode=0 Feb 19 21:42:42 crc kubenswrapper[4771]: I0219 21:42:42.807168 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" event={"ID":"a609387a-0d3f-4374-90c3-23b37506661c","Type":"ContainerDied","Data":"91f785341486b00fb7a11577b1480f37e0baf1be6ddb8088b9da20d47b065979"} Feb 19 21:42:42 crc kubenswrapper[4771]: I0219 21:42:42.957117 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:42:42 crc kubenswrapper[4771]: I0219 21:42:42.957584 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:42:42 crc kubenswrapper[4771]: I0219 21:42:42.957651 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:42:42 crc kubenswrapper[4771]: I0219 21:42:42.958582 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b551f7ad6e0a61c42a9d657911c51c24f934a3d33eec6d667ab79647cf50477"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:42:42 crc kubenswrapper[4771]: I0219 21:42:42.958694 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://0b551f7ad6e0a61c42a9d657911c51c24f934a3d33eec6d667ab79647cf50477" gracePeriod=600 Feb 19 21:42:43 crc kubenswrapper[4771]: I0219 21:42:43.815062 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="0b551f7ad6e0a61c42a9d657911c51c24f934a3d33eec6d667ab79647cf50477" exitCode=0 Feb 19 21:42:43 crc kubenswrapper[4771]: I0219 21:42:43.815145 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"0b551f7ad6e0a61c42a9d657911c51c24f934a3d33eec6d667ab79647cf50477"} Feb 19 21:42:43 crc kubenswrapper[4771]: I0219 21:42:43.815203 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"de6552d35b845cd34ff7f7f99fd7b414fe0ca3adffdf00a9aac825e2b4fe5659"} Feb 19 21:42:43 crc kubenswrapper[4771]: I0219 21:42:43.815230 4771 scope.go:117] "RemoveContainer" 
containerID="00064ebea687c2f43446b8863a30cdb029580ca9dabd28ae56781a84a609019f" Feb 19 21:42:43 crc kubenswrapper[4771]: I0219 21:42:43.819811 4771 generic.go:334] "Generic (PLEG): container finished" podID="a609387a-0d3f-4374-90c3-23b37506661c" containerID="f33cd9f98f5aeea1f2e7fa7929917a1aaa0675e23742b753a664486941815e43" exitCode=0 Feb 19 21:42:43 crc kubenswrapper[4771]: I0219 21:42:43.819846 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" event={"ID":"a609387a-0d3f-4374-90c3-23b37506661c","Type":"ContainerDied","Data":"f33cd9f98f5aeea1f2e7fa7929917a1aaa0675e23742b753a664486941815e43"} Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.092921 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.104836 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-util\") pod \"a609387a-0d3f-4374-90c3-23b37506661c\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.104890 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-bundle\") pod \"a609387a-0d3f-4374-90c3-23b37506661c\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.104961 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-689x6\" (UniqueName: \"kubernetes.io/projected/a609387a-0d3f-4374-90c3-23b37506661c-kube-api-access-689x6\") pod \"a609387a-0d3f-4374-90c3-23b37506661c\" (UID: \"a609387a-0d3f-4374-90c3-23b37506661c\") " Feb 19 21:42:45 crc 
kubenswrapper[4771]: I0219 21:42:45.106332 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-bundle" (OuterVolumeSpecName: "bundle") pod "a609387a-0d3f-4374-90c3-23b37506661c" (UID: "a609387a-0d3f-4374-90c3-23b37506661c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.114329 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a609387a-0d3f-4374-90c3-23b37506661c-kube-api-access-689x6" (OuterVolumeSpecName: "kube-api-access-689x6") pod "a609387a-0d3f-4374-90c3-23b37506661c" (UID: "a609387a-0d3f-4374-90c3-23b37506661c"). InnerVolumeSpecName "kube-api-access-689x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.142171 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-util" (OuterVolumeSpecName: "util") pod "a609387a-0d3f-4374-90c3-23b37506661c" (UID: "a609387a-0d3f-4374-90c3-23b37506661c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.206392 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.206422 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a609387a-0d3f-4374-90c3-23b37506661c-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.206431 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-689x6\" (UniqueName: \"kubernetes.io/projected/a609387a-0d3f-4374-90c3-23b37506661c-kube-api-access-689x6\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.838834 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" event={"ID":"a609387a-0d3f-4374-90c3-23b37506661c","Type":"ContainerDied","Data":"ea936bc080ca5ec15c2240490b96a015a7e89d4fa0fed1c1256b6834942bd6e0"} Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.838885 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea936bc080ca5ec15c2240490b96a015a7e89d4fa0fed1c1256b6834942bd6e0" Feb 19 21:42:45 crc kubenswrapper[4771]: I0219 21:42:45.838928 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.058103 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6"] Feb 19 21:42:54 crc kubenswrapper[4771]: E0219 21:42:54.058721 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a609387a-0d3f-4374-90c3-23b37506661c" containerName="extract" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.058732 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a609387a-0d3f-4374-90c3-23b37506661c" containerName="extract" Feb 19 21:42:54 crc kubenswrapper[4771]: E0219 21:42:54.058743 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a609387a-0d3f-4374-90c3-23b37506661c" containerName="util" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.058749 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a609387a-0d3f-4374-90c3-23b37506661c" containerName="util" Feb 19 21:42:54 crc kubenswrapper[4771]: E0219 21:42:54.058759 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a609387a-0d3f-4374-90c3-23b37506661c" containerName="pull" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.058764 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a609387a-0d3f-4374-90c3-23b37506661c" containerName="pull" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.058852 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a609387a-0d3f-4374-90c3-23b37506661c" containerName="extract" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.059247 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.062153 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lxzxh" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.065976 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.066046 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.066042 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.066140 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.074285 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6"] Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.235868 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0305f851-5f0b-4dd2-910b-e331dd45e4b7-apiservice-cert\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: \"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.236119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0305f851-5f0b-4dd2-910b-e331dd45e4b7-webhook-cert\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: 
\"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.236281 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdbn\" (UniqueName: \"kubernetes.io/projected/0305f851-5f0b-4dd2-910b-e331dd45e4b7-kube-api-access-6wdbn\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: \"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.337283 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0305f851-5f0b-4dd2-910b-e331dd45e4b7-apiservice-cert\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: \"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.337340 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0305f851-5f0b-4dd2-910b-e331dd45e4b7-webhook-cert\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: \"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.337369 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdbn\" (UniqueName: \"kubernetes.io/projected/0305f851-5f0b-4dd2-910b-e331dd45e4b7-kube-api-access-6wdbn\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: \"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.343536 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0305f851-5f0b-4dd2-910b-e331dd45e4b7-apiservice-cert\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: \"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.343828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0305f851-5f0b-4dd2-910b-e331dd45e4b7-webhook-cert\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: \"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.366158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdbn\" (UniqueName: \"kubernetes.io/projected/0305f851-5f0b-4dd2-910b-e331dd45e4b7-kube-api-access-6wdbn\") pod \"metallb-operator-controller-manager-7595995fb8-mwpt6\" (UID: \"0305f851-5f0b-4dd2-910b-e331dd45e4b7\") " pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.376300 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.601212 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj"] Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.601817 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.607159 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.607215 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.607159 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2jf98" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.621425 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj"] Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.744248 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e16c23ec-4799-4e15-a12e-995a8076b5f7-webhook-cert\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.744302 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsq78\" (UniqueName: \"kubernetes.io/projected/e16c23ec-4799-4e15-a12e-995a8076b5f7-kube-api-access-wsq78\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.744455 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/e16c23ec-4799-4e15-a12e-995a8076b5f7-apiservice-cert\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.845931 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e16c23ec-4799-4e15-a12e-995a8076b5f7-webhook-cert\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.846014 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsq78\" (UniqueName: \"kubernetes.io/projected/e16c23ec-4799-4e15-a12e-995a8076b5f7-kube-api-access-wsq78\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.846137 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e16c23ec-4799-4e15-a12e-995a8076b5f7-apiservice-cert\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.851706 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e16c23ec-4799-4e15-a12e-995a8076b5f7-apiservice-cert\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 
21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.854563 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e16c23ec-4799-4e15-a12e-995a8076b5f7-webhook-cert\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.877790 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsq78\" (UniqueName: \"kubernetes.io/projected/e16c23ec-4799-4e15-a12e-995a8076b5f7-kube-api-access-wsq78\") pod \"metallb-operator-webhook-server-64f99c9dc8-x7lhj\" (UID: \"e16c23ec-4799-4e15-a12e-995a8076b5f7\") " pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.912902 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6"] Feb 19 21:42:54 crc kubenswrapper[4771]: I0219 21:42:54.936553 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:42:55 crc kubenswrapper[4771]: I0219 21:42:55.156099 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj"] Feb 19 21:42:55 crc kubenswrapper[4771]: W0219 21:42:55.162646 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16c23ec_4799_4e15_a12e_995a8076b5f7.slice/crio-db0e04e43936f19465f8d16d602efffbf6d1b9d0d96914da42c18e797ab92a1d WatchSource:0}: Error finding container db0e04e43936f19465f8d16d602efffbf6d1b9d0d96914da42c18e797ab92a1d: Status 404 returned error can't find the container with id db0e04e43936f19465f8d16d602efffbf6d1b9d0d96914da42c18e797ab92a1d Feb 19 21:42:55 crc kubenswrapper[4771]: I0219 21:42:55.901675 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" event={"ID":"e16c23ec-4799-4e15-a12e-995a8076b5f7","Type":"ContainerStarted","Data":"db0e04e43936f19465f8d16d602efffbf6d1b9d0d96914da42c18e797ab92a1d"} Feb 19 21:42:55 crc kubenswrapper[4771]: I0219 21:42:55.903202 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" event={"ID":"0305f851-5f0b-4dd2-910b-e331dd45e4b7","Type":"ContainerStarted","Data":"8e19b64bdf829717e35f63156992f6c42e71c8433662270998373c3ac55e618a"} Feb 19 21:42:58 crc kubenswrapper[4771]: I0219 21:42:58.923759 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" event={"ID":"0305f851-5f0b-4dd2-910b-e331dd45e4b7","Type":"ContainerStarted","Data":"f2a69afb0910ef8db5ad6a2a9d475172d38c030eab565d40418a73895ff6f26a"} Feb 19 21:42:58 crc kubenswrapper[4771]: I0219 21:42:58.924126 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:42:58 crc kubenswrapper[4771]: I0219 21:42:58.948649 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" podStartSLOduration=1.274147949 podStartE2EDuration="4.948627213s" podCreationTimestamp="2026-02-19 21:42:54 +0000 UTC" firstStartedPulling="2026-02-19 21:42:54.912872913 +0000 UTC m=+875.184315393" lastFinishedPulling="2026-02-19 21:42:58.587352187 +0000 UTC m=+878.858794657" observedRunningTime="2026-02-19 21:42:58.941576366 +0000 UTC m=+879.213018866" watchObservedRunningTime="2026-02-19 21:42:58.948627213 +0000 UTC m=+879.220069703" Feb 19 21:43:01 crc kubenswrapper[4771]: I0219 21:43:01.970771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" event={"ID":"e16c23ec-4799-4e15-a12e-995a8076b5f7","Type":"ContainerStarted","Data":"1a207a807a71115ae4a05430de5bb59f4c84819087fbf1951bbc8a20232f31d1"} Feb 19 21:43:01 crc kubenswrapper[4771]: I0219 21:43:01.971103 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:43:02 crc kubenswrapper[4771]: I0219 21:43:02.011875 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" podStartSLOduration=2.355810361 podStartE2EDuration="8.011843469s" podCreationTimestamp="2026-02-19 21:42:54 +0000 UTC" firstStartedPulling="2026-02-19 21:42:55.16529358 +0000 UTC m=+875.436736050" lastFinishedPulling="2026-02-19 21:43:00.821326688 +0000 UTC m=+881.092769158" observedRunningTime="2026-02-19 21:43:02.004409462 +0000 UTC m=+882.275851952" watchObservedRunningTime="2026-02-19 21:43:02.011843469 +0000 UTC m=+882.283285979" Feb 19 21:43:14 crc kubenswrapper[4771]: I0219 21:43:14.945266 4771 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-64f99c9dc8-x7lhj" Feb 19 21:43:34 crc kubenswrapper[4771]: I0219 21:43:34.378974 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7595995fb8-mwpt6" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.230932 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7wp2l"] Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.233635 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.235085 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.235617 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-lg4lg" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.235952 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.247618 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49"] Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.249532 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.252272 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.252555 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49"] Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.303540 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-b55hf"] Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.304370 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.308788 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.308843 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.308967 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2qxdg" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.310568 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.317202 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-sfghb"] Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.317959 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.319742 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326028 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/083f4636-9855-4b5c-ad7c-685d534b508c-metallb-excludel2\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326076 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics-certs\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326098 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-metrics-certs\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2966750-1a68-42a1-9da5-30c5d5955550-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lhn49\" (UID: \"c2966750-1a68-42a1-9da5-30c5d5955550\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326135 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2lrpr\" (UniqueName: \"kubernetes.io/projected/083f4636-9855-4b5c-ad7c-685d534b508c-kube-api-access-2lrpr\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326170 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326194 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpbpq\" (UniqueName: \"kubernetes.io/projected/c2966750-1a68-42a1-9da5-30c5d5955550-kube-api-access-wpbpq\") pod \"frr-k8s-webhook-server-78b44bf5bb-lhn49\" (UID: \"c2966750-1a68-42a1-9da5-30c5d5955550\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326213 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326228 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-reloader\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326246 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-conf\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326262 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-cert\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326276 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrns7\" (UniqueName: \"kubernetes.io/projected/ff7ca988-1ed4-468a-a531-52c2b45e144c-kube-api-access-xrns7\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326290 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-metrics-certs\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326314 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-sockets\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326329 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmlzp\" (UniqueName: 
\"kubernetes.io/projected/e7b28acc-9520-4a82-a044-a165756be9e8-kube-api-access-fmlzp\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.326346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-startup\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.334385 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-sfghb"] Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427602 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-startup\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427646 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/083f4636-9855-4b5c-ad7c-685d534b508c-metallb-excludel2\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427676 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics-certs\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-metrics-certs\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427713 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2966750-1a68-42a1-9da5-30c5d5955550-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lhn49\" (UID: \"c2966750-1a68-42a1-9da5-30c5d5955550\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427727 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrpr\" (UniqueName: \"kubernetes.io/projected/083f4636-9855-4b5c-ad7c-685d534b508c-kube-api-access-2lrpr\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427761 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427785 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpbpq\" (UniqueName: \"kubernetes.io/projected/c2966750-1a68-42a1-9da5-30c5d5955550-kube-api-access-wpbpq\") pod \"frr-k8s-webhook-server-78b44bf5bb-lhn49\" (UID: \"c2966750-1a68-42a1-9da5-30c5d5955550\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427805 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427821 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-reloader\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427840 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-conf\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-cert\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427868 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrns7\" (UniqueName: \"kubernetes.io/projected/ff7ca988-1ed4-468a-a531-52c2b45e144c-kube-api-access-xrns7\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427883 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-metrics-certs\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 
21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427921 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-sockets\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.427938 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmlzp\" (UniqueName: \"kubernetes.io/projected/e7b28acc-9520-4a82-a044-a165756be9e8-kube-api-access-fmlzp\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.428970 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-startup\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.429455 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/083f4636-9855-4b5c-ad7c-685d534b508c-metallb-excludel2\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.429516 4771 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.429553 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics-certs podName:ff7ca988-1ed4-468a-a531-52c2b45e144c nodeName:}" failed. 
No retries permitted until 2026-02-19 21:43:35.929538751 +0000 UTC m=+916.200981221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics-certs") pod "frr-k8s-7wp2l" (UID: "ff7ca988-1ed4-468a-a531-52c2b45e144c") : secret "frr-k8s-certs-secret" not found Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.429713 4771 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.429735 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-metrics-certs podName:e7b28acc-9520-4a82-a044-a165756be9e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.929728076 +0000 UTC m=+916.201170546 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-metrics-certs") pod "controller-69bbfbf88f-sfghb" (UID: "e7b28acc-9520-4a82-a044-a165756be9e8") : secret "controller-certs-secret" not found Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.430396 4771 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.430427 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-metrics-certs podName:083f4636-9855-4b5c-ad7c-685d534b508c nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.930417554 +0000 UTC m=+916.201860024 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-metrics-certs") pod "speaker-b55hf" (UID: "083f4636-9855-4b5c-ad7c-685d534b508c") : secret "speaker-certs-secret" not found Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.430552 4771 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.430579 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist podName:083f4636-9855-4b5c-ad7c-685d534b508c nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.930569607 +0000 UTC m=+916.202012067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist") pod "speaker-b55hf" (UID: "083f4636-9855-4b5c-ad7c-685d534b508c") : secret "metallb-memberlist" not found Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.431206 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-reloader\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.431268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-conf\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.431374 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-frr-sockets\") pod 
\"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.431440 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.433197 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.439231 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2966750-1a68-42a1-9da5-30c5d5955550-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lhn49\" (UID: \"c2966750-1a68-42a1-9da5-30c5d5955550\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.446113 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmlzp\" (UniqueName: \"kubernetes.io/projected/e7b28acc-9520-4a82-a044-a165756be9e8-kube-api-access-fmlzp\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.449276 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpbpq\" (UniqueName: \"kubernetes.io/projected/c2966750-1a68-42a1-9da5-30c5d5955550-kube-api-access-wpbpq\") pod \"frr-k8s-webhook-server-78b44bf5bb-lhn49\" (UID: \"c2966750-1a68-42a1-9da5-30c5d5955550\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.451718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrpr\" (UniqueName: 
\"kubernetes.io/projected/083f4636-9855-4b5c-ad7c-685d534b508c-kube-api-access-2lrpr\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.451833 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-cert\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.457236 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrns7\" (UniqueName: \"kubernetes.io/projected/ff7ca988-1ed4-468a-a531-52c2b45e144c-kube-api-access-xrns7\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.564478 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.754909 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49"] Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.934794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics-certs\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.934893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-metrics-certs\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.935110 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.935216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-metrics-certs\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.935301 4771 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 21:43:35 crc kubenswrapper[4771]: E0219 21:43:35.935397 4771 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist podName:083f4636-9855-4b5c-ad7c-685d534b508c nodeName:}" failed. No retries permitted until 2026-02-19 21:43:36.935371267 +0000 UTC m=+917.206813807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist") pod "speaker-b55hf" (UID: "083f4636-9855-4b5c-ad7c-685d534b508c") : secret "metallb-memberlist" not found Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.944747 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-metrics-certs\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.944855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7b28acc-9520-4a82-a044-a165756be9e8-metrics-certs\") pod \"controller-69bbfbf88f-sfghb\" (UID: \"e7b28acc-9520-4a82-a044-a165756be9e8\") " pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:35 crc kubenswrapper[4771]: I0219 21:43:35.944914 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff7ca988-1ed4-468a-a531-52c2b45e144c-metrics-certs\") pod \"frr-k8s-7wp2l\" (UID: \"ff7ca988-1ed4-468a-a531-52c2b45e144c\") " pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:36 crc kubenswrapper[4771]: I0219 21:43:36.153329 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:36 crc kubenswrapper[4771]: I0219 21:43:36.192552 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" event={"ID":"c2966750-1a68-42a1-9da5-30c5d5955550","Type":"ContainerStarted","Data":"28efd788a61907f86f976e0e052993ad02008f9c8f17e54e1c59477b30721429"} Feb 19 21:43:36 crc kubenswrapper[4771]: I0219 21:43:36.233483 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:36 crc kubenswrapper[4771]: I0219 21:43:36.567962 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-sfghb"] Feb 19 21:43:36 crc kubenswrapper[4771]: I0219 21:43:36.956664 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:36 crc kubenswrapper[4771]: I0219 21:43:36.963074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/083f4636-9855-4b5c-ad7c-685d534b508c-memberlist\") pod \"speaker-b55hf\" (UID: \"083f4636-9855-4b5c-ad7c-685d534b508c\") " pod="metallb-system/speaker-b55hf" Feb 19 21:43:37 crc kubenswrapper[4771]: I0219 21:43:37.119933 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-b55hf" Feb 19 21:43:37 crc kubenswrapper[4771]: W0219 21:43:37.147368 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod083f4636_9855_4b5c_ad7c_685d534b508c.slice/crio-d6db23ec3c4af587b36942ed4f8dac17ca196f406bc7831c639d58991f2060a1 WatchSource:0}: Error finding container d6db23ec3c4af587b36942ed4f8dac17ca196f406bc7831c639d58991f2060a1: Status 404 returned error can't find the container with id d6db23ec3c4af587b36942ed4f8dac17ca196f406bc7831c639d58991f2060a1 Feb 19 21:43:37 crc kubenswrapper[4771]: I0219 21:43:37.198883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b55hf" event={"ID":"083f4636-9855-4b5c-ad7c-685d534b508c","Type":"ContainerStarted","Data":"d6db23ec3c4af587b36942ed4f8dac17ca196f406bc7831c639d58991f2060a1"} Feb 19 21:43:37 crc kubenswrapper[4771]: I0219 21:43:37.201747 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-sfghb" event={"ID":"e7b28acc-9520-4a82-a044-a165756be9e8","Type":"ContainerStarted","Data":"df0a21645ccc4efa3869939ef77c26c1f4c62aabd3b6f5883dd23e43204f6f11"} Feb 19 21:43:37 crc kubenswrapper[4771]: I0219 21:43:37.201771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-sfghb" event={"ID":"e7b28acc-9520-4a82-a044-a165756be9e8","Type":"ContainerStarted","Data":"95737bcf2197971476df39f21033ee63005f478286fc49f5ce3bbde3b548aef7"} Feb 19 21:43:37 crc kubenswrapper[4771]: I0219 21:43:37.201781 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-sfghb" event={"ID":"e7b28acc-9520-4a82-a044-a165756be9e8","Type":"ContainerStarted","Data":"b173b6c6117c76cbc0d772093e57f64216c9671e09e133e6df048706e8b35a65"} Feb 19 21:43:37 crc kubenswrapper[4771]: I0219 21:43:37.202640 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:37 crc kubenswrapper[4771]: I0219 21:43:37.203792 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerStarted","Data":"0924df2a3a829b1f07f62670cfb3f7d41eebf645b648f7f0f2d59530f1371095"} Feb 19 21:43:37 crc kubenswrapper[4771]: I0219 21:43:37.250324 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-sfghb" podStartSLOduration=2.2503090390000002 podStartE2EDuration="2.250309039s" podCreationTimestamp="2026-02-19 21:43:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:43:37.246321603 +0000 UTC m=+917.517764083" watchObservedRunningTime="2026-02-19 21:43:37.250309039 +0000 UTC m=+917.521751499" Feb 19 21:43:38 crc kubenswrapper[4771]: I0219 21:43:38.223803 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b55hf" event={"ID":"083f4636-9855-4b5c-ad7c-685d534b508c","Type":"ContainerStarted","Data":"cb4043fd684ad0f3f47e0a2279e04a8258f09c96b17a4827004527939ce380ee"} Feb 19 21:43:38 crc kubenswrapper[4771]: I0219 21:43:38.223874 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b55hf" event={"ID":"083f4636-9855-4b5c-ad7c-685d534b508c","Type":"ContainerStarted","Data":"c14bba750b62abda61a8d8566e8f05d8fe96abedb342c6ca98384faa284099a2"} Feb 19 21:43:39 crc kubenswrapper[4771]: I0219 21:43:39.229595 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-b55hf" Feb 19 21:43:40 crc kubenswrapper[4771]: I0219 21:43:40.460980 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-b55hf" podStartSLOduration=5.460955488 podStartE2EDuration="5.460955488s" podCreationTimestamp="2026-02-19 21:43:35 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:43:38.244184233 +0000 UTC m=+918.515626723" watchObservedRunningTime="2026-02-19 21:43:40.460955488 +0000 UTC m=+920.732397968" Feb 19 21:43:44 crc kubenswrapper[4771]: I0219 21:43:44.274854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" event={"ID":"c2966750-1a68-42a1-9da5-30c5d5955550","Type":"ContainerStarted","Data":"87c10e8c7f91a3ac420b73091fbcd2f51c7d1e7e56aabdaa8ad1772225783c29"} Feb 19 21:43:44 crc kubenswrapper[4771]: I0219 21:43:44.275541 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:44 crc kubenswrapper[4771]: I0219 21:43:44.278141 4771 generic.go:334] "Generic (PLEG): container finished" podID="ff7ca988-1ed4-468a-a531-52c2b45e144c" containerID="0b9a6e63093b58a0f0881fcb5566a6fed9aad42f6a45939f069fbfdb77129f0d" exitCode=0 Feb 19 21:43:44 crc kubenswrapper[4771]: I0219 21:43:44.278205 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerDied","Data":"0b9a6e63093b58a0f0881fcb5566a6fed9aad42f6a45939f069fbfdb77129f0d"} Feb 19 21:43:44 crc kubenswrapper[4771]: I0219 21:43:44.302510 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" podStartSLOduration=1.834974791 podStartE2EDuration="9.302489376s" podCreationTimestamp="2026-02-19 21:43:35 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.773368639 +0000 UTC m=+916.044811109" lastFinishedPulling="2026-02-19 21:43:43.240883214 +0000 UTC m=+923.512325694" observedRunningTime="2026-02-19 21:43:44.297706669 +0000 UTC m=+924.569149179" watchObservedRunningTime="2026-02-19 21:43:44.302489376 +0000 UTC m=+924.573931886" Feb 19 
21:43:45 crc kubenswrapper[4771]: I0219 21:43:45.292446 4771 generic.go:334] "Generic (PLEG): container finished" podID="ff7ca988-1ed4-468a-a531-52c2b45e144c" containerID="2cfbbdc93f233e1bdba139d063dff55f02c775019d05ef0fb3a1c1c62d96455a" exitCode=0 Feb 19 21:43:45 crc kubenswrapper[4771]: I0219 21:43:45.292515 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerDied","Data":"2cfbbdc93f233e1bdba139d063dff55f02c775019d05ef0fb3a1c1c62d96455a"} Feb 19 21:43:46 crc kubenswrapper[4771]: I0219 21:43:46.239789 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-sfghb" Feb 19 21:43:46 crc kubenswrapper[4771]: I0219 21:43:46.309291 4771 generic.go:334] "Generic (PLEG): container finished" podID="ff7ca988-1ed4-468a-a531-52c2b45e144c" containerID="022b055ea900fc0f57e00e92c7a092ff3fdbbe6f8fe5b5328b85612129edce4b" exitCode=0 Feb 19 21:43:46 crc kubenswrapper[4771]: I0219 21:43:46.309346 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerDied","Data":"022b055ea900fc0f57e00e92c7a092ff3fdbbe6f8fe5b5328b85612129edce4b"} Feb 19 21:43:47 crc kubenswrapper[4771]: I0219 21:43:47.125795 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-b55hf" Feb 19 21:43:47 crc kubenswrapper[4771]: I0219 21:43:47.320899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerStarted","Data":"5625288fe70342fa457e948e156bafaa1c310a6d689d69a27a4344bd11e77d9c"} Feb 19 21:43:47 crc kubenswrapper[4771]: I0219 21:43:47.321315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" 
event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerStarted","Data":"6f25e4970a8864b77edc6c8aabfcee66bb04d725ac1e58b79c4fb2681b246796"} Feb 19 21:43:47 crc kubenswrapper[4771]: I0219 21:43:47.321338 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerStarted","Data":"665ee6216c446f52a1132c721ac137ff11402bd1cf6281a6d996046f09fe1338"} Feb 19 21:43:47 crc kubenswrapper[4771]: I0219 21:43:47.321357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerStarted","Data":"9d5c279813ec73a5738cb10f4a98bd1b3fe6af82954fe8efe2452cb16047ad36"} Feb 19 21:43:48 crc kubenswrapper[4771]: I0219 21:43:48.334953 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerStarted","Data":"7cf8628d39a516715728739c208848953660e36b1a2230804dedbd99817afc44"} Feb 19 21:43:48 crc kubenswrapper[4771]: I0219 21:43:48.335012 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7wp2l" event={"ID":"ff7ca988-1ed4-468a-a531-52c2b45e144c","Type":"ContainerStarted","Data":"b112fcb5f5923c592cb2f164d4ff9ac1f6d930f234159de7b719acf1c142f08e"} Feb 19 21:43:48 crc kubenswrapper[4771]: I0219 21:43:48.335247 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:48 crc kubenswrapper[4771]: I0219 21:43:48.369673 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7wp2l" podStartSLOduration=6.475735204 podStartE2EDuration="13.369657482s" podCreationTimestamp="2026-02-19 21:43:35 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.325661313 +0000 UTC m=+916.597103783" lastFinishedPulling="2026-02-19 21:43:43.219583581 +0000 UTC m=+923.491026061" 
observedRunningTime="2026-02-19 21:43:48.368373648 +0000 UTC m=+928.639816208" watchObservedRunningTime="2026-02-19 21:43:48.369657482 +0000 UTC m=+928.641099952" Feb 19 21:43:48 crc kubenswrapper[4771]: I0219 21:43:48.945401 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs"] Feb 19 21:43:48 crc kubenswrapper[4771]: I0219 21:43:48.949582 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:48 crc kubenswrapper[4771]: I0219 21:43:48.952367 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 21:43:48 crc kubenswrapper[4771]: I0219 21:43:48.989412 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs"] Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.038293 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.038369 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.038452 
4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9kjs\" (UniqueName: \"kubernetes.io/projected/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-kube-api-access-q9kjs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.139853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.140259 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.140338 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9kjs\" (UniqueName: \"kubernetes.io/projected/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-kube-api-access-q9kjs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.140725 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.141063 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.173705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9kjs\" (UniqueName: \"kubernetes.io/projected/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-kube-api-access-q9kjs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.295937 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:49 crc kubenswrapper[4771]: I0219 21:43:49.591314 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs"] Feb 19 21:43:50 crc kubenswrapper[4771]: I0219 21:43:50.352056 4771 generic.go:334] "Generic (PLEG): container finished" podID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerID="7a485678f53ce330e11b23491b68e07259b6f32e8a49eab6a0a2defb093739c8" exitCode=0 Feb 19 21:43:50 crc kubenswrapper[4771]: I0219 21:43:50.352159 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" event={"ID":"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea","Type":"ContainerDied","Data":"7a485678f53ce330e11b23491b68e07259b6f32e8a49eab6a0a2defb093739c8"} Feb 19 21:43:50 crc kubenswrapper[4771]: I0219 21:43:50.352624 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" event={"ID":"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea","Type":"ContainerStarted","Data":"a24553401ed7c2e599b3fd2ad27dbb9f8b8bceabdbd22bdc3921e6ee9100588e"} Feb 19 21:43:51 crc kubenswrapper[4771]: I0219 21:43:51.153953 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:51 crc kubenswrapper[4771]: I0219 21:43:51.225719 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:54 crc kubenswrapper[4771]: I0219 21:43:54.383202 4771 generic.go:334] "Generic (PLEG): container finished" podID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerID="92e159735dbb71ce873d0681c210f9b0afa1b9588837b6ee29ba95163873efed" exitCode=0 Feb 19 21:43:54 crc kubenswrapper[4771]: I0219 21:43:54.383555 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" event={"ID":"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea","Type":"ContainerDied","Data":"92e159735dbb71ce873d0681c210f9b0afa1b9588837b6ee29ba95163873efed"} Feb 19 21:43:55 crc kubenswrapper[4771]: I0219 21:43:55.395594 4771 generic.go:334] "Generic (PLEG): container finished" podID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerID="8a272442ad3181bf3fec7c25a127bcac5592ac7e81774ca5a659098551b81555" exitCode=0 Feb 19 21:43:55 crc kubenswrapper[4771]: I0219 21:43:55.395794 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" event={"ID":"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea","Type":"ContainerDied","Data":"8a272442ad3181bf3fec7c25a127bcac5592ac7e81774ca5a659098551b81555"} Feb 19 21:43:55 crc kubenswrapper[4771]: I0219 21:43:55.577972 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lhn49" Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.161129 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7wp2l" Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.752479 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.845898 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-bundle\") pod \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.845949 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9kjs\" (UniqueName: \"kubernetes.io/projected/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-kube-api-access-q9kjs\") pod \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.845979 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-util\") pod \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\" (UID: \"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea\") " Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.846784 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-bundle" (OuterVolumeSpecName: "bundle") pod "7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" (UID: "7990fa1c-49c7-43c8-b638-ba2f44a5d0ea"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.851918 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-kube-api-access-q9kjs" (OuterVolumeSpecName: "kube-api-access-q9kjs") pod "7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" (UID: "7990fa1c-49c7-43c8-b638-ba2f44a5d0ea"). InnerVolumeSpecName "kube-api-access-q9kjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.856061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-util" (OuterVolumeSpecName: "util") pod "7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" (UID: "7990fa1c-49c7-43c8-b638-ba2f44a5d0ea"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.947058 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.947089 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9kjs\" (UniqueName: \"kubernetes.io/projected/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-kube-api-access-q9kjs\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:56 crc kubenswrapper[4771]: I0219 21:43:56.947100 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7990fa1c-49c7-43c8-b638-ba2f44a5d0ea-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:57 crc kubenswrapper[4771]: I0219 21:43:57.416797 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" event={"ID":"7990fa1c-49c7-43c8-b638-ba2f44a5d0ea","Type":"ContainerDied","Data":"a24553401ed7c2e599b3fd2ad27dbb9f8b8bceabdbd22bdc3921e6ee9100588e"} Feb 19 21:43:57 crc kubenswrapper[4771]: I0219 21:43:57.416852 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24553401ed7c2e599b3fd2ad27dbb9f8b8bceabdbd22bdc3921e6ee9100588e" Feb 19 21:43:57 crc kubenswrapper[4771]: I0219 21:43:57.417747 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.570663 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r"] Feb 19 21:44:02 crc kubenswrapper[4771]: E0219 21:44:02.571444 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerName="util" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.571459 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerName="util" Feb 19 21:44:02 crc kubenswrapper[4771]: E0219 21:44:02.571478 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerName="extract" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.571486 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerName="extract" Feb 19 21:44:02 crc kubenswrapper[4771]: E0219 21:44:02.571502 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerName="pull" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.571509 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerName="pull" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.571674 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7990fa1c-49c7-43c8-b638-ba2f44a5d0ea" containerName="extract" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.572151 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.576142 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-nmqch" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.577289 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.577379 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.597815 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r"] Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.752618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/491324af-c24e-481c-a9d7-d248342b3a3c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-lgx8r\" (UID: \"491324af-c24e-481c-a9d7-d248342b3a3c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.752937 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9hk9\" (UniqueName: \"kubernetes.io/projected/491324af-c24e-481c-a9d7-d248342b3a3c-kube-api-access-b9hk9\") pod \"cert-manager-operator-controller-manager-66c8bdd694-lgx8r\" (UID: \"491324af-c24e-481c-a9d7-d248342b3a3c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.854421 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/491324af-c24e-481c-a9d7-d248342b3a3c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-lgx8r\" (UID: \"491324af-c24e-481c-a9d7-d248342b3a3c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.854525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9hk9\" (UniqueName: \"kubernetes.io/projected/491324af-c24e-481c-a9d7-d248342b3a3c-kube-api-access-b9hk9\") pod \"cert-manager-operator-controller-manager-66c8bdd694-lgx8r\" (UID: \"491324af-c24e-481c-a9d7-d248342b3a3c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.854921 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/491324af-c24e-481c-a9d7-d248342b3a3c-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-lgx8r\" (UID: \"491324af-c24e-481c-a9d7-d248342b3a3c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.886469 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9hk9\" (UniqueName: \"kubernetes.io/projected/491324af-c24e-481c-a9d7-d248342b3a3c-kube-api-access-b9hk9\") pod \"cert-manager-operator-controller-manager-66c8bdd694-lgx8r\" (UID: \"491324af-c24e-481c-a9d7-d248342b3a3c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" Feb 19 21:44:02 crc kubenswrapper[4771]: I0219 21:44:02.890377 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" Feb 19 21:44:03 crc kubenswrapper[4771]: I0219 21:44:03.441638 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r"] Feb 19 21:44:03 crc kubenswrapper[4771]: W0219 21:44:03.454418 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491324af_c24e_481c_a9d7_d248342b3a3c.slice/crio-34cb3804599197c0ac2d01504976fb9eaa31d81e3815ec50d32412c994c8c69f WatchSource:0}: Error finding container 34cb3804599197c0ac2d01504976fb9eaa31d81e3815ec50d32412c994c8c69f: Status 404 returned error can't find the container with id 34cb3804599197c0ac2d01504976fb9eaa31d81e3815ec50d32412c994c8c69f Feb 19 21:44:03 crc kubenswrapper[4771]: I0219 21:44:03.492987 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" event={"ID":"491324af-c24e-481c-a9d7-d248342b3a3c","Type":"ContainerStarted","Data":"34cb3804599197c0ac2d01504976fb9eaa31d81e3815ec50d32412c994c8c69f"} Feb 19 21:44:07 crc kubenswrapper[4771]: I0219 21:44:07.520261 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" event={"ID":"491324af-c24e-481c-a9d7-d248342b3a3c","Type":"ContainerStarted","Data":"a70201a95d82906e6612d9272bc3320a384f3128e0da5cd5bc6581238b84183a"} Feb 19 21:44:07 crc kubenswrapper[4771]: I0219 21:44:07.549783 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-lgx8r" podStartSLOduration=2.5752472600000003 podStartE2EDuration="5.549756343s" podCreationTimestamp="2026-02-19 21:44:02 +0000 UTC" firstStartedPulling="2026-02-19 21:44:03.459151319 +0000 UTC m=+943.730593829" 
lastFinishedPulling="2026-02-19 21:44:06.433660442 +0000 UTC m=+946.705102912" observedRunningTime="2026-02-19 21:44:07.543055755 +0000 UTC m=+947.814498285" watchObservedRunningTime="2026-02-19 21:44:07.549756343 +0000 UTC m=+947.821198843" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.768856 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-s6929"] Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.769725 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.771711 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.771826 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-bzvqn" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.772197 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.780816 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-s6929"] Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.874634 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc2fp\" (UniqueName: \"kubernetes.io/projected/2de006aa-ec1d-4278-b1c6-d0cac9eebe17-kube-api-access-sc2fp\") pod \"cert-manager-webhook-6888856db4-s6929\" (UID: \"2de006aa-ec1d-4278-b1c6-d0cac9eebe17\") " pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.874941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2de006aa-ec1d-4278-b1c6-d0cac9eebe17-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-s6929\" (UID: \"2de006aa-ec1d-4278-b1c6-d0cac9eebe17\") " pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.976312 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc2fp\" (UniqueName: \"kubernetes.io/projected/2de006aa-ec1d-4278-b1c6-d0cac9eebe17-kube-api-access-sc2fp\") pod \"cert-manager-webhook-6888856db4-s6929\" (UID: \"2de006aa-ec1d-4278-b1c6-d0cac9eebe17\") " pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.976376 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2de006aa-ec1d-4278-b1c6-d0cac9eebe17-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-s6929\" (UID: \"2de006aa-ec1d-4278-b1c6-d0cac9eebe17\") " pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:09 crc kubenswrapper[4771]: I0219 21:44:09.998506 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc2fp\" (UniqueName: \"kubernetes.io/projected/2de006aa-ec1d-4278-b1c6-d0cac9eebe17-kube-api-access-sc2fp\") pod \"cert-manager-webhook-6888856db4-s6929\" (UID: \"2de006aa-ec1d-4278-b1c6-d0cac9eebe17\") " pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:10 crc kubenswrapper[4771]: I0219 21:44:10.001760 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2de006aa-ec1d-4278-b1c6-d0cac9eebe17-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-s6929\" (UID: \"2de006aa-ec1d-4278-b1c6-d0cac9eebe17\") " pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:10 crc kubenswrapper[4771]: I0219 21:44:10.082791 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:10 crc kubenswrapper[4771]: I0219 21:44:10.501739 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-s6929"] Feb 19 21:44:10 crc kubenswrapper[4771]: I0219 21:44:10.535604 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-s6929" event={"ID":"2de006aa-ec1d-4278-b1c6-d0cac9eebe17","Type":"ContainerStarted","Data":"e034876605f187c7b515923ea624d6259a68e857406d62ca5d1795ec2ee98bb3"} Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.154092 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n4z5q"] Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.158782 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.165595 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-ppkxd" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.177394 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n4z5q"] Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.207578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzkvf\" (UniqueName: \"kubernetes.io/projected/483a30ad-c9d6-4198-8cda-f4afb3459f0f-kube-api-access-xzkvf\") pod \"cert-manager-cainjector-5545bd876-n4z5q\" (UID: \"483a30ad-c9d6-4198-8cda-f4afb3459f0f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.207748 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/483a30ad-c9d6-4198-8cda-f4afb3459f0f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n4z5q\" (UID: \"483a30ad-c9d6-4198-8cda-f4afb3459f0f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.308617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/483a30ad-c9d6-4198-8cda-f4afb3459f0f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n4z5q\" (UID: \"483a30ad-c9d6-4198-8cda-f4afb3459f0f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.308778 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzkvf\" (UniqueName: \"kubernetes.io/projected/483a30ad-c9d6-4198-8cda-f4afb3459f0f-kube-api-access-xzkvf\") pod \"cert-manager-cainjector-5545bd876-n4z5q\" (UID: \"483a30ad-c9d6-4198-8cda-f4afb3459f0f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.336753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/483a30ad-c9d6-4198-8cda-f4afb3459f0f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n4z5q\" (UID: \"483a30ad-c9d6-4198-8cda-f4afb3459f0f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.336962 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzkvf\" (UniqueName: \"kubernetes.io/projected/483a30ad-c9d6-4198-8cda-f4afb3459f0f-kube-api-access-xzkvf\") pod \"cert-manager-cainjector-5545bd876-n4z5q\" (UID: \"483a30ad-c9d6-4198-8cda-f4afb3459f0f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.483933 4771 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" Feb 19 21:44:12 crc kubenswrapper[4771]: I0219 21:44:12.931173 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n4z5q"] Feb 19 21:44:12 crc kubenswrapper[4771]: W0219 21:44:12.937201 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod483a30ad_c9d6_4198_8cda_f4afb3459f0f.slice/crio-331fff1bc915404cc7972444d0f864aae9602d187d96b5db6c5f0d763d08d054 WatchSource:0}: Error finding container 331fff1bc915404cc7972444d0f864aae9602d187d96b5db6c5f0d763d08d054: Status 404 returned error can't find the container with id 331fff1bc915404cc7972444d0f864aae9602d187d96b5db6c5f0d763d08d054 Feb 19 21:44:13 crc kubenswrapper[4771]: I0219 21:44:13.555910 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" event={"ID":"483a30ad-c9d6-4198-8cda-f4afb3459f0f","Type":"ContainerStarted","Data":"331fff1bc915404cc7972444d0f864aae9602d187d96b5db6c5f0d763d08d054"} Feb 19 21:44:15 crc kubenswrapper[4771]: I0219 21:44:15.573620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" event={"ID":"483a30ad-c9d6-4198-8cda-f4afb3459f0f","Type":"ContainerStarted","Data":"2f913f0fd749b1e1ddba66b44f50021f2024ac60e8990c39a73cbbf0e41cdfa8"} Feb 19 21:44:15 crc kubenswrapper[4771]: I0219 21:44:15.576597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-s6929" event={"ID":"2de006aa-ec1d-4278-b1c6-d0cac9eebe17","Type":"ContainerStarted","Data":"a171c50d1026ded685917ab298b4345a5458b2727471892c7ffeea627ceec1f3"} Feb 19 21:44:15 crc kubenswrapper[4771]: I0219 21:44:15.576756 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-s6929" 
Feb 19 21:44:15 crc kubenswrapper[4771]: I0219 21:44:15.600227 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-n4z5q" podStartSLOduration=1.283110545 podStartE2EDuration="3.600195139s" podCreationTimestamp="2026-02-19 21:44:12 +0000 UTC" firstStartedPulling="2026-02-19 21:44:12.938508806 +0000 UTC m=+953.209951286" lastFinishedPulling="2026-02-19 21:44:15.2555934 +0000 UTC m=+955.527035880" observedRunningTime="2026-02-19 21:44:15.594860948 +0000 UTC m=+955.866303438" watchObservedRunningTime="2026-02-19 21:44:15.600195139 +0000 UTC m=+955.871637669" Feb 19 21:44:15 crc kubenswrapper[4771]: I0219 21:44:15.626800 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-s6929" podStartSLOduration=1.8739127899999999 podStartE2EDuration="6.626773961s" podCreationTimestamp="2026-02-19 21:44:09 +0000 UTC" firstStartedPulling="2026-02-19 21:44:10.511478719 +0000 UTC m=+950.782921199" lastFinishedPulling="2026-02-19 21:44:15.26433986 +0000 UTC m=+955.535782370" observedRunningTime="2026-02-19 21:44:15.625570729 +0000 UTC m=+955.897013239" watchObservedRunningTime="2026-02-19 21:44:15.626773961 +0000 UTC m=+955.898216451" Feb 19 21:44:20 crc kubenswrapper[4771]: I0219 21:44:20.086454 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-s6929" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.174850 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-w5psz"] Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.177747 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-w5psz" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.180980 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qgcjc" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.196143 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-w5psz"] Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.331070 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a33d99d-a153-489c-925b-1fd6a1d0cdf7-bound-sa-token\") pod \"cert-manager-545d4d4674-w5psz\" (UID: \"5a33d99d-a153-489c-925b-1fd6a1d0cdf7\") " pod="cert-manager/cert-manager-545d4d4674-w5psz" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.331156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88dh\" (UniqueName: \"kubernetes.io/projected/5a33d99d-a153-489c-925b-1fd6a1d0cdf7-kube-api-access-d88dh\") pod \"cert-manager-545d4d4674-w5psz\" (UID: \"5a33d99d-a153-489c-925b-1fd6a1d0cdf7\") " pod="cert-manager/cert-manager-545d4d4674-w5psz" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.432985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88dh\" (UniqueName: \"kubernetes.io/projected/5a33d99d-a153-489c-925b-1fd6a1d0cdf7-kube-api-access-d88dh\") pod \"cert-manager-545d4d4674-w5psz\" (UID: \"5a33d99d-a153-489c-925b-1fd6a1d0cdf7\") " pod="cert-manager/cert-manager-545d4d4674-w5psz" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.433198 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a33d99d-a153-489c-925b-1fd6a1d0cdf7-bound-sa-token\") pod \"cert-manager-545d4d4674-w5psz\" (UID: 
\"5a33d99d-a153-489c-925b-1fd6a1d0cdf7\") " pod="cert-manager/cert-manager-545d4d4674-w5psz" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.456224 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88dh\" (UniqueName: \"kubernetes.io/projected/5a33d99d-a153-489c-925b-1fd6a1d0cdf7-kube-api-access-d88dh\") pod \"cert-manager-545d4d4674-w5psz\" (UID: \"5a33d99d-a153-489c-925b-1fd6a1d0cdf7\") " pod="cert-manager/cert-manager-545d4d4674-w5psz" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.460524 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a33d99d-a153-489c-925b-1fd6a1d0cdf7-bound-sa-token\") pod \"cert-manager-545d4d4674-w5psz\" (UID: \"5a33d99d-a153-489c-925b-1fd6a1d0cdf7\") " pod="cert-manager/cert-manager-545d4d4674-w5psz" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.533232 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-w5psz" Feb 19 21:44:21 crc kubenswrapper[4771]: I0219 21:44:21.809615 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-w5psz"] Feb 19 21:44:22 crc kubenswrapper[4771]: I0219 21:44:22.642693 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-w5psz" event={"ID":"5a33d99d-a153-489c-925b-1fd6a1d0cdf7","Type":"ContainerStarted","Data":"a529d801eba3ecd139767ad3960c3907ccf320b4aa99015c0eaa3d43a1f77253"} Feb 19 21:44:22 crc kubenswrapper[4771]: I0219 21:44:22.643006 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-w5psz" event={"ID":"5a33d99d-a153-489c-925b-1fd6a1d0cdf7","Type":"ContainerStarted","Data":"dd3e2fefc545ee5d075a7d9d32e7d0b842d959d4a475a8d512a79fcf3aa01da1"} Feb 19 21:44:22 crc kubenswrapper[4771]: I0219 21:44:22.660002 4771 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-545d4d4674-w5psz" podStartSLOduration=1.6599872169999998 podStartE2EDuration="1.659987217s" podCreationTimestamp="2026-02-19 21:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:44:22.65746262 +0000 UTC m=+962.928905090" watchObservedRunningTime="2026-02-19 21:44:22.659987217 +0000 UTC m=+962.931429687" Feb 19 21:44:33 crc kubenswrapper[4771]: I0219 21:44:33.992225 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7578n"] Feb 19 21:44:33 crc kubenswrapper[4771]: I0219 21:44:33.993558 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7578n" Feb 19 21:44:34 crc kubenswrapper[4771]: I0219 21:44:33.997335 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 21:44:34 crc kubenswrapper[4771]: I0219 21:44:33.998089 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 21:44:34 crc kubenswrapper[4771]: I0219 21:44:34.000612 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-s8sjb" Feb 19 21:44:34 crc kubenswrapper[4771]: I0219 21:44:34.022563 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7578n"] Feb 19 21:44:34 crc kubenswrapper[4771]: I0219 21:44:34.140180 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdg5d\" (UniqueName: \"kubernetes.io/projected/f5ba7aed-c8fe-4a6d-81de-102e42eb8975-kube-api-access-sdg5d\") pod \"openstack-operator-index-7578n\" (UID: \"f5ba7aed-c8fe-4a6d-81de-102e42eb8975\") " pod="openstack-operators/openstack-operator-index-7578n" Feb 19 21:44:34 
crc kubenswrapper[4771]: I0219 21:44:34.241599 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdg5d\" (UniqueName: \"kubernetes.io/projected/f5ba7aed-c8fe-4a6d-81de-102e42eb8975-kube-api-access-sdg5d\") pod \"openstack-operator-index-7578n\" (UID: \"f5ba7aed-c8fe-4a6d-81de-102e42eb8975\") " pod="openstack-operators/openstack-operator-index-7578n" Feb 19 21:44:34 crc kubenswrapper[4771]: I0219 21:44:34.269525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdg5d\" (UniqueName: \"kubernetes.io/projected/f5ba7aed-c8fe-4a6d-81de-102e42eb8975-kube-api-access-sdg5d\") pod \"openstack-operator-index-7578n\" (UID: \"f5ba7aed-c8fe-4a6d-81de-102e42eb8975\") " pod="openstack-operators/openstack-operator-index-7578n" Feb 19 21:44:34 crc kubenswrapper[4771]: I0219 21:44:34.313922 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7578n" Feb 19 21:44:34 crc kubenswrapper[4771]: I0219 21:44:34.834161 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7578n"] Feb 19 21:44:35 crc kubenswrapper[4771]: I0219 21:44:35.763315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7578n" event={"ID":"f5ba7aed-c8fe-4a6d-81de-102e42eb8975","Type":"ContainerStarted","Data":"28e076497b68cb3231f2d756c1bdf3b66d9d6ed7aafed683ea5ca81cf751e02e"} Feb 19 21:44:36 crc kubenswrapper[4771]: I0219 21:44:36.783410 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7578n" event={"ID":"f5ba7aed-c8fe-4a6d-81de-102e42eb8975","Type":"ContainerStarted","Data":"f9e8649e655ee08293c4676833a4f52d85cd6ecfca09d643e70853b6d32816ee"} Feb 19 21:44:36 crc kubenswrapper[4771]: I0219 21:44:36.807752 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-7578n" podStartSLOduration=2.92736019 podStartE2EDuration="3.807732927s" podCreationTimestamp="2026-02-19 21:44:33 +0000 UTC" firstStartedPulling="2026-02-19 21:44:34.844876417 +0000 UTC m=+975.116318927" lastFinishedPulling="2026-02-19 21:44:35.725249194 +0000 UTC m=+975.996691664" observedRunningTime="2026-02-19 21:44:36.80105004 +0000 UTC m=+977.072492520" watchObservedRunningTime="2026-02-19 21:44:36.807732927 +0000 UTC m=+977.079175397" Feb 19 21:44:37 crc kubenswrapper[4771]: I0219 21:44:37.349465 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7578n"] Feb 19 21:44:37 crc kubenswrapper[4771]: I0219 21:44:37.961415 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qw4qh"] Feb 19 21:44:37 crc kubenswrapper[4771]: I0219 21:44:37.962138 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:37 crc kubenswrapper[4771]: I0219 21:44:37.980387 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qw4qh"] Feb 19 21:44:38 crc kubenswrapper[4771]: I0219 21:44:38.099211 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjr7w\" (UniqueName: \"kubernetes.io/projected/d4cb01c1-d9ee-4a5a-b125-bf428e3ced10-kube-api-access-jjr7w\") pod \"openstack-operator-index-qw4qh\" (UID: \"d4cb01c1-d9ee-4a5a-b125-bf428e3ced10\") " pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:38 crc kubenswrapper[4771]: I0219 21:44:38.200298 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjr7w\" (UniqueName: \"kubernetes.io/projected/d4cb01c1-d9ee-4a5a-b125-bf428e3ced10-kube-api-access-jjr7w\") pod \"openstack-operator-index-qw4qh\" (UID: 
\"d4cb01c1-d9ee-4a5a-b125-bf428e3ced10\") " pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:38 crc kubenswrapper[4771]: I0219 21:44:38.240635 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjr7w\" (UniqueName: \"kubernetes.io/projected/d4cb01c1-d9ee-4a5a-b125-bf428e3ced10-kube-api-access-jjr7w\") pod \"openstack-operator-index-qw4qh\" (UID: \"d4cb01c1-d9ee-4a5a-b125-bf428e3ced10\") " pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:38 crc kubenswrapper[4771]: I0219 21:44:38.295501 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:38 crc kubenswrapper[4771]: I0219 21:44:38.563477 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qw4qh"] Feb 19 21:44:38 crc kubenswrapper[4771]: W0219 21:44:38.568200 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4cb01c1_d9ee_4a5a_b125_bf428e3ced10.slice/crio-707c32d5af0dd6d2a450870982f9011edf63d8689395d3b025cb4725ccc9e88f WatchSource:0}: Error finding container 707c32d5af0dd6d2a450870982f9011edf63d8689395d3b025cb4725ccc9e88f: Status 404 returned error can't find the container with id 707c32d5af0dd6d2a450870982f9011edf63d8689395d3b025cb4725ccc9e88f Feb 19 21:44:38 crc kubenswrapper[4771]: I0219 21:44:38.797985 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qw4qh" event={"ID":"d4cb01c1-d9ee-4a5a-b125-bf428e3ced10","Type":"ContainerStarted","Data":"707c32d5af0dd6d2a450870982f9011edf63d8689395d3b025cb4725ccc9e88f"} Feb 19 21:44:38 crc kubenswrapper[4771]: I0219 21:44:38.798136 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7578n" podUID="f5ba7aed-c8fe-4a6d-81de-102e42eb8975" 
containerName="registry-server" containerID="cri-o://f9e8649e655ee08293c4676833a4f52d85cd6ecfca09d643e70853b6d32816ee" gracePeriod=2 Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.810108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qw4qh" event={"ID":"d4cb01c1-d9ee-4a5a-b125-bf428e3ced10","Type":"ContainerStarted","Data":"f987a1df74c63369380b1da0b61e1679e1c1319464904623bcc2ed51e1504086"} Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.814117 4771 generic.go:334] "Generic (PLEG): container finished" podID="f5ba7aed-c8fe-4a6d-81de-102e42eb8975" containerID="f9e8649e655ee08293c4676833a4f52d85cd6ecfca09d643e70853b6d32816ee" exitCode=0 Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.814187 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7578n" event={"ID":"f5ba7aed-c8fe-4a6d-81de-102e42eb8975","Type":"ContainerDied","Data":"f9e8649e655ee08293c4676833a4f52d85cd6ecfca09d643e70853b6d32816ee"} Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.814225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7578n" event={"ID":"f5ba7aed-c8fe-4a6d-81de-102e42eb8975","Type":"ContainerDied","Data":"28e076497b68cb3231f2d756c1bdf3b66d9d6ed7aafed683ea5ca81cf751e02e"} Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.814248 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28e076497b68cb3231f2d756c1bdf3b66d9d6ed7aafed683ea5ca81cf751e02e" Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.821145 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7578n" Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.847100 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qw4qh" podStartSLOduration=2.3535514109999998 podStartE2EDuration="2.847073112s" podCreationTimestamp="2026-02-19 21:44:37 +0000 UTC" firstStartedPulling="2026-02-19 21:44:38.573346249 +0000 UTC m=+978.844788729" lastFinishedPulling="2026-02-19 21:44:39.06686792 +0000 UTC m=+979.338310430" observedRunningTime="2026-02-19 21:44:39.836280008 +0000 UTC m=+980.107722558" watchObservedRunningTime="2026-02-19 21:44:39.847073112 +0000 UTC m=+980.118515622" Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.933337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdg5d\" (UniqueName: \"kubernetes.io/projected/f5ba7aed-c8fe-4a6d-81de-102e42eb8975-kube-api-access-sdg5d\") pod \"f5ba7aed-c8fe-4a6d-81de-102e42eb8975\" (UID: \"f5ba7aed-c8fe-4a6d-81de-102e42eb8975\") " Feb 19 21:44:39 crc kubenswrapper[4771]: I0219 21:44:39.942450 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ba7aed-c8fe-4a6d-81de-102e42eb8975-kube-api-access-sdg5d" (OuterVolumeSpecName: "kube-api-access-sdg5d") pod "f5ba7aed-c8fe-4a6d-81de-102e42eb8975" (UID: "f5ba7aed-c8fe-4a6d-81de-102e42eb8975"). InnerVolumeSpecName "kube-api-access-sdg5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:44:40 crc kubenswrapper[4771]: I0219 21:44:40.035131 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdg5d\" (UniqueName: \"kubernetes.io/projected/f5ba7aed-c8fe-4a6d-81de-102e42eb8975-kube-api-access-sdg5d\") on node \"crc\" DevicePath \"\"" Feb 19 21:44:40 crc kubenswrapper[4771]: I0219 21:44:40.825653 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7578n" Feb 19 21:44:40 crc kubenswrapper[4771]: I0219 21:44:40.855932 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7578n"] Feb 19 21:44:40 crc kubenswrapper[4771]: I0219 21:44:40.864235 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7578n"] Feb 19 21:44:42 crc kubenswrapper[4771]: I0219 21:44:42.450735 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ba7aed-c8fe-4a6d-81de-102e42eb8975" path="/var/lib/kubelet/pods/f5ba7aed-c8fe-4a6d-81de-102e42eb8975/volumes" Feb 19 21:44:48 crc kubenswrapper[4771]: I0219 21:44:48.296328 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:48 crc kubenswrapper[4771]: I0219 21:44:48.297231 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:48 crc kubenswrapper[4771]: I0219 21:44:48.348562 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:48 crc kubenswrapper[4771]: I0219 21:44:48.949309 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qw4qh" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.082151 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt"] Feb 19 21:44:55 crc kubenswrapper[4771]: E0219 21:44:55.083216 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ba7aed-c8fe-4a6d-81de-102e42eb8975" containerName="registry-server" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.083264 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f5ba7aed-c8fe-4a6d-81de-102e42eb8975" containerName="registry-server" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.083536 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ba7aed-c8fe-4a6d-81de-102e42eb8975" containerName="registry-server" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.085314 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.090483 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-z96jq" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.101138 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt"] Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.238164 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hdlx\" (UniqueName: \"kubernetes.io/projected/f9a3474a-871f-4fac-9776-cf660e3f2824-kube-api-access-2hdlx\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.238210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.238231 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.340148 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hdlx\" (UniqueName: \"kubernetes.io/projected/f9a3474a-871f-4fac-9776-cf660e3f2824-kube-api-access-2hdlx\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.340645 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.341374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.341563 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-bundle\") pod 
\"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.341771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.368604 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hdlx\" (UniqueName: \"kubernetes.io/projected/f9a3474a-871f-4fac-9776-cf660e3f2824-kube-api-access-2hdlx\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.414343 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.896373 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt"] Feb 19 21:44:55 crc kubenswrapper[4771]: I0219 21:44:55.957236 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" event={"ID":"f9a3474a-871f-4fac-9776-cf660e3f2824","Type":"ContainerStarted","Data":"c877ff888f2dbdce976c214a89cb205eee08472b2d32a76377835c8b8b6a3f14"} Feb 19 21:44:56 crc kubenswrapper[4771]: I0219 21:44:56.969575 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerID="f293f66fc3ba74de35d32e95bbe11e5565aabd0e54f5be509acb4d25e75ae1ae" exitCode=0 Feb 19 21:44:56 crc kubenswrapper[4771]: I0219 21:44:56.969695 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" event={"ID":"f9a3474a-871f-4fac-9776-cf660e3f2824","Type":"ContainerDied","Data":"f293f66fc3ba74de35d32e95bbe11e5565aabd0e54f5be509acb4d25e75ae1ae"} Feb 19 21:44:57 crc kubenswrapper[4771]: I0219 21:44:57.982766 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerID="084bcc5cfcfe02e704b4a5506acdd1e5d5cb1b0b9c2ae5c90bb82c63c984bf5f" exitCode=0 Feb 19 21:44:57 crc kubenswrapper[4771]: I0219 21:44:57.982950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" event={"ID":"f9a3474a-871f-4fac-9776-cf660e3f2824","Type":"ContainerDied","Data":"084bcc5cfcfe02e704b4a5506acdd1e5d5cb1b0b9c2ae5c90bb82c63c984bf5f"} Feb 19 21:44:58 crc kubenswrapper[4771]: I0219 21:44:58.991689 4771 generic.go:334] 
"Generic (PLEG): container finished" podID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerID="11a8d3bc93954a27b4bb293ae9ef5ab37817f93e1bf1b9d28b208066fcd5ba96" exitCode=0 Feb 19 21:44:58 crc kubenswrapper[4771]: I0219 21:44:58.991767 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" event={"ID":"f9a3474a-871f-4fac-9776-cf660e3f2824","Type":"ContainerDied","Data":"11a8d3bc93954a27b4bb293ae9ef5ab37817f93e1bf1b9d28b208066fcd5ba96"} Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.187801 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx"] Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.191694 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.196619 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.196735 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.199610 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx"] Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.276438 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.319846 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d455c12-5554-439d-ac6e-61b738c8ea5c-secret-volume\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.319890 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d455c12-5554-439d-ac6e-61b738c8ea5c-config-volume\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.319931 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg79s\" (UniqueName: \"kubernetes.io/projected/7d455c12-5554-439d-ac6e-61b738c8ea5c-kube-api-access-cg79s\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.420871 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hdlx\" (UniqueName: \"kubernetes.io/projected/f9a3474a-871f-4fac-9776-cf660e3f2824-kube-api-access-2hdlx\") pod \"f9a3474a-871f-4fac-9776-cf660e3f2824\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.420951 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-util\") pod \"f9a3474a-871f-4fac-9776-cf660e3f2824\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.421012 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-bundle\") pod \"f9a3474a-871f-4fac-9776-cf660e3f2824\" (UID: \"f9a3474a-871f-4fac-9776-cf660e3f2824\") " Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.421324 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d455c12-5554-439d-ac6e-61b738c8ea5c-secret-volume\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.421370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d455c12-5554-439d-ac6e-61b738c8ea5c-config-volume\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.421428 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg79s\" (UniqueName: \"kubernetes.io/projected/7d455c12-5554-439d-ac6e-61b738c8ea5c-kube-api-access-cg79s\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.422257 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-bundle" 
(OuterVolumeSpecName: "bundle") pod "f9a3474a-871f-4fac-9776-cf660e3f2824" (UID: "f9a3474a-871f-4fac-9776-cf660e3f2824"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.423444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d455c12-5554-439d-ac6e-61b738c8ea5c-config-volume\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.429920 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a3474a-871f-4fac-9776-cf660e3f2824-kube-api-access-2hdlx" (OuterVolumeSpecName: "kube-api-access-2hdlx") pod "f9a3474a-871f-4fac-9776-cf660e3f2824" (UID: "f9a3474a-871f-4fac-9776-cf660e3f2824"). InnerVolumeSpecName "kube-api-access-2hdlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.430090 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d455c12-5554-439d-ac6e-61b738c8ea5c-secret-volume\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.441406 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg79s\" (UniqueName: \"kubernetes.io/projected/7d455c12-5554-439d-ac6e-61b738c8ea5c-kube-api-access-cg79s\") pod \"collect-profiles-29525625-pl4kx\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.449568 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-util" (OuterVolumeSpecName: "util") pod "f9a3474a-871f-4fac-9776-cf660e3f2824" (UID: "f9a3474a-871f-4fac-9776-cf660e3f2824"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.516893 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.522773 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.522820 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hdlx\" (UniqueName: \"kubernetes.io/projected/f9a3474a-871f-4fac-9776-cf660e3f2824-kube-api-access-2hdlx\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.522842 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a3474a-871f-4fac-9776-cf660e3f2824-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:00 crc kubenswrapper[4771]: I0219 21:45:00.801871 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx"] Feb 19 21:45:00 crc kubenswrapper[4771]: W0219 21:45:00.803632 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d455c12_5554_439d_ac6e_61b738c8ea5c.slice/crio-32637947c655bd14da458cdeb0f939ae63695a0bd73b60753958a0740781d1bf WatchSource:0}: Error finding container 32637947c655bd14da458cdeb0f939ae63695a0bd73b60753958a0740781d1bf: Status 404 returned error can't find the container with id 32637947c655bd14da458cdeb0f939ae63695a0bd73b60753958a0740781d1bf Feb 19 21:45:01 crc kubenswrapper[4771]: I0219 21:45:01.005977 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" event={"ID":"7d455c12-5554-439d-ac6e-61b738c8ea5c","Type":"ContainerStarted","Data":"d910dccaf47ede0010e7a4440a4c1aff300bfd58af602a4d9af3ad760c05385d"} Feb 19 21:45:01 crc 
kubenswrapper[4771]: I0219 21:45:01.006524 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" event={"ID":"7d455c12-5554-439d-ac6e-61b738c8ea5c","Type":"ContainerStarted","Data":"32637947c655bd14da458cdeb0f939ae63695a0bd73b60753958a0740781d1bf"} Feb 19 21:45:01 crc kubenswrapper[4771]: I0219 21:45:01.010098 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" event={"ID":"f9a3474a-871f-4fac-9776-cf660e3f2824","Type":"ContainerDied","Data":"c877ff888f2dbdce976c214a89cb205eee08472b2d32a76377835c8b8b6a3f14"} Feb 19 21:45:01 crc kubenswrapper[4771]: I0219 21:45:01.010142 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt" Feb 19 21:45:01 crc kubenswrapper[4771]: I0219 21:45:01.010160 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c877ff888f2dbdce976c214a89cb205eee08472b2d32a76377835c8b8b6a3f14" Feb 19 21:45:01 crc kubenswrapper[4771]: I0219 21:45:01.028266 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" podStartSLOduration=1.028240252 podStartE2EDuration="1.028240252s" podCreationTimestamp="2026-02-19 21:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:01.026627119 +0000 UTC m=+1001.298069589" watchObservedRunningTime="2026-02-19 21:45:01.028240252 +0000 UTC m=+1001.299682722" Feb 19 21:45:02 crc kubenswrapper[4771]: I0219 21:45:02.020672 4771 generic.go:334] "Generic (PLEG): container finished" podID="7d455c12-5554-439d-ac6e-61b738c8ea5c" containerID="d910dccaf47ede0010e7a4440a4c1aff300bfd58af602a4d9af3ad760c05385d" exitCode=0 Feb 19 
21:45:02 crc kubenswrapper[4771]: I0219 21:45:02.020721 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" event={"ID":"7d455c12-5554-439d-ac6e-61b738c8ea5c","Type":"ContainerDied","Data":"d910dccaf47ede0010e7a4440a4c1aff300bfd58af602a4d9af3ad760c05385d"} Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.367003 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.474158 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d455c12-5554-439d-ac6e-61b738c8ea5c-secret-volume\") pod \"7d455c12-5554-439d-ac6e-61b738c8ea5c\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.474219 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d455c12-5554-439d-ac6e-61b738c8ea5c-config-volume\") pod \"7d455c12-5554-439d-ac6e-61b738c8ea5c\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.474320 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg79s\" (UniqueName: \"kubernetes.io/projected/7d455c12-5554-439d-ac6e-61b738c8ea5c-kube-api-access-cg79s\") pod \"7d455c12-5554-439d-ac6e-61b738c8ea5c\" (UID: \"7d455c12-5554-439d-ac6e-61b738c8ea5c\") " Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.475256 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d455c12-5554-439d-ac6e-61b738c8ea5c-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d455c12-5554-439d-ac6e-61b738c8ea5c" (UID: "7d455c12-5554-439d-ac6e-61b738c8ea5c"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.481780 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d455c12-5554-439d-ac6e-61b738c8ea5c-kube-api-access-cg79s" (OuterVolumeSpecName: "kube-api-access-cg79s") pod "7d455c12-5554-439d-ac6e-61b738c8ea5c" (UID: "7d455c12-5554-439d-ac6e-61b738c8ea5c"). InnerVolumeSpecName "kube-api-access-cg79s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.482555 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d455c12-5554-439d-ac6e-61b738c8ea5c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d455c12-5554-439d-ac6e-61b738c8ea5c" (UID: "7d455c12-5554-439d-ac6e-61b738c8ea5c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.575837 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d455c12-5554-439d-ac6e-61b738c8ea5c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.575887 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg79s\" (UniqueName: \"kubernetes.io/projected/7d455c12-5554-439d-ac6e-61b738c8ea5c-kube-api-access-cg79s\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:03 crc kubenswrapper[4771]: I0219 21:45:03.575908 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d455c12-5554-439d-ac6e-61b738c8ea5c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:04 crc kubenswrapper[4771]: I0219 21:45:04.043220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" 
event={"ID":"7d455c12-5554-439d-ac6e-61b738c8ea5c","Type":"ContainerDied","Data":"32637947c655bd14da458cdeb0f939ae63695a0bd73b60753958a0740781d1bf"} Feb 19 21:45:04 crc kubenswrapper[4771]: I0219 21:45:04.043281 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32637947c655bd14da458cdeb0f939ae63695a0bd73b60753958a0740781d1bf" Feb 19 21:45:04 crc kubenswrapper[4771]: I0219 21:45:04.043361 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.286528 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2"] Feb 19 21:45:07 crc kubenswrapper[4771]: E0219 21:45:07.287463 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerName="extract" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.287486 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerName="extract" Feb 19 21:45:07 crc kubenswrapper[4771]: E0219 21:45:07.287520 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerName="util" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.287532 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerName="util" Feb 19 21:45:07 crc kubenswrapper[4771]: E0219 21:45:07.287549 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d455c12-5554-439d-ac6e-61b738c8ea5c" containerName="collect-profiles" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.287560 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d455c12-5554-439d-ac6e-61b738c8ea5c" containerName="collect-profiles" Feb 19 21:45:07 crc kubenswrapper[4771]: E0219 21:45:07.287583 
4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerName="pull" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.287594 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerName="pull" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.287767 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a3474a-871f-4fac-9776-cf660e3f2824" containerName="extract" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.287801 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d455c12-5554-439d-ac6e-61b738c8ea5c" containerName="collect-profiles" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.288457 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.290963 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-5gq2m" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.319566 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2"] Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.460478 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9thfc\" (UniqueName: \"kubernetes.io/projected/ab9851f8-bb6b-4d51-aed3-b67fd5a3044c-kube-api-access-9thfc\") pod \"openstack-operator-controller-init-6679bf9b57-hjfl2\" (UID: \"ab9851f8-bb6b-4d51-aed3-b67fd5a3044c\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.561686 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9thfc\" (UniqueName: 
\"kubernetes.io/projected/ab9851f8-bb6b-4d51-aed3-b67fd5a3044c-kube-api-access-9thfc\") pod \"openstack-operator-controller-init-6679bf9b57-hjfl2\" (UID: \"ab9851f8-bb6b-4d51-aed3-b67fd5a3044c\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.586861 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9thfc\" (UniqueName: \"kubernetes.io/projected/ab9851f8-bb6b-4d51-aed3-b67fd5a3044c-kube-api-access-9thfc\") pod \"openstack-operator-controller-init-6679bf9b57-hjfl2\" (UID: \"ab9851f8-bb6b-4d51-aed3-b67fd5a3044c\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.614126 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" Feb 19 21:45:07 crc kubenswrapper[4771]: I0219 21:45:07.826141 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2"] Feb 19 21:45:08 crc kubenswrapper[4771]: I0219 21:45:08.072147 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" event={"ID":"ab9851f8-bb6b-4d51-aed3-b67fd5a3044c","Type":"ContainerStarted","Data":"89925770b150489aad5fbf9fb45f8c00f98b9fa924cd3a24ee4af06e19e0da9c"} Feb 19 21:45:12 crc kubenswrapper[4771]: I0219 21:45:12.957110 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:45:12 crc kubenswrapper[4771]: I0219 21:45:12.957428 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:45:13 crc kubenswrapper[4771]: I0219 21:45:13.104201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" event={"ID":"ab9851f8-bb6b-4d51-aed3-b67fd5a3044c","Type":"ContainerStarted","Data":"70bfdbab6b05242244e1e44e60def0645dad447807888bdcad0f3959b0c7750c"} Feb 19 21:45:13 crc kubenswrapper[4771]: I0219 21:45:13.104421 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" Feb 19 21:45:13 crc kubenswrapper[4771]: I0219 21:45:13.162536 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" podStartSLOduration=1.5208341490000001 podStartE2EDuration="6.162509505s" podCreationTimestamp="2026-02-19 21:45:07 +0000 UTC" firstStartedPulling="2026-02-19 21:45:07.831432464 +0000 UTC m=+1008.102874944" lastFinishedPulling="2026-02-19 21:45:12.47310779 +0000 UTC m=+1012.744550300" observedRunningTime="2026-02-19 21:45:13.150441026 +0000 UTC m=+1013.421883516" watchObservedRunningTime="2026-02-19 21:45:13.162509505 +0000 UTC m=+1013.433952005" Feb 19 21:45:17 crc kubenswrapper[4771]: I0219 21:45:17.617348 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-hjfl2" Feb 19 21:45:42 crc kubenswrapper[4771]: I0219 21:45:42.956763 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:45:42 crc kubenswrapper[4771]: I0219 21:45:42.957251 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.198583 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.199601 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.201665 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9jn9g" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.213275 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.213964 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.216335 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-s96ss" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.218179 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.221900 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.230688 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.231645 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.237906 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gnrsf" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.265602 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.295397 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-2qr26"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.296263 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.298464 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-lllmw" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.316065 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-2qr26"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.324073 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.324822 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.327786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbtq6\" (UniqueName: \"kubernetes.io/projected/747c00a6-dc75-476f-bdcd-b24d58b2fbe8-kube-api-access-tbtq6\") pod \"cinder-operator-controller-manager-5d946d989d-6jvq7\" (UID: \"747c00a6-dc75-476f-bdcd-b24d58b2fbe8\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.327862 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwp2r\" (UniqueName: \"kubernetes.io/projected/1a3c4ae6-54be-46a2-93e6-db74ebf892e3-kube-api-access-qwp2r\") pod \"barbican-operator-controller-manager-868647ff47-gvxfm\" (UID: \"1a3c4ae6-54be-46a2-93e6-db74ebf892e3\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.327920 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8f72\" (UniqueName: \"kubernetes.io/projected/93b55752-3270-48d6-a5b2-6fbf12729651-kube-api-access-t8f72\") pod \"designate-operator-controller-manager-6d8bf5c495-hsg94\" (UID: \"93b55752-3270-48d6-a5b2-6fbf12729651\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.333375 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rbzc2" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.367331 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.371156 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.371976 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.377759 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qdbtf" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.400218 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.426489 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.438751 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kpz4n" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.438965 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.472406 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64xv\" (UniqueName: \"kubernetes.io/projected/7bcef2a5-3cd6-4add-bf80-92e6eb274058-kube-api-access-s64xv\") pod \"glance-operator-controller-manager-77987464f4-2qr26\" (UID: \"7bcef2a5-3cd6-4add-bf80-92e6eb274058\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.472474 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbtq6\" (UniqueName: \"kubernetes.io/projected/747c00a6-dc75-476f-bdcd-b24d58b2fbe8-kube-api-access-tbtq6\") pod \"cinder-operator-controller-manager-5d946d989d-6jvq7\" (UID: \"747c00a6-dc75-476f-bdcd-b24d58b2fbe8\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.472527 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwp2r\" (UniqueName: \"kubernetes.io/projected/1a3c4ae6-54be-46a2-93e6-db74ebf892e3-kube-api-access-qwp2r\") pod \"barbican-operator-controller-manager-868647ff47-gvxfm\" (UID: \"1a3c4ae6-54be-46a2-93e6-db74ebf892e3\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.472560 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65crc\" (UniqueName: \"kubernetes.io/projected/a7392724-153e-4b25-a984-bff0f841aac6-kube-api-access-65crc\") pod \"heat-operator-controller-manager-69f49c598c-vdnzp\" (UID: \"a7392724-153e-4b25-a984-bff0f841aac6\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.472628 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8f72\" (UniqueName: \"kubernetes.io/projected/93b55752-3270-48d6-a5b2-6fbf12729651-kube-api-access-t8f72\") pod \"designate-operator-controller-manager-6d8bf5c495-hsg94\" (UID: \"93b55752-3270-48d6-a5b2-6fbf12729651\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.503307 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbtq6\" (UniqueName: \"kubernetes.io/projected/747c00a6-dc75-476f-bdcd-b24d58b2fbe8-kube-api-access-tbtq6\") pod \"cinder-operator-controller-manager-5d946d989d-6jvq7\" (UID: \"747c00a6-dc75-476f-bdcd-b24d58b2fbe8\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.514104 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.517620 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8f72\" (UniqueName: \"kubernetes.io/projected/93b55752-3270-48d6-a5b2-6fbf12729651-kube-api-access-t8f72\") pod \"designate-operator-controller-manager-6d8bf5c495-hsg94\" (UID: \"93b55752-3270-48d6-a5b2-6fbf12729651\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 
21:45:57.517636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwp2r\" (UniqueName: \"kubernetes.io/projected/1a3c4ae6-54be-46a2-93e6-db74ebf892e3-kube-api-access-qwp2r\") pod \"barbican-operator-controller-manager-868647ff47-gvxfm\" (UID: \"1a3c4ae6-54be-46a2-93e6-db74ebf892e3\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.529447 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.529881 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.554519 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.563082 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.563910 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.574628 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktltz\" (UniqueName: \"kubernetes.io/projected/1f5bbacb-8d9b-4289-938a-05e191d519f9-kube-api-access-ktltz\") pod \"horizon-operator-controller-manager-5b9b8895d5-jvbwn\" (UID: \"1f5bbacb-8d9b-4289-938a-05e191d519f9\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.574683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64xv\" (UniqueName: \"kubernetes.io/projected/7bcef2a5-3cd6-4add-bf80-92e6eb274058-kube-api-access-s64xv\") pod \"glance-operator-controller-manager-77987464f4-2qr26\" (UID: \"7bcef2a5-3cd6-4add-bf80-92e6eb274058\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.574722 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvp9q\" (UniqueName: \"kubernetes.io/projected/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-kube-api-access-lvp9q\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.574748 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65crc\" (UniqueName: \"kubernetes.io/projected/a7392724-153e-4b25-a984-bff0f841aac6-kube-api-access-65crc\") pod \"heat-operator-controller-manager-69f49c598c-vdnzp\" (UID: \"a7392724-153e-4b25-a984-bff0f841aac6\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" Feb 19 21:45:57 crc 
kubenswrapper[4771]: I0219 21:45:57.574764 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.578646 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cbxsz" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.650725 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65crc\" (UniqueName: \"kubernetes.io/projected/a7392724-153e-4b25-a984-bff0f841aac6-kube-api-access-65crc\") pod \"heat-operator-controller-manager-69f49c598c-vdnzp\" (UID: \"a7392724-153e-4b25-a984-bff0f841aac6\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.650752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64xv\" (UniqueName: \"kubernetes.io/projected/7bcef2a5-3cd6-4add-bf80-92e6eb274058-kube-api-access-s64xv\") pod \"glance-operator-controller-manager-77987464f4-2qr26\" (UID: \"7bcef2a5-3cd6-4add-bf80-92e6eb274058\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.676436 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 
21:45:57.676490 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghg5v\" (UniqueName: \"kubernetes.io/projected/fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd-kube-api-access-ghg5v\") pod \"ironic-operator-controller-manager-554564d7fc-g92n5\" (UID: \"fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.676555 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktltz\" (UniqueName: \"kubernetes.io/projected/1f5bbacb-8d9b-4289-938a-05e191d519f9-kube-api-access-ktltz\") pod \"horizon-operator-controller-manager-5b9b8895d5-jvbwn\" (UID: \"1f5bbacb-8d9b-4289-938a-05e191d519f9\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.676601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvp9q\" (UniqueName: \"kubernetes.io/projected/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-kube-api-access-lvp9q\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:45:57 crc kubenswrapper[4771]: E0219 21:45:57.676936 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:45:57 crc kubenswrapper[4771]: E0219 21:45:57.676981 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert podName:3bfa784f-72fd-4797-b5ec-ea4eaeaefffa nodeName:}" failed. No retries permitted until 2026-02-19 21:45:58.176963469 +0000 UTC m=+1058.448405939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert") pod "infra-operator-controller-manager-79d975b745-l6wgq" (UID: "3bfa784f-72fd-4797-b5ec-ea4eaeaefffa") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.680480 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.694294 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.695141 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.702873 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gck7x" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.717110 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.718591 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvp9q\" (UniqueName: \"kubernetes.io/projected/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-kube-api-access-lvp9q\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.733621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktltz\" (UniqueName: 
\"kubernetes.io/projected/1f5bbacb-8d9b-4289-938a-05e191d519f9-kube-api-access-ktltz\") pod \"horizon-operator-controller-manager-5b9b8895d5-jvbwn\" (UID: \"1f5bbacb-8d9b-4289-938a-05e191d519f9\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.746109 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.746939 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.748688 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sblb8" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.750474 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.779685 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghg5v\" (UniqueName: \"kubernetes.io/projected/fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd-kube-api-access-ghg5v\") pod \"ironic-operator-controller-manager-554564d7fc-g92n5\" (UID: \"fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.794102 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.794910 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.802781 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-t2hwk" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.809572 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghg5v\" (UniqueName: \"kubernetes.io/projected/fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd-kube-api-access-ghg5v\") pod \"ironic-operator-controller-manager-554564d7fc-g92n5\" (UID: \"fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.813729 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.839142 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.870147 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.881784 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkz46\" (UniqueName: \"kubernetes.io/projected/2aac2409-afd9-44d5-a205-36d3f85d6ee1-kube-api-access-bkz46\") pod \"keystone-operator-controller-manager-b4d948c87-mnjh4\" (UID: \"2aac2409-afd9-44d5-a205-36d3f85d6ee1\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.881826 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-phkzg\" (UniqueName: \"kubernetes.io/projected/3242e27e-7bae-47f9-a6e4-48d21eead119-kube-api-access-phkzg\") pod \"manila-operator-controller-manager-54f6768c69-l4w4n\" (UID: \"3242e27e-7bae-47f9-a6e4-48d21eead119\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.881951 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.882784 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.889120 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-h4m2x" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.889660 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.895296 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.895837 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.896136 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.904392 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7r8h8" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.904917 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-p79dq" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.910729 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.917267 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.924745 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.944112 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.956399 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"] Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.978429 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5"
Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.983576 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkz46\" (UniqueName: \"kubernetes.io/projected/2aac2409-afd9-44d5-a205-36d3f85d6ee1-kube-api-access-bkz46\") pod \"keystone-operator-controller-manager-b4d948c87-mnjh4\" (UID: \"2aac2409-afd9-44d5-a205-36d3f85d6ee1\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4"
Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.983623 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phkzg\" (UniqueName: \"kubernetes.io/projected/3242e27e-7bae-47f9-a6e4-48d21eead119-kube-api-access-phkzg\") pod \"manila-operator-controller-manager-54f6768c69-l4w4n\" (UID: \"3242e27e-7bae-47f9-a6e4-48d21eead119\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n"
Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.983644 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d8g8\" (UniqueName: \"kubernetes.io/projected/ab86456c-c478-4751-946d-064eff52fe66-kube-api-access-9d8g8\") pod \"mariadb-operator-controller-manager-6994f66f48-krkgh\" (UID: \"ab86456c-c478-4751-946d-064eff52fe66\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh"
Feb 19 21:45:57 crc kubenswrapper[4771]: I0219 21:45:57.983664 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7lzr\" (UniqueName: \"kubernetes.io/projected/47768578-399d-48b5-bffa-a7942d4c2dcf-kube-api-access-v7lzr\") pod \"neutron-operator-controller-manager-64ddbf8bb-6mdqb\" (UID: \"47768578-399d-48b5-bffa-a7942d4c2dcf\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.020732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phkzg\" (UniqueName: \"kubernetes.io/projected/3242e27e-7bae-47f9-a6e4-48d21eead119-kube-api-access-phkzg\") pod \"manila-operator-controller-manager-54f6768c69-l4w4n\" (UID: \"3242e27e-7bae-47f9-a6e4-48d21eead119\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.021600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkz46\" (UniqueName: \"kubernetes.io/projected/2aac2409-afd9-44d5-a205-36d3f85d6ee1-kube-api-access-bkz46\") pod \"keystone-operator-controller-manager-b4d948c87-mnjh4\" (UID: \"2aac2409-afd9-44d5-a205-36d3f85d6ee1\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.053341 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.055322 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.056234 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.067440 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.067634 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-q8d4j"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.084353 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.086199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggb2h\" (UniqueName: \"kubernetes.io/projected/f9700277-9c15-42b6-9ab6-820fb9c6fc91-kube-api-access-ggb2h\") pod \"octavia-operator-controller-manager-69f8888797-2fvpw\" (UID: \"f9700277-9c15-42b6-9ab6-820fb9c6fc91\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.086269 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d8g8\" (UniqueName: \"kubernetes.io/projected/ab86456c-c478-4751-946d-064eff52fe66-kube-api-access-9d8g8\") pod \"mariadb-operator-controller-manager-6994f66f48-krkgh\" (UID: \"ab86456c-c478-4751-946d-064eff52fe66\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.086291 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7lzr\" (UniqueName: \"kubernetes.io/projected/47768578-399d-48b5-bffa-a7942d4c2dcf-kube-api-access-v7lzr\") pod \"neutron-operator-controller-manager-64ddbf8bb-6mdqb\" (UID: \"47768578-399d-48b5-bffa-a7942d4c2dcf\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.086333 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp9cv\" (UniqueName: \"kubernetes.io/projected/4fe3eefb-4705-42c0-8107-cf25db83f1d3-kube-api-access-hp9cv\") pod \"nova-operator-controller-manager-567668f5cf-qfc7w\" (UID: \"4fe3eefb-4705-42c0-8107-cf25db83f1d3\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.099164 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.100038 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.104219 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-rp4mv"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.109110 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7lzr\" (UniqueName: \"kubernetes.io/projected/47768578-399d-48b5-bffa-a7942d4c2dcf-kube-api-access-v7lzr\") pod \"neutron-operator-controller-manager-64ddbf8bb-6mdqb\" (UID: \"47768578-399d-48b5-bffa-a7942d4c2dcf\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.111767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d8g8\" (UniqueName: \"kubernetes.io/projected/ab86456c-c478-4751-946d-064eff52fe66-kube-api-access-9d8g8\") pod \"mariadb-operator-controller-manager-6994f66f48-krkgh\" (UID: \"ab86456c-c478-4751-946d-064eff52fe66\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.125236 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.135418 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.140911 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.141780 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.146241 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.148625 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xq954"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.155709 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.157916 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.160063 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.167776 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-x7phr"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.170382 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.171278 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.173529 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-fdndh"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.187233 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.187283 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb2h\" (UniqueName: \"kubernetes.io/projected/f9700277-9c15-42b6-9ab6-820fb9c6fc91-kube-api-access-ggb2h\") pod \"octavia-operator-controller-manager-69f8888797-2fvpw\" (UID: \"f9700277-9c15-42b6-9ab6-820fb9c6fc91\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.187326 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-658vw\" (UniqueName: \"kubernetes.io/projected/bfcbf7c3-8784-4275-9838-7f6b9666e49a-kube-api-access-658vw\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.187352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.187415 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp9cv\" (UniqueName: \"kubernetes.io/projected/4fe3eefb-4705-42c0-8107-cf25db83f1d3-kube-api-access-hp9cv\") pod \"nova-operator-controller-manager-567668f5cf-qfc7w\" (UID: \"4fe3eefb-4705-42c0-8107-cf25db83f1d3\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.187850 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.187892 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert podName:3bfa784f-72fd-4797-b5ec-ea4eaeaefffa nodeName:}" failed. No retries permitted until 2026-02-19 21:45:59.187876194 +0000 UTC m=+1059.459318664 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert") pod "infra-operator-controller-manager-79d975b745-l6wgq" (UID: "3bfa784f-72fd-4797-b5ec-ea4eaeaefffa") : secret "infra-operator-webhook-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.204173 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.225119 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.225946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp9cv\" (UniqueName: \"kubernetes.io/projected/4fe3eefb-4705-42c0-8107-cf25db83f1d3-kube-api-access-hp9cv\") pod \"nova-operator-controller-manager-567668f5cf-qfc7w\" (UID: \"4fe3eefb-4705-42c0-8107-cf25db83f1d3\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.233094 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-62hbk"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.241327 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.241547 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggb2h\" (UniqueName: \"kubernetes.io/projected/f9700277-9c15-42b6-9ab6-820fb9c6fc91-kube-api-access-ggb2h\") pod \"octavia-operator-controller-manager-69f8888797-2fvpw\" (UID: \"f9700277-9c15-42b6-9ab6-820fb9c6fc91\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.244587 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nrnb4"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.252968 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.254493 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.256319 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.262064 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-mr7gz"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.268486 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-62hbk"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.273625 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.289676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zfc6\" (UniqueName: \"kubernetes.io/projected/9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9-kube-api-access-8zfc6\") pod \"swift-operator-controller-manager-68f46476f-4s8tm\" (UID: \"9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.289729 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gflm5\" (UniqueName: \"kubernetes.io/projected/e9edc363-0a5c-46f0-9473-ba2bddd46ad5-kube-api-access-gflm5\") pod \"placement-operator-controller-manager-8497b45c89-9rjlg\" (UID: \"e9edc363-0a5c-46f0-9473-ba2bddd46ad5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.289777 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.289833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcrhz\" (UniqueName: \"kubernetes.io/projected/956aea9e-2072-4556-a787-8da2a7286af1-kube-api-access-qcrhz\") pod \"telemetry-operator-controller-manager-7f45b4ff68-dlg9h\" (UID: \"956aea9e-2072-4556-a787-8da2a7286af1\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.289867 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh72p\" (UniqueName: \"kubernetes.io/projected/d3777627-ddbf-47da-a0f4-33d28a04d0b7-kube-api-access-mh72p\") pod \"ovn-operator-controller-manager-d44cf6b75-26qgm\" (UID: \"d3777627-ddbf-47da-a0f4-33d28a04d0b7\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.289898 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-658vw\" (UniqueName: \"kubernetes.io/projected/bfcbf7c3-8784-4275-9838-7f6b9666e49a-kube-api-access-658vw\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.290224 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.290317 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert podName:bfcbf7c3-8784-4275-9838-7f6b9666e49a nodeName:}" failed. No retries permitted until 2026-02-19 21:45:58.790290071 +0000 UTC m=+1059.061732541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" (UID: "bfcbf7c3-8784-4275-9838-7f6b9666e49a") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.293636 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.306404 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.307687 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.313941 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.314768 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.316303 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.316473 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.316477 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-lxw2q"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.334186 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.338401 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-658vw\" (UniqueName: \"kubernetes.io/projected/bfcbf7c3-8784-4275-9838-7f6b9666e49a-kube-api-access-658vw\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.344237 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.344344 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.349689 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zf894"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.385804 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62rmc\" (UniqueName: \"kubernetes.io/projected/1f38fe26-3d54-4e26-bd41-5bf84e7e98fb-kube-api-access-62rmc\") pod \"test-operator-controller-manager-7866795846-62hbk\" (UID: \"1f38fe26-3d54-4e26-bd41-5bf84e7e98fb\") " pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391129 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktgc6\" (UniqueName: \"kubernetes.io/projected/1b9a2130-fab8-409d-a908-338f07fcd307-kube-api-access-ktgc6\") pod \"watcher-operator-controller-manager-5db88f68c-vvdzm\" (UID: \"1b9a2130-fab8-409d-a908-338f07fcd307\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391163 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcrhz\" (UniqueName: \"kubernetes.io/projected/956aea9e-2072-4556-a787-8da2a7286af1-kube-api-access-qcrhz\") pod \"telemetry-operator-controller-manager-7f45b4ff68-dlg9h\" (UID: \"956aea9e-2072-4556-a787-8da2a7286af1\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391221 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh72p\" (UniqueName: \"kubernetes.io/projected/d3777627-ddbf-47da-a0f4-33d28a04d0b7-kube-api-access-mh72p\") pod \"ovn-operator-controller-manager-d44cf6b75-26qgm\" (UID: \"d3777627-ddbf-47da-a0f4-33d28a04d0b7\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391254 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8w54\" (UniqueName: \"kubernetes.io/projected/4425ef75-1f25-4996-872b-1d4e877c0e10-kube-api-access-k8w54\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391294 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zfc6\" (UniqueName: \"kubernetes.io/projected/9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9-kube-api-access-8zfc6\") pod \"swift-operator-controller-manager-68f46476f-4s8tm\" (UID: \"9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.391338 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gflm5\" (UniqueName: \"kubernetes.io/projected/e9edc363-0a5c-46f0-9473-ba2bddd46ad5-kube-api-access-gflm5\") pod \"placement-operator-controller-manager-8497b45c89-9rjlg\" (UID: \"e9edc363-0a5c-46f0-9473-ba2bddd46ad5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.411705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcrhz\" (UniqueName: \"kubernetes.io/projected/956aea9e-2072-4556-a787-8da2a7286af1-kube-api-access-qcrhz\") pod \"telemetry-operator-controller-manager-7f45b4ff68-dlg9h\" (UID: \"956aea9e-2072-4556-a787-8da2a7286af1\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.411898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zfc6\" (UniqueName: \"kubernetes.io/projected/9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9-kube-api-access-8zfc6\") pod \"swift-operator-controller-manager-68f46476f-4s8tm\" (UID: \"9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.415670 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gflm5\" (UniqueName: \"kubernetes.io/projected/e9edc363-0a5c-46f0-9473-ba2bddd46ad5-kube-api-access-gflm5\") pod \"placement-operator-controller-manager-8497b45c89-9rjlg\" (UID: \"e9edc363-0a5c-46f0-9473-ba2bddd46ad5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.421207 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh72p\" (UniqueName: \"kubernetes.io/projected/d3777627-ddbf-47da-a0f4-33d28a04d0b7-kube-api-access-mh72p\") pod \"ovn-operator-controller-manager-d44cf6b75-26qgm\" (UID: \"d3777627-ddbf-47da-a0f4-33d28a04d0b7\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.463954 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.496498 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.497129 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48gm\" (UniqueName: \"kubernetes.io/projected/f1f1ab0f-321a-40a4-bf22-68aa1246b00b-kube-api-access-m48gm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tksbc\" (UID: \"f1f1ab0f-321a-40a4-bf22-68aa1246b00b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.497170 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62rmc\" (UniqueName: \"kubernetes.io/projected/1f38fe26-3d54-4e26-bd41-5bf84e7e98fb-kube-api-access-62rmc\") pod \"test-operator-controller-manager-7866795846-62hbk\" (UID: \"1f38fe26-3d54-4e26-bd41-5bf84e7e98fb\") " pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.497196 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktgc6\" (UniqueName: \"kubernetes.io/projected/1b9a2130-fab8-409d-a908-338f07fcd307-kube-api-access-ktgc6\") pod \"watcher-operator-controller-manager-5db88f68c-vvdzm\" (UID: \"1b9a2130-fab8-409d-a908-338f07fcd307\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.497224 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.497242 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.497277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8w54\" (UniqueName: \"kubernetes.io/projected/4425ef75-1f25-4996-872b-1d4e877c0e10-kube-api-access-k8w54\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.497856 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.497906 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:58.997890439 +0000 UTC m=+1059.269332909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "metrics-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.498055 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.498085 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:58.998078154 +0000 UTC m=+1059.269520624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "webhook-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.501587 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" event={"ID":"93b55752-3270-48d6-a5b2-6fbf12729651","Type":"ContainerStarted","Data":"9465f4f7d8c035a90c173968084ee9480581aacc28afc6d29a58154b478a1ffc"}
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.523177 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktgc6\" (UniqueName: \"kubernetes.io/projected/1b9a2130-fab8-409d-a908-338f07fcd307-kube-api-access-ktgc6\") pod \"watcher-operator-controller-manager-5db88f68c-vvdzm\" (UID: \"1b9a2130-fab8-409d-a908-338f07fcd307\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.523778 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8w54\" (UniqueName: \"kubernetes.io/projected/4425ef75-1f25-4996-872b-1d4e877c0e10-kube-api-access-k8w54\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.524152 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7"]
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.525160 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62rmc\" (UniqueName: \"kubernetes.io/projected/1f38fe26-3d54-4e26-bd41-5bf84e7e98fb-kube-api-access-62rmc\") pod \"test-operator-controller-manager-7866795846-62hbk\" (UID: \"1f38fe26-3d54-4e26-bd41-5bf84e7e98fb\") " pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.549812 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.585366 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.596705 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.598069 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48gm\" (UniqueName: \"kubernetes.io/projected/f1f1ab0f-321a-40a4-bf22-68aa1246b00b-kube-api-access-m48gm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tksbc\" (UID: \"f1f1ab0f-321a-40a4-bf22-68aa1246b00b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.616004 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.619112 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48gm\" (UniqueName: \"kubernetes.io/projected/f1f1ab0f-321a-40a4-bf22-68aa1246b00b-kube-api-access-m48gm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tksbc\" (UID: \"f1f1ab0f-321a-40a4-bf22-68aa1246b00b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.705418 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.746354 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc"
Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.800892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.801223 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 19 21:45:58 crc kubenswrapper[4771]: E0219 21:45:58.801304 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert podName:bfcbf7c3-8784-4275-9838-7f6b9666e49a nodeName:}" failed.
No retries permitted until 2026-02-19 21:45:59.801284538 +0000 UTC m=+1060.072727008 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" (UID: "bfcbf7c3-8784-4275-9838-7f6b9666e49a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.930693 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm"] Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.950288 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp"] Feb 19 21:45:58 crc kubenswrapper[4771]: I0219 21:45:58.957720 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-2qr26"] Feb 19 21:45:59 crc kubenswrapper[4771]: W0219 21:45:59.000313 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7392724_153e_4b25_a984_bff0f841aac6.slice/crio-381627b5c2839ebda2d69872c26dd0f164aa02c7001f58e515fccac76d32d2ab WatchSource:0}: Error finding container 381627b5c2839ebda2d69872c26dd0f164aa02c7001f58e515fccac76d32d2ab: Status 404 returned error can't find the container with id 381627b5c2839ebda2d69872c26dd0f164aa02c7001f58e515fccac76d32d2ab Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.004717 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " 
pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.004756 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.004911 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.004960 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:00.004945952 +0000 UTC m=+1060.276388422 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "webhook-server-cert" not found Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.005355 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.005381 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:00.005374063 +0000 UTC m=+1060.276816533 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "metrics-server-cert" not found Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.021997 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.218808 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.218993 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.219075 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert podName:3bfa784f-72fd-4797-b5ec-ea4eaeaefffa nodeName:}" failed. No retries permitted until 2026-02-19 21:46:01.219056071 +0000 UTC m=+1061.490498541 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert") pod "infra-operator-controller-manager-79d975b745-l6wgq" (UID: "3bfa784f-72fd-4797-b5ec-ea4eaeaefffa") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.481167 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.486378 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.513394 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.529545 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.540386 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.545461 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh" event={"ID":"ab86456c-c478-4751-946d-064eff52fe66","Type":"ContainerStarted","Data":"bfa997e1489aef57ffdab14f89d3cfb622419fd097421b69e546d114f027a0d6"} Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.547903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" event={"ID":"1f5bbacb-8d9b-4289-938a-05e191d519f9","Type":"ContainerStarted","Data":"8f5ea8ea7cbb34f1f61b2ef7792b847e9eecfdc5fc1315bd22afcea66ef29599"} Feb 19 21:45:59 crc 
kubenswrapper[4771]: I0219 21:45:59.548971 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" event={"ID":"a7392724-153e-4b25-a984-bff0f841aac6","Type":"ContainerStarted","Data":"381627b5c2839ebda2d69872c26dd0f164aa02c7001f58e515fccac76d32d2ab"} Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.551849 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" event={"ID":"7bcef2a5-3cd6-4add-bf80-92e6eb274058","Type":"ContainerStarted","Data":"2c4640d285e70af3715238758ef26ff1d766048cf24118345e0ecfa98467db3f"} Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.558257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n" event={"ID":"3242e27e-7bae-47f9-a6e4-48d21eead119","Type":"ContainerStarted","Data":"6340501beb5f2a679317c92bc624e6d7e3bedc9cc3e9110d3018e80d97901d31"} Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.563410 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.565454 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" event={"ID":"1a3c4ae6-54be-46a2-93e6-db74ebf892e3","Type":"ContainerStarted","Data":"172477864d85f78a9895334d3f6fc8f999c6a8206ff3e1302550a26e916f9995"} Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.567124 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w" event={"ID":"4fe3eefb-4705-42c0-8107-cf25db83f1d3","Type":"ContainerStarted","Data":"9b74b843d34ed21ac01239d7d571d4d2fb78d51c019d1d30b198d47e99afffbc"} Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.571297 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" event={"ID":"fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd","Type":"ContainerStarted","Data":"37114546bdb22ac1645a54d3f34a60526d759ae08205fe015a80a61737305d1c"} Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.572297 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb"] Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.572626 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mh72p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-26qgm_openstack-operators(d3777627-ddbf-47da-a0f4-33d28a04d0b7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.573636 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4" event={"ID":"2aac2409-afd9-44d5-a205-36d3f85d6ee1","Type":"ContainerStarted","Data":"5fae6e881272209f16690596e8bfa402203a0defde0741f3665d9fe1a66334b8"} Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.573689 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm" podUID="d3777627-ddbf-47da-a0f4-33d28a04d0b7" Feb 19 21:45:59 crc 
kubenswrapper[4771]: I0219 21:45:59.576477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" event={"ID":"747c00a6-dc75-476f-bdcd-b24d58b2fbe8","Type":"ContainerStarted","Data":"bcaef2deafcd340ce922106ba9ed9846083295836d0bd6e63615d6b2a78df0ac"} Feb 19 21:45:59 crc kubenswrapper[4771]: W0219 21:45:59.577591 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod956aea9e_2072_4556_a787_8da2a7286af1.slice/crio-02870e7565ad9a3bde0ef51c6c08c0a4fbce124f720983c0a7ddad4461aaa531 WatchSource:0}: Error finding container 02870e7565ad9a3bde0ef51c6c08c0a4fbce124f720983c0a7ddad4461aaa531: Status 404 returned error can't find the container with id 02870e7565ad9a3bde0ef51c6c08c0a4fbce124f720983c0a7ddad4461aaa531 Feb 19 21:45:59 crc kubenswrapper[4771]: W0219 21:45:59.581455 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b9a2130_fab8_409d_a908_338f07fcd307.slice/crio-9462621d7ae202d5a621b1081453134359d76560e874dbcd83df23a00a2e29b7 WatchSource:0}: Error finding container 9462621d7ae202d5a621b1081453134359d76560e874dbcd83df23a00a2e29b7: Status 404 returned error can't find the container with id 9462621d7ae202d5a621b1081453134359d76560e874dbcd83df23a00a2e29b7 Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.581668 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qcrhz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-dlg9h_openstack-operators(956aea9e-2072-4556-a787-8da2a7286af1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.582763 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h" podUID="956aea9e-2072-4556-a787-8da2a7286af1" Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.585188 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ktgc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-vvdzm_openstack-operators(1b9a2130-fab8-409d-a908-338f07fcd307): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.585355 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-62rmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-62hbk_openstack-operators(1f38fe26-3d54-4e26-bd41-5bf84e7e98fb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.586346 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm" podUID="1b9a2130-fab8-409d-a908-338f07fcd307" Feb 19 21:45:59 crc 
kubenswrapper[4771]: E0219 21:45:59.586669 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" podUID="1f38fe26-3d54-4e26-bd41-5bf84e7e98fb" Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.586994 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m48gm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-tksbc_openstack-operators(f1f1ab0f-321a-40a4-bf22-68aa1246b00b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.588109 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"] Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.588119 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc" podUID="f1f1ab0f-321a-40a4-bf22-68aa1246b00b" Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.595914 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.604333 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"] Feb 19 21:45:59 
crc kubenswrapper[4771]: I0219 21:45:59.610052 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.614178 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.617759 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-62hbk"] Feb 19 21:45:59 crc kubenswrapper[4771]: I0219 21:45:59.828376 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.828600 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:45:59 crc kubenswrapper[4771]: E0219 21:45:59.828682 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert podName:bfcbf7c3-8784-4275-9838-7f6b9666e49a nodeName:}" failed. No retries permitted until 2026-02-19 21:46:01.828646304 +0000 UTC m=+1062.100088774 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" (UID: "bfcbf7c3-8784-4275-9838-7f6b9666e49a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.031125 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.031165 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.031272 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.031319 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.031343 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:02.031328922 +0000 UTC m=+1062.302771392 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "metrics-server-cert" not found Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.031357 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:02.031351852 +0000 UTC m=+1062.302794322 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "webhook-server-cert" not found Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.590711 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc" event={"ID":"f1f1ab0f-321a-40a4-bf22-68aa1246b00b","Type":"ContainerStarted","Data":"f584553cca063f4a02f72a8375b04f9b78a98e808d878b315876d2e639cb933c"} Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.591891 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb" event={"ID":"47768578-399d-48b5-bffa-a7942d4c2dcf","Type":"ContainerStarted","Data":"9fc770f20db0c577b8114d8181da313a9e76ea2d3b1f85a59add9a180db87398"} Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.592419 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc" podUID="f1f1ab0f-321a-40a4-bf22-68aa1246b00b" Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.592870 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg" event={"ID":"e9edc363-0a5c-46f0-9473-ba2bddd46ad5","Type":"ContainerStarted","Data":"80ee2c85e13652230081a15c86f1be93b71cd441e123ce113aaf2116bbb7dfac"} Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.593801 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm" event={"ID":"1b9a2130-fab8-409d-a908-338f07fcd307","Type":"ContainerStarted","Data":"9462621d7ae202d5a621b1081453134359d76560e874dbcd83df23a00a2e29b7"} Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.598050 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm" podUID="1b9a2130-fab8-409d-a908-338f07fcd307" Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.600863 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm" event={"ID":"d3777627-ddbf-47da-a0f4-33d28a04d0b7","Type":"ContainerStarted","Data":"845d2c3155850ebd04637080efb448baf588152f65b60c6c9811bef4e5d24590"} Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.602395 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm" podUID="d3777627-ddbf-47da-a0f4-33d28a04d0b7" Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.602800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h" event={"ID":"956aea9e-2072-4556-a787-8da2a7286af1","Type":"ContainerStarted","Data":"02870e7565ad9a3bde0ef51c6c08c0a4fbce124f720983c0a7ddad4461aaa531"} Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.603981 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw" event={"ID":"f9700277-9c15-42b6-9ab6-820fb9c6fc91","Type":"ContainerStarted","Data":"ee0dd4e9743005c3a76da6d122bdfe117789b5dbe466262739251e2cba5e82fd"} Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.603979 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h" podUID="956aea9e-2072-4556-a787-8da2a7286af1" Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.605569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" event={"ID":"1f38fe26-3d54-4e26-bd41-5bf84e7e98fb","Type":"ContainerStarted","Data":"6d4e455673387681b797ea85ef9e06bcc8df76172efce5e6b76b0e618bb20cd7"} Feb 19 21:46:00 crc kubenswrapper[4771]: I0219 21:46:00.606774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm" 
event={"ID":"9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9","Type":"ContainerStarted","Data":"4d5eb555ac42a4aa118e1a558f37414c287405b3f3950cf88dc8300be8b89fd0"} Feb 19 21:46:00 crc kubenswrapper[4771]: E0219 21:46:00.607151 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" podUID="1f38fe26-3d54-4e26-bd41-5bf84e7e98fb" Feb 19 21:46:01 crc kubenswrapper[4771]: I0219 21:46:01.254788 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.254967 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.255039 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert podName:3bfa784f-72fd-4797-b5ec-ea4eaeaefffa nodeName:}" failed. No retries permitted until 2026-02-19 21:46:05.255009097 +0000 UTC m=+1065.526451567 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert") pod "infra-operator-controller-manager-79d975b745-l6wgq" (UID: "3bfa784f-72fd-4797-b5ec-ea4eaeaefffa") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.615490 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc" podUID="f1f1ab0f-321a-40a4-bf22-68aa1246b00b" Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.615511 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" podUID="1f38fe26-3d54-4e26-bd41-5bf84e7e98fb" Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.615551 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm" podUID="1b9a2130-fab8-409d-a908-338f07fcd307" Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.615557 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h" podUID="956aea9e-2072-4556-a787-8da2a7286af1" Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.617573 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm" podUID="d3777627-ddbf-47da-a0f4-33d28a04d0b7" Feb 19 21:46:01 crc kubenswrapper[4771]: I0219 21:46:01.865592 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.865789 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:46:01 crc kubenswrapper[4771]: E0219 21:46:01.865868 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert podName:bfcbf7c3-8784-4275-9838-7f6b9666e49a nodeName:}" failed. No retries permitted until 2026-02-19 21:46:05.865847852 +0000 UTC m=+1066.137290332 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" (UID: "bfcbf7c3-8784-4275-9838-7f6b9666e49a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:46:02 crc kubenswrapper[4771]: I0219 21:46:02.068424 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:46:02 crc kubenswrapper[4771]: I0219 21:46:02.068692 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:46:02 crc kubenswrapper[4771]: E0219 21:46:02.068651 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:46:02 crc kubenswrapper[4771]: E0219 21:46:02.068825 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:46:02 crc kubenswrapper[4771]: E0219 21:46:02.068849 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:06.068826448 +0000 UTC m=+1066.340268928 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "metrics-server-cert" not found Feb 19 21:46:02 crc kubenswrapper[4771]: E0219 21:46:02.068876 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:06.068862079 +0000 UTC m=+1066.340304549 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "webhook-server-cert" not found Feb 19 21:46:05 crc kubenswrapper[4771]: I0219 21:46:05.320387 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:46:05 crc kubenswrapper[4771]: E0219 21:46:05.320693 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:46:05 crc kubenswrapper[4771]: E0219 21:46:05.320819 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert podName:3bfa784f-72fd-4797-b5ec-ea4eaeaefffa nodeName:}" failed. No retries permitted until 2026-02-19 21:46:13.320802346 +0000 UTC m=+1073.592244806 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert") pod "infra-operator-controller-manager-79d975b745-l6wgq" (UID: "3bfa784f-72fd-4797-b5ec-ea4eaeaefffa") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:46:05 crc kubenswrapper[4771]: I0219 21:46:05.928751 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" Feb 19 21:46:05 crc kubenswrapper[4771]: E0219 21:46:05.929027 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:46:05 crc kubenswrapper[4771]: E0219 21:46:05.929120 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert podName:bfcbf7c3-8784-4275-9838-7f6b9666e49a nodeName:}" failed. No retries permitted until 2026-02-19 21:46:13.929098804 +0000 UTC m=+1074.200541284 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" (UID: "bfcbf7c3-8784-4275-9838-7f6b9666e49a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:46:06 crc kubenswrapper[4771]: I0219 21:46:06.131623 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:46:06 crc kubenswrapper[4771]: I0219 21:46:06.131670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:46:06 crc kubenswrapper[4771]: E0219 21:46:06.131838 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:46:06 crc kubenswrapper[4771]: E0219 21:46:06.131945 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:14.131916255 +0000 UTC m=+1074.403358755 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "metrics-server-cert" not found Feb 19 21:46:06 crc kubenswrapper[4771]: E0219 21:46:06.132072 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:46:06 crc kubenswrapper[4771]: E0219 21:46:06.132160 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:14.132139991 +0000 UTC m=+1074.403582461 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "webhook-server-cert" not found Feb 19 21:46:11 crc kubenswrapper[4771]: E0219 21:46:11.093171 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da" Feb 19 21:46:11 crc kubenswrapper[4771]: E0219 21:46:11.094096 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ktltz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-jvbwn_openstack-operators(1f5bbacb-8d9b-4289-938a-05e191d519f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:46:11 crc kubenswrapper[4771]: E0219 21:46:11.095629 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" podUID="1f5bbacb-8d9b-4289-938a-05e191d519f9" Feb 19 21:46:11 crc kubenswrapper[4771]: E0219 21:46:11.696644 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" podUID="1f5bbacb-8d9b-4289-938a-05e191d519f9" Feb 19 21:46:12 crc kubenswrapper[4771]: I0219 21:46:12.957002 4771 patch_prober.go:28] interesting 
pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:46:12 crc kubenswrapper[4771]: I0219 21:46:12.957113 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:46:12 crc kubenswrapper[4771]: I0219 21:46:12.957179 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:46:12 crc kubenswrapper[4771]: I0219 21:46:12.957930 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de6552d35b845cd34ff7f7f99fd7b414fe0ca3adffdf00a9aac825e2b4fe5659"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:46:12 crc kubenswrapper[4771]: I0219 21:46:12.958082 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://de6552d35b845cd34ff7f7f99fd7b414fe0ca3adffdf00a9aac825e2b4fe5659" gracePeriod=600 Feb 19 21:46:13 crc kubenswrapper[4771]: I0219 21:46:13.342804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: 
\"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" Feb 19 21:46:13 crc kubenswrapper[4771]: E0219 21:46:13.342967 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:46:13 crc kubenswrapper[4771]: E0219 21:46:13.343101 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert podName:3bfa784f-72fd-4797-b5ec-ea4eaeaefffa nodeName:}" failed. No retries permitted until 2026-02-19 21:46:29.343080945 +0000 UTC m=+1089.614523415 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert") pod "infra-operator-controller-manager-79d975b745-l6wgq" (UID: "3bfa784f-72fd-4797-b5ec-ea4eaeaefffa") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:46:13 crc kubenswrapper[4771]: E0219 21:46:13.555653 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 19 21:46:13 crc kubenswrapper[4771]: E0219 21:46:13.555811 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkz46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-mnjh4_openstack-operators(2aac2409-afd9-44d5-a205-36d3f85d6ee1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:46:13 crc kubenswrapper[4771]: E0219 21:46:13.557160 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4" podUID="2aac2409-afd9-44d5-a205-36d3f85d6ee1" Feb 19 21:46:13 crc kubenswrapper[4771]: I0219 21:46:13.713051 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="de6552d35b845cd34ff7f7f99fd7b414fe0ca3adffdf00a9aac825e2b4fe5659" exitCode=0 Feb 19 21:46:13 crc kubenswrapper[4771]: I0219 21:46:13.713068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"de6552d35b845cd34ff7f7f99fd7b414fe0ca3adffdf00a9aac825e2b4fe5659"} Feb 19 21:46:13 crc kubenswrapper[4771]: I0219 21:46:13.713662 4771 scope.go:117] "RemoveContainer" containerID="0b551f7ad6e0a61c42a9d657911c51c24f934a3d33eec6d667ab79647cf50477" Feb 19 21:46:13 crc kubenswrapper[4771]: E0219 21:46:13.716285 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4" podUID="2aac2409-afd9-44d5-a205-36d3f85d6ee1" Feb 19 21:46:13 crc kubenswrapper[4771]: I0219 21:46:13.954518 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" Feb 19 21:46:13 crc kubenswrapper[4771]: E0219 21:46:13.954773 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:46:13 crc kubenswrapper[4771]: E0219 21:46:13.954854 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert podName:bfcbf7c3-8784-4275-9838-7f6b9666e49a nodeName:}" failed. No retries permitted until 2026-02-19 21:46:29.954833125 +0000 UTC m=+1090.226275595 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" (UID: "bfcbf7c3-8784-4275-9838-7f6b9666e49a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:46:14 crc kubenswrapper[4771]: I0219 21:46:14.158471 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:46:14 crc kubenswrapper[4771]: I0219 21:46:14.158602 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" Feb 19 21:46:14 crc kubenswrapper[4771]: E0219 21:46:14.158687 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:46:14 crc kubenswrapper[4771]: E0219 21:46:14.158773 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:30.158754345 +0000 UTC m=+1090.430196815 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "metrics-server-cert" not found Feb 19 21:46:14 crc kubenswrapper[4771]: E0219 21:46:14.158893 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:46:14 crc kubenswrapper[4771]: E0219 21:46:14.159097 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs podName:4425ef75-1f25-4996-872b-1d4e877c0e10 nodeName:}" failed. No retries permitted until 2026-02-19 21:46:30.159002312 +0000 UTC m=+1090.430444812 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-t66d7" (UID: "4425ef75-1f25-4996-872b-1d4e877c0e10") : secret "webhook-server-cert" not found Feb 19 21:46:15 crc kubenswrapper[4771]: E0219 21:46:15.512841 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 19 21:46:15 crc kubenswrapper[4771]: E0219 21:46:15.513293 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hp9cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-qfc7w_openstack-operators(4fe3eefb-4705-42c0-8107-cf25db83f1d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:46:15 crc kubenswrapper[4771]: E0219 21:46:15.516201 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w" podUID="4fe3eefb-4705-42c0-8107-cf25db83f1d3" Feb 19 21:46:15 crc kubenswrapper[4771]: E0219 21:46:15.726422 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w" podUID="4fe3eefb-4705-42c0-8107-cf25db83f1d3" Feb 19 21:46:17 crc kubenswrapper[4771]: I0219 21:46:17.740140 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" event={"ID":"1a3c4ae6-54be-46a2-93e6-db74ebf892e3","Type":"ContainerStarted","Data":"afcfaafdf92dc575f4fd62dcb430cbff041a7469a8f629ef558c96a335344df0"} Feb 19 21:46:17 crc kubenswrapper[4771]: I0219 21:46:17.740671 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" Feb 19 21:46:17 crc kubenswrapper[4771]: I0219 21:46:17.741622 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" event={"ID":"747c00a6-dc75-476f-bdcd-b24d58b2fbe8","Type":"ContainerStarted","Data":"df99cac59ba5e3812ad4a8656b503238e73a7818539260a4b4af8ca050e84390"} Feb 19 21:46:17 crc kubenswrapper[4771]: I0219 21:46:17.741952 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" Feb 19 21:46:17 crc kubenswrapper[4771]: I0219 21:46:17.746805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"b980742e22e0e4d74d8de52154d96047cf8f0b8f6b46b5fe1e26926aa9dcd46b"} Feb 19 21:46:17 crc kubenswrapper[4771]: I0219 21:46:17.754215 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" podStartSLOduration=7.516700681 podStartE2EDuration="20.754177522s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:58.951093798 +0000 UTC m=+1059.222536268" lastFinishedPulling="2026-02-19 21:46:12.188570619 +0000 UTC m=+1072.460013109" observedRunningTime="2026-02-19 21:46:17.752914268 +0000 UTC m=+1078.024356758" watchObservedRunningTime="2026-02-19 21:46:17.754177522 +0000 UTC 
m=+1078.025619992" Feb 19 21:46:17 crc kubenswrapper[4771]: I0219 21:46:17.797000 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" podStartSLOduration=5.816744107 podStartE2EDuration="20.796983393s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:58.551480725 +0000 UTC m=+1058.822923195" lastFinishedPulling="2026-02-19 21:46:13.531720011 +0000 UTC m=+1073.803162481" observedRunningTime="2026-02-19 21:46:17.77152104 +0000 UTC m=+1078.042963530" watchObservedRunningTime="2026-02-19 21:46:17.796983393 +0000 UTC m=+1078.068425863" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.762538 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" event={"ID":"93b55752-3270-48d6-a5b2-6fbf12729651","Type":"ContainerStarted","Data":"f04b3139560ca7828f28d398c06b341325c73eed16c1859a47b6653cc8208a43"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.762922 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.770561 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh" event={"ID":"ab86456c-c478-4751-946d-064eff52fe66","Type":"ContainerStarted","Data":"6362198fa6f4e5ea911afc20e0fb0a4cc2860c5a5256947bdde18942e567619f"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.770906 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.776872 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" 
event={"ID":"1f38fe26-3d54-4e26-bd41-5bf84e7e98fb","Type":"ContainerStarted","Data":"5db1e69b7d080e28a907eacd49ae2a3e7bdc6d78542eae340069e5865134b7af"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.777096 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.785725 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm" event={"ID":"d3777627-ddbf-47da-a0f4-33d28a04d0b7","Type":"ContainerStarted","Data":"fe1f4b3d9c9664f8dcba22a8cb1b7ccbbc6bc52eb37070148b0e54ada71f3abc"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.786682 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.794601 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb" event={"ID":"47768578-399d-48b5-bffa-a7942d4c2dcf","Type":"ContainerStarted","Data":"f1c7088eb7ee708daf4d8af93aa961831a62b1c2a6568e5d82b8c82f88421fe0"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.795301 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.806551 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h" event={"ID":"956aea9e-2072-4556-a787-8da2a7286af1","Type":"ContainerStarted","Data":"456dac4fb64d88b8fa18ae6a8eca5fc91a4a745c87fb73f72f75cac6e8895728"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.806782 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.810030 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" podStartSLOduration=7.349922483 podStartE2EDuration="21.80999315s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:58.449750856 +0000 UTC m=+1058.721193326" lastFinishedPulling="2026-02-19 21:46:12.909821523 +0000 UTC m=+1073.181263993" observedRunningTime="2026-02-19 21:46:18.806496957 +0000 UTC m=+1079.077939427" watchObservedRunningTime="2026-02-19 21:46:18.80999315 +0000 UTC m=+1079.081435620" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.816118 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n" event={"ID":"3242e27e-7bae-47f9-a6e4-48d21eead119","Type":"ContainerStarted","Data":"1baa94022c95bf014d630ba233af3e7289c7d0bbb4b9222c4938b689142c60b7"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.816183 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.831530 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" event={"ID":"7bcef2a5-3cd6-4add-bf80-92e6eb274058","Type":"ContainerStarted","Data":"5eb47f058699625c29d13eb6866acaaf51951dc806b981cd299c6931447fe9b3"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.832129 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.843129 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm" event={"ID":"9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9","Type":"ContainerStarted","Data":"e0d1350dec752b40b8f0a480bea5a163bcd9834bf91fe0bf14ae32b0e4bad302"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.843180 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.851322 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm" podStartSLOduration=3.84431676 podStartE2EDuration="21.851306191s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.572479453 +0000 UTC m=+1059.843921923" lastFinishedPulling="2026-02-19 21:46:17.579468884 +0000 UTC m=+1077.850911354" observedRunningTime="2026-02-19 21:46:18.847378308 +0000 UTC m=+1079.118820788" watchObservedRunningTime="2026-02-19 21:46:18.851306191 +0000 UTC m=+1079.122748661" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.872914 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg" event={"ID":"e9edc363-0a5c-46f0-9473-ba2bddd46ad5","Type":"ContainerStarted","Data":"6c3a823f7446cffe6c7f46ddbb35aa113cb4b4d5210b23f35c2e109c547e7ba1"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.873622 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.889332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" 
event={"ID":"a7392724-153e-4b25-a984-bff0f841aac6","Type":"ContainerStarted","Data":"20cee903d8339a3ac93ea626bff84f45b451f9292aed8bf2f0ab172544f08526"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.890107 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.901569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw" event={"ID":"f9700277-9c15-42b6-9ab6-820fb9c6fc91","Type":"ContainerStarted","Data":"435b1c8b777d3c6e37b7b9c57c78ccdc00bc62b4f568ca5af9140f1d4b34355f"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.902232 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.903931 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" event={"ID":"fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd","Type":"ContainerStarted","Data":"a672632febef0e62da2a0bd4bfa1918986373462380ff3c9fe2527a0f84b58bf"} Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.903955 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.906095 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh" podStartSLOduration=4.718749826 podStartE2EDuration="21.90607839s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.511474061 +0000 UTC m=+1059.782916521" lastFinishedPulling="2026-02-19 21:46:16.698802615 +0000 UTC m=+1076.970245085" 
observedRunningTime="2026-02-19 21:46:18.87773282 +0000 UTC m=+1079.149175300" watchObservedRunningTime="2026-02-19 21:46:18.90607839 +0000 UTC m=+1079.177520860" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.917711 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb" podStartSLOduration=5.431791623 podStartE2EDuration="21.917695567s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.563457155 +0000 UTC m=+1059.834899625" lastFinishedPulling="2026-02-19 21:46:16.049361079 +0000 UTC m=+1076.320803569" observedRunningTime="2026-02-19 21:46:18.904874428 +0000 UTC m=+1079.176316908" watchObservedRunningTime="2026-02-19 21:46:18.917695567 +0000 UTC m=+1079.189138037" Feb 19 21:46:18 crc kubenswrapper[4771]: I0219 21:46:18.952661 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" podStartSLOduration=4.008697075 podStartE2EDuration="21.95264485s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.585099556 +0000 UTC m=+1059.856542026" lastFinishedPulling="2026-02-19 21:46:17.529047331 +0000 UTC m=+1077.800489801" observedRunningTime="2026-02-19 21:46:18.942833621 +0000 UTC m=+1079.214276101" watchObservedRunningTime="2026-02-19 21:46:18.95264485 +0000 UTC m=+1079.224087320" Feb 19 21:46:19 crc kubenswrapper[4771]: I0219 21:46:19.083777 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" podStartSLOduration=4.877076838 podStartE2EDuration="22.083762915s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.534521909 +0000 UTC m=+1059.805964379" lastFinishedPulling="2026-02-19 21:46:16.741207986 +0000 UTC m=+1077.012650456" 
observedRunningTime="2026-02-19 21:46:19.081943598 +0000 UTC m=+1079.353386088" watchObservedRunningTime="2026-02-19 21:46:19.083762915 +0000 UTC m=+1079.355205385" Feb 19 21:46:19 crc kubenswrapper[4771]: I0219 21:46:19.084852 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h" podStartSLOduration=4.161182956 podStartE2EDuration="22.084845814s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.581507942 +0000 UTC m=+1059.852950412" lastFinishedPulling="2026-02-19 21:46:17.5051708 +0000 UTC m=+1077.776613270" observedRunningTime="2026-02-19 21:46:19.050563019 +0000 UTC m=+1079.322005479" watchObservedRunningTime="2026-02-19 21:46:19.084845814 +0000 UTC m=+1079.356288284" Feb 19 21:46:19 crc kubenswrapper[4771]: I0219 21:46:19.109009 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm" podStartSLOduration=4.909107735 podStartE2EDuration="22.108992122s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.543206209 +0000 UTC m=+1059.814648689" lastFinishedPulling="2026-02-19 21:46:16.743090606 +0000 UTC m=+1077.014533076" observedRunningTime="2026-02-19 21:46:19.104505114 +0000 UTC m=+1079.375947574" watchObservedRunningTime="2026-02-19 21:46:19.108992122 +0000 UTC m=+1079.380434592" Feb 19 21:46:19 crc kubenswrapper[4771]: I0219 21:46:19.131345 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw" podStartSLOduration=4.9458348359999995 podStartE2EDuration="22.131329163s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.558356079 +0000 UTC m=+1059.829798549" lastFinishedPulling="2026-02-19 21:46:16.743850406 +0000 UTC m=+1077.015292876" 
observedRunningTime="2026-02-19 21:46:19.12817176 +0000 UTC m=+1079.399614250" watchObservedRunningTime="2026-02-19 21:46:19.131329163 +0000 UTC m=+1079.402771633" Feb 19 21:46:19 crc kubenswrapper[4771]: I0219 21:46:19.146219 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" podStartSLOduration=7.63092837 podStartE2EDuration="22.146203176s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.016902307 +0000 UTC m=+1059.288344777" lastFinishedPulling="2026-02-19 21:46:13.532177113 +0000 UTC m=+1073.803619583" observedRunningTime="2026-02-19 21:46:19.143739471 +0000 UTC m=+1079.415181951" watchObservedRunningTime="2026-02-19 21:46:19.146203176 +0000 UTC m=+1079.417645646" Feb 19 21:46:19 crc kubenswrapper[4771]: I0219 21:46:19.168148 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n" podStartSLOduration=5.6446793490000005 podStartE2EDuration="22.168129746s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.535045584 +0000 UTC m=+1059.806488054" lastFinishedPulling="2026-02-19 21:46:16.058495961 +0000 UTC m=+1076.329938451" observedRunningTime="2026-02-19 21:46:19.166382139 +0000 UTC m=+1079.437824629" watchObservedRunningTime="2026-02-19 21:46:19.168129746 +0000 UTC m=+1079.439572216" Feb 19 21:46:19 crc kubenswrapper[4771]: I0219 21:46:19.193867 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg" podStartSLOduration=6.250615657 podStartE2EDuration="22.193852476s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.563072565 +0000 UTC m=+1059.834515035" lastFinishedPulling="2026-02-19 21:46:15.506309384 +0000 UTC m=+1075.777751854" 
observedRunningTime="2026-02-19 21:46:19.187278092 +0000 UTC m=+1079.458720572" watchObservedRunningTime="2026-02-19 21:46:19.193852476 +0000 UTC m=+1079.465294936" Feb 19 21:46:19 crc kubenswrapper[4771]: I0219 21:46:19.211529 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" podStartSLOduration=6.302790106 podStartE2EDuration="22.211510543s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:58.991176668 +0000 UTC m=+1059.262619138" lastFinishedPulling="2026-02-19 21:46:14.899897105 +0000 UTC m=+1075.171339575" observedRunningTime="2026-02-19 21:46:19.206355376 +0000 UTC m=+1079.477797856" watchObservedRunningTime="2026-02-19 21:46:19.211510543 +0000 UTC m=+1079.482953003" Feb 19 21:46:21 crc kubenswrapper[4771]: I0219 21:46:21.927728 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm" event={"ID":"1b9a2130-fab8-409d-a908-338f07fcd307","Type":"ContainerStarted","Data":"6ef4c3477f38db9a3b905f21d62b9289428dfea40377eb1747f84d5a44c7ad97"} Feb 19 21:46:21 crc kubenswrapper[4771]: I0219 21:46:21.928466 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm" Feb 19 21:46:21 crc kubenswrapper[4771]: I0219 21:46:21.946689 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm" podStartSLOduration=2.860088375 podStartE2EDuration="24.94667181s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.585070236 +0000 UTC m=+1059.856512706" lastFinishedPulling="2026-02-19 21:46:21.671653661 +0000 UTC m=+1081.943096141" observedRunningTime="2026-02-19 21:46:21.945079078 +0000 UTC m=+1082.216521578" watchObservedRunningTime="2026-02-19 
21:46:21.94667181 +0000 UTC m=+1082.218114280" Feb 19 21:46:22 crc kubenswrapper[4771]: I0219 21:46:22.439718 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:46:22 crc kubenswrapper[4771]: I0219 21:46:22.941665 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc" event={"ID":"f1f1ab0f-321a-40a4-bf22-68aa1246b00b","Type":"ContainerStarted","Data":"8daee88ea24c9a3b88241026a39c90289ded377b469e4b98197b68196a0f5781"} Feb 19 21:46:22 crc kubenswrapper[4771]: I0219 21:46:22.963422 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tksbc" podStartSLOduration=2.861185787 podStartE2EDuration="24.963403254s" podCreationTimestamp="2026-02-19 21:45:58 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.586875014 +0000 UTC m=+1059.858317484" lastFinishedPulling="2026-02-19 21:46:21.689092451 +0000 UTC m=+1081.960534951" observedRunningTime="2026-02-19 21:46:22.95643632 +0000 UTC m=+1083.227878810" watchObservedRunningTime="2026-02-19 21:46:22.963403254 +0000 UTC m=+1083.234845724" Feb 19 21:46:23 crc kubenswrapper[4771]: I0219 21:46:23.951166 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" event={"ID":"1f5bbacb-8d9b-4289-938a-05e191d519f9","Type":"ContainerStarted","Data":"35186ec7c8ed160b7bcd25149cae4a7b71efafb5b4cdcc5590eccf6a18277fea"} Feb 19 21:46:23 crc kubenswrapper[4771]: I0219 21:46:23.951873 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" Feb 19 21:46:23 crc kubenswrapper[4771]: I0219 21:46:23.973798 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn" 
podStartSLOduration=1.985658982 podStartE2EDuration="26.973772061s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:58.56718087 +0000 UTC m=+1058.838623340" lastFinishedPulling="2026-02-19 21:46:23.555293919 +0000 UTC m=+1083.826736419" observedRunningTime="2026-02-19 21:46:23.970785192 +0000 UTC m=+1084.242227702" watchObservedRunningTime="2026-02-19 21:46:23.973772061 +0000 UTC m=+1084.245214551" Feb 19 21:46:27 crc kubenswrapper[4771]: I0219 21:46:27.534047 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-6jvq7" Feb 19 21:46:27 crc kubenswrapper[4771]: I0219 21:46:27.558656 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hsg94" Feb 19 21:46:27 crc kubenswrapper[4771]: I0219 21:46:27.817248 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gvxfm" Feb 19 21:46:27 crc kubenswrapper[4771]: I0219 21:46:27.921180 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-2qr26" Feb 19 21:46:27 crc kubenswrapper[4771]: I0219 21:46:27.948277 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vdnzp" Feb 19 21:46:27 crc kubenswrapper[4771]: I0219 21:46:27.980944 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-g92n5" Feb 19 21:46:27 crc kubenswrapper[4771]: I0219 21:46:27.987526 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4" 
event={"ID":"2aac2409-afd9-44d5-a205-36d3f85d6ee1","Type":"ContainerStarted","Data":"e18874f5d6e3f5e507422a8fe133e6b6838e163eafbc36fe1625210136749e09"}
Feb 19 21:46:27 crc kubenswrapper[4771]: I0219 21:46:27.987802 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.020158 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4" podStartSLOduration=3.150947953 podStartE2EDuration="31.020136166s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.046085779 +0000 UTC m=+1059.317528249" lastFinishedPulling="2026-02-19 21:46:26.915273952 +0000 UTC m=+1087.186716462" observedRunningTime="2026-02-19 21:46:28.017531787 +0000 UTC m=+1088.288974287" watchObservedRunningTime="2026-02-19 21:46:28.020136166 +0000 UTC m=+1088.291578646"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.088372 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-l4w4n"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.161733 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-krkgh"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.258167 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6mdqb"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.319099 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-2fvpw"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.474294 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-26qgm"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.552723 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9rjlg"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.591182 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4s8tm"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.599810 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-dlg9h"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.619912 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk"
Feb 19 21:46:28 crc kubenswrapper[4771]: I0219 21:46:28.708535 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvdzm"
Feb 19 21:46:29 crc kubenswrapper[4771]: I0219 21:46:29.395764 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"
Feb 19 21:46:29 crc kubenswrapper[4771]: I0219 21:46:29.409413 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bfa784f-72fd-4797-b5ec-ea4eaeaefffa-cert\") pod \"infra-operator-controller-manager-79d975b745-l6wgq\" (UID: \"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"
Feb 19 21:46:29 crc kubenswrapper[4771]: I0219 21:46:29.583121 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kpz4n"
Feb 19 21:46:29 crc kubenswrapper[4771]: I0219 21:46:29.591444 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"
Feb 19 21:46:29 crc kubenswrapper[4771]: I0219 21:46:29.957914 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"]
Feb 19 21:46:29 crc kubenswrapper[4771]: W0219 21:46:29.976764 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bfa784f_72fd_4797_b5ec_ea4eaeaefffa.slice/crio-e6678f5bd4af397c9568b3df4b366a2590bf0bc29295bad033c7c621e3acda96 WatchSource:0}: Error finding container e6678f5bd4af397c9568b3df4b366a2590bf0bc29295bad033c7c621e3acda96: Status 404 returned error can't find the container with id e6678f5bd4af397c9568b3df4b366a2590bf0bc29295bad033c7c621e3acda96
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.006582 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.013363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" event={"ID":"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa","Type":"ContainerStarted","Data":"e6678f5bd4af397c9568b3df4b366a2590bf0bc29295bad033c7c621e3acda96"}
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.016646 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfcbf7c3-8784-4275-9838-7f6b9666e49a-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb\" (UID: \"bfcbf7c3-8784-4275-9838-7f6b9666e49a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.210405 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.210494 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.216329 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.217744 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4425ef75-1f25-4996-872b-1d4e877c0e10-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-t66d7\" (UID: \"4425ef75-1f25-4996-872b-1d4e877c0e10\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.218526 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-q8d4j"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.225305 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-lxw2q"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.225507 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.233847 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.570095 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"]
Feb 19 21:46:30 crc kubenswrapper[4771]: I0219 21:46:30.820603 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"]
Feb 19 21:46:31 crc kubenswrapper[4771]: I0219 21:46:31.022901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" event={"ID":"4425ef75-1f25-4996-872b-1d4e877c0e10","Type":"ContainerStarted","Data":"026942557d3f2927ec38f6acb2a165783a82fdf1b758f918aefc95847bab6f5e"}
Feb 19 21:46:31 crc kubenswrapper[4771]: I0219 21:46:31.023226 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:46:31 crc kubenswrapper[4771]: I0219 21:46:31.023242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" event={"ID":"4425ef75-1f25-4996-872b-1d4e877c0e10","Type":"ContainerStarted","Data":"dd8a19ba748f143f02a4ae9a63b5bbf470f411d7262ac545972dec0da845f408"}
Feb 19 21:46:31 crc kubenswrapper[4771]: I0219 21:46:31.025088 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" event={"ID":"bfcbf7c3-8784-4275-9838-7f6b9666e49a","Type":"ContainerStarted","Data":"216cf9c12537289a8a6b9d9c2eaf77bb039169e354bb022013b9707ca42eb40e"}
Feb 19 21:46:31 crc kubenswrapper[4771]: I0219 21:46:31.059501 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7" podStartSLOduration=34.059478024 podStartE2EDuration="34.059478024s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:31.047449286 +0000 UTC m=+1091.318891796" watchObservedRunningTime="2026-02-19 21:46:31.059478024 +0000 UTC m=+1091.330920504"
Feb 19 21:46:32 crc kubenswrapper[4771]: I0219 21:46:32.038491 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w" event={"ID":"4fe3eefb-4705-42c0-8107-cf25db83f1d3","Type":"ContainerStarted","Data":"7d31dacba2bc817632dca901e111c82c5fe69be1024d7185bf330f5fe19a4d60"}
Feb 19 21:46:32 crc kubenswrapper[4771]: I0219 21:46:32.039079 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"
Feb 19 21:46:32 crc kubenswrapper[4771]: I0219 21:46:32.058433 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w" podStartSLOduration=3.619059767 podStartE2EDuration="35.058407438s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.537841818 +0000 UTC m=+1059.809284288" lastFinishedPulling="2026-02-19 21:46:30.977189449 +0000 UTC m=+1091.248631959" observedRunningTime="2026-02-19 21:46:32.052184614 +0000 UTC m=+1092.323627104" watchObservedRunningTime="2026-02-19 21:46:32.058407438 +0000 UTC m=+1092.329849938"
Feb 19 21:46:33 crc kubenswrapper[4771]: I0219 21:46:33.048895 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" event={"ID":"3bfa784f-72fd-4797-b5ec-ea4eaeaefffa","Type":"ContainerStarted","Data":"f32a5d896c99825181ae92672450761562750ecdd01cc1022d573d68c69f8a62"}
Feb 19 21:46:33 crc kubenswrapper[4771]: I0219 21:46:33.049355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"
Feb 19 21:46:33 crc kubenswrapper[4771]: I0219 21:46:33.052293 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" event={"ID":"bfcbf7c3-8784-4275-9838-7f6b9666e49a","Type":"ContainerStarted","Data":"bf5e8704dd0ac16615ed638c99a9441ea6c3526bcafd6a8f6fb5ce78cf7d4039"}
Feb 19 21:46:33 crc kubenswrapper[4771]: I0219 21:46:33.052477 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:46:33 crc kubenswrapper[4771]: I0219 21:46:33.069435 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq" podStartSLOduration=33.477582012 podStartE2EDuration="36.069417311s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:46:29.98301056 +0000 UTC m=+1090.254453060" lastFinishedPulling="2026-02-19 21:46:32.574845889 +0000 UTC m=+1092.846288359" observedRunningTime="2026-02-19 21:46:33.066078493 +0000 UTC m=+1093.337520973" watchObservedRunningTime="2026-02-19 21:46:33.069417311 +0000 UTC m=+1093.340859781"
Feb 19 21:46:33 crc kubenswrapper[4771]: I0219 21:46:33.105711 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb" podStartSLOduration=34.098912565 podStartE2EDuration="36.10568727s" podCreationTimestamp="2026-02-19 21:45:57 +0000 UTC" firstStartedPulling="2026-02-19 21:46:30.573142698 +0000 UTC m=+1090.844585168" lastFinishedPulling="2026-02-19 21:46:32.579917403 +0000 UTC m=+1092.851359873" observedRunningTime="2026-02-19 21:46:33.100845633 +0000 UTC m=+1093.372288123" watchObservedRunningTime="2026-02-19 21:46:33.10568727 +0000 UTC m=+1093.377129740"
Feb 19 21:46:37 crc kubenswrapper[4771]: I0219 21:46:37.755665 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-jvbwn"
Feb 19 21:46:38 crc kubenswrapper[4771]: I0219 21:46:38.056289 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mnjh4"
Feb 19 21:46:38 crc kubenswrapper[4771]: I0219 21:46:38.298962 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qfc7w"
Feb 19 21:46:39 crc kubenswrapper[4771]: I0219 21:46:39.602142 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-l6wgq"
Feb 19 21:46:40 crc kubenswrapper[4771]: I0219 21:46:40.236197 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb"
Feb 19 21:46:40 crc kubenswrapper[4771]: I0219 21:46:40.244745 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-t66d7"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.097640 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-6br42"]
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.099507 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-6br42"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.102336 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vmrfc"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.102485 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.102846 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.103540 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-6br42"]
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.108009 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.139743 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlv29\" (UniqueName: \"kubernetes.io/projected/d71cb1ec-1059-4221-9eb5-731aa8429e67-kube-api-access-vlv29\") pod \"dnsmasq-dns-855cbc58c5-6br42\" (UID: \"d71cb1ec-1059-4221-9eb5-731aa8429e67\") " pod="openstack/dnsmasq-dns-855cbc58c5-6br42"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.139802 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cb1ec-1059-4221-9eb5-731aa8429e67-config\") pod \"dnsmasq-dns-855cbc58c5-6br42\" (UID: \"d71cb1ec-1059-4221-9eb5-731aa8429e67\") " pod="openstack/dnsmasq-dns-855cbc58c5-6br42"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.140142 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-567sg"]
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.141862 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.145199 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.161063 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-567sg"]
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.241386 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-config\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.241518 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlv29\" (UniqueName: \"kubernetes.io/projected/d71cb1ec-1059-4221-9eb5-731aa8429e67-kube-api-access-vlv29\") pod \"dnsmasq-dns-855cbc58c5-6br42\" (UID: \"d71cb1ec-1059-4221-9eb5-731aa8429e67\") " pod="openstack/dnsmasq-dns-855cbc58c5-6br42"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.241554 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cb1ec-1059-4221-9eb5-731aa8429e67-config\") pod \"dnsmasq-dns-855cbc58c5-6br42\" (UID: \"d71cb1ec-1059-4221-9eb5-731aa8429e67\") " pod="openstack/dnsmasq-dns-855cbc58c5-6br42"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.241583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlxcp\" (UniqueName: \"kubernetes.io/projected/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-kube-api-access-qlxcp\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.242378 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cb1ec-1059-4221-9eb5-731aa8429e67-config\") pod \"dnsmasq-dns-855cbc58c5-6br42\" (UID: \"d71cb1ec-1059-4221-9eb5-731aa8429e67\") " pod="openstack/dnsmasq-dns-855cbc58c5-6br42"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.242459 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.258161 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlv29\" (UniqueName: \"kubernetes.io/projected/d71cb1ec-1059-4221-9eb5-731aa8429e67-kube-api-access-vlv29\") pod \"dnsmasq-dns-855cbc58c5-6br42\" (UID: \"d71cb1ec-1059-4221-9eb5-731aa8429e67\") " pod="openstack/dnsmasq-dns-855cbc58c5-6br42"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.344172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlxcp\" (UniqueName: \"kubernetes.io/projected/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-kube-api-access-qlxcp\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.344237 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.344276 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-config\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.345309 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.345448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-config\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.381311 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlxcp\" (UniqueName: \"kubernetes.io/projected/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-kube-api-access-qlxcp\") pod \"dnsmasq-dns-6fcf94d689-567sg\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.420590 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-6br42"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.460765 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-567sg"
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.931652 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-6br42"]
Feb 19 21:46:56 crc kubenswrapper[4771]: W0219 21:46:56.980322 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05eda1d1_93d5_457e_a6ba_cbee2151a8ed.slice/crio-efb9cbc5b8c0806c7191b23cd5d76b6e5ae9d8c5a9d7d51ac74b478e772af35d WatchSource:0}: Error finding container efb9cbc5b8c0806c7191b23cd5d76b6e5ae9d8c5a9d7d51ac74b478e772af35d: Status 404 returned error can't find the container with id efb9cbc5b8c0806c7191b23cd5d76b6e5ae9d8c5a9d7d51ac74b478e772af35d
Feb 19 21:46:56 crc kubenswrapper[4771]: I0219 21:46:56.980416 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-567sg"]
Feb 19 21:46:57 crc kubenswrapper[4771]: I0219 21:46:57.259848 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-567sg" event={"ID":"05eda1d1-93d5-457e-a6ba-cbee2151a8ed","Type":"ContainerStarted","Data":"efb9cbc5b8c0806c7191b23cd5d76b6e5ae9d8c5a9d7d51ac74b478e772af35d"}
Feb 19 21:46:57 crc kubenswrapper[4771]: I0219 21:46:57.261196 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-6br42" event={"ID":"d71cb1ec-1059-4221-9eb5-731aa8429e67","Type":"ContainerStarted","Data":"ee7bd2258339835b6f3138c9628e473d0563ffa96c970a47874c207bb5e530b4"}
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.263750 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-567sg"]
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.288070 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-qm262"]
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.289242 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.300031 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-qm262"]
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.386345 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-config\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.389240 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-dns-svc\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.389323 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgc5z\" (UniqueName: \"kubernetes.io/projected/6772348f-9ea9-4977-9a44-cd81be89e13a-kube-api-access-vgc5z\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.491124 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-dns-svc\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.491199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgc5z\" (UniqueName: \"kubernetes.io/projected/6772348f-9ea9-4977-9a44-cd81be89e13a-kube-api-access-vgc5z\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.491237 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-config\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.492370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-dns-svc\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.492740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-config\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.509345 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgc5z\" (UniqueName: \"kubernetes.io/projected/6772348f-9ea9-4977-9a44-cd81be89e13a-kube-api-access-vgc5z\") pod \"dnsmasq-dns-f54874ffc-qm262\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") " pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.608265 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.929184 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-6br42"]
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.946937 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-szhgs"]
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.948104 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:58 crc kubenswrapper[4771]: I0219 21:46:58.973716 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-szhgs"]
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.000763 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-dns-svc\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.000803 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-config\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.000834 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rjw\" (UniqueName: \"kubernetes.io/projected/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-kube-api-access-m6rjw\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.051954 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-qm262"]
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.101907 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rjw\" (UniqueName: \"kubernetes.io/projected/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-kube-api-access-m6rjw\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.102055 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-dns-svc\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.102079 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-config\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.102974 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-config\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.103040 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-dns-svc\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.125009 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rjw\" (UniqueName: \"kubernetes.io/projected/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-kube-api-access-m6rjw\") pod \"dnsmasq-dns-67ff45466c-szhgs\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") " pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.266646 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.289597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-qm262" event={"ID":"6772348f-9ea9-4977-9a44-cd81be89e13a","Type":"ContainerStarted","Data":"f7ba4353e9414a71ace07be115ec49c49de06b1aed93aea7357cee034402ad0c"}
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.417121 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.418709 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.420439 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.420712 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.420834 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.420955 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.424198 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z2svb"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.424513 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.424646 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.432328 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508616 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508736 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnr6x\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-kube-api-access-dnr6x\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508775 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508817 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508913 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.508997 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.509038 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0"
Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-plugins-conf\") pod \"rabbitmq-server-0\" (UID:
\"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610406 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610458 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610475 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610512 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dnr6x\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-kube-api-access-dnr6x\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610759 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610828 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610876 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610898 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.610928 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.611241 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.611509 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.611700 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.612512 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.612588 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 
21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.616258 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.616522 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.624693 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.630036 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.630311 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnr6x\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-kube-api-access-dnr6x\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.637005 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") " pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.756798 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:46:59 crc kubenswrapper[4771]: I0219 21:46:59.772504 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-szhgs"] Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.132045 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.133954 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.136655 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.137340 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.137354 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.137555 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.137617 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.137685 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.138980 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-cell1-server-dockercfg-2cf7c" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.139243 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.227985 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.301261 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-szhgs" event={"ID":"8ee7e295-142a-45e9-8a2e-6681c5bb9b17","Type":"ContainerStarted","Data":"0f39c33433eb00776cdf9c5286a12415670467de0b4ea3605df2f436802680e0"} Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.303490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c671dcf6-b1eb-4c4e-ba71-ae115ce811da","Type":"ContainerStarted","Data":"fab845b0439f6a1a16bdbb8e88d1b1dc36c4ea8558910ecfa0c68190aec76024"} Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319089 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319128 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319334 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319368 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319391 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319411 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319471 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319573 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319637 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.319653 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdldr\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-kube-api-access-mdldr\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.421345 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.421437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.421467 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.422185 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.421862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.422268 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.422300 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.422484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.422437 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.423164 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.423452 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.423558 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 
21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.430853 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.430945 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.445432 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.445474 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdldr\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-kube-api-access-mdldr\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.445529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.445547 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.447121 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.455768 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.455907 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.457119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.467591 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdldr\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-kube-api-access-mdldr\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:00 crc kubenswrapper[4771]: I0219 21:47:00.490298 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.065867 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:47:01 crc kubenswrapper[4771]: W0219 21:47:01.116970 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17cd62c3_f3af_4144_b6b1_ec2cafb424ad.slice/crio-dd0d59f4e36f9194b800f7a23d8e86cdc37a1d94be5b4b9b96d656186b179bf5 WatchSource:0}: Error finding container dd0d59f4e36f9194b800f7a23d8e86cdc37a1d94be5b4b9b96d656186b179bf5: Status 404 returned error can't find the container with id dd0d59f4e36f9194b800f7a23d8e86cdc37a1d94be5b4b9b96d656186b179bf5 Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.315758 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17cd62c3-f3af-4144-b6b1-ec2cafb424ad","Type":"ContainerStarted","Data":"dd0d59f4e36f9194b800f7a23d8e86cdc37a1d94be5b4b9b96d656186b179bf5"} Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.571685 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.573171 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.575711 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.575742 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.575824 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ljqtn"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.575742 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.577830 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.582963 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.672783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.672837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgd2h\" (UniqueName: \"kubernetes.io/projected/4368f372-87b3-4f85-95f9-72c2046b0cc7-kube-api-access-vgd2h\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.672882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.672983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.673066 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.673157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.673180 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.673246 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.776590 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.776635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.776671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.776689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.776731 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.776758 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.776781 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgd2h\" (UniqueName: \"kubernetes.io/projected/4368f372-87b3-4f85-95f9-72c2046b0cc7-kube-api-access-vgd2h\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.776811 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.777275 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.779576 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.780372 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.781142 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.781600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.783534 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.789652 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.797890 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgd2h\" (UniqueName: \"kubernetes.io/projected/4368f372-87b3-4f85-95f9-72c2046b0cc7-kube-api-access-vgd2h\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.807226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " pod="openstack/openstack-galera-0"
Feb 19 21:47:01 crc kubenswrapper[4771]: I0219 21:47:01.892865 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 19 21:47:02 crc kubenswrapper[4771]: I0219 21:47:02.908320 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 19 21:47:02 crc kubenswrapper[4771]: I0219 21:47:02.910625 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:02 crc kubenswrapper[4771]: I0219 21:47:02.912453 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Feb 19 21:47:02 crc kubenswrapper[4771]: I0219 21:47:02.912481 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-brk9q"
Feb 19 21:47:02 crc kubenswrapper[4771]: I0219 21:47:02.912927 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Feb 19 21:47:02 crc kubenswrapper[4771]: I0219 21:47:02.913190 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Feb 19 21:47:02 crc kubenswrapper[4771]: I0219 21:47:02.920226 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:02.997277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:02.997318 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:02.997392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:02.997430 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:02.997453 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7d4v\" (UniqueName: \"kubernetes.io/projected/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kube-api-access-h7d4v\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:02.997490 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:02.997609 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:02.997748 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102475 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102535 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102563 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102585 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7d4v\" (UniqueName: \"kubernetes.io/projected/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kube-api-access-h7d4v\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102640 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102737 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.102920 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.103078 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.103504 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.103809 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.104250 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.108720 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.118732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7d4v\" (UniqueName: \"kubernetes.io/projected/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kube-api-access-h7d4v\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.122614 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.126741 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.248304 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.249477 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.253227 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.253377 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-sbhh4"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.253508 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.257259 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.277074 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.306098 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb4c5\" (UniqueName: \"kubernetes.io/projected/1c812f8b-6ad1-4873-8999-e649acd07d91-kube-api-access-rb4c5\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.306139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.306192 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-kolla-config\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.306369 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.306435 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-config-data\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.407672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb4c5\" (UniqueName: \"kubernetes.io/projected/1c812f8b-6ad1-4873-8999-e649acd07d91-kube-api-access-rb4c5\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.408081 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.408752 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-kolla-config\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.408799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.408844 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-config-data\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.409547 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-kolla-config\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.409718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-config-data\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.414555 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.420350 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.421976 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb4c5\" (UniqueName: \"kubernetes.io/projected/1c812f8b-6ad1-4873-8999-e649acd07d91-kube-api-access-rb4c5\") pod \"memcached-0\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " pod="openstack/memcached-0"
Feb 19 21:47:03 crc kubenswrapper[4771]: I0219 21:47:03.567572 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 19 21:47:05 crc kubenswrapper[4771]: I0219 21:47:05.295934 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:47:05 crc kubenswrapper[4771]: I0219 21:47:05.297348 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 21:47:05 crc kubenswrapper[4771]: I0219 21:47:05.299885 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zmx8l"
Feb 19 21:47:05 crc kubenswrapper[4771]: I0219 21:47:05.310977 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:47:05 crc kubenswrapper[4771]: I0219 21:47:05.335168 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb96l\" (UniqueName: \"kubernetes.io/projected/b8ee33e7-850d-465e-9add-481bfcbdd6b4-kube-api-access-lb96l\") pod \"kube-state-metrics-0\" (UID: \"b8ee33e7-850d-465e-9add-481bfcbdd6b4\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:47:05 crc kubenswrapper[4771]: I0219 21:47:05.436340 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb96l\" (UniqueName: \"kubernetes.io/projected/b8ee33e7-850d-465e-9add-481bfcbdd6b4-kube-api-access-lb96l\") pod \"kube-state-metrics-0\" (UID: \"b8ee33e7-850d-465e-9add-481bfcbdd6b4\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:47:05 crc kubenswrapper[4771]: I0219 21:47:05.460866 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb96l\" (UniqueName: \"kubernetes.io/projected/b8ee33e7-850d-465e-9add-481bfcbdd6b4-kube-api-access-lb96l\") pod \"kube-state-metrics-0\" (UID: \"b8ee33e7-850d-465e-9add-481bfcbdd6b4\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:47:05 crc kubenswrapper[4771]: I0219 21:47:05.627213 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.137779 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-js7lh"]
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.139156 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.143837 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-tzmpp"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.144156 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.145935 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-94qm4"]
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.147470 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.149403 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.168428 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-js7lh"]
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.185085 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-94qm4"]
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.247587 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.248929 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.254473 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.254645 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.254691 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.254641 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7sr8m"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.255698 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.257584 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336222 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-etc-ovs\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336276 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5tj\" (UniqueName: \"kubernetes.io/projected/d016daf3-054a-4914-8711-fc82edab9f88-kube-api-access-8g5tj\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336310 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-combined-ca-bundle\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336327 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-ovn-controller-tls-certs\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336356 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336377 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-run\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nf9\" (UniqueName: \"kubernetes.io/projected/a53e8d42-c35f-4ea0-9314-f4d849908089-kube-api-access-88nf9\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336417 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-log-ovn\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336432 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run-ovn\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336447 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-log\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336487 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-lib\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d016daf3-054a-4914-8711-fc82edab9f88-scripts\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.336527 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e8d42-c35f-4ea0-9314-f4d849908089-scripts\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438146 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438218 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-log-ovn\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438253 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run-ovn\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-log\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438299 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-lib\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438367 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-etc-ovs\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4"
Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438385 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5tj\" (UniqueName: \"kubernetes.io/projected/d016daf3-054a-4914-8711-fc82edab9f88-kube-api-access-8g5tj\") pod \"ovn-controller-js7lh\" (UID: 
\"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438426 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-combined-ca-bundle\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438441 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438463 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438496 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438527 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-run\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 
21:47:10.438561 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nf9\" (UniqueName: \"kubernetes.io/projected/a53e8d42-c35f-4ea0-9314-f4d849908089-kube-api-access-88nf9\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d016daf3-054a-4914-8711-fc82edab9f88-scripts\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e8d42-c35f-4ea0-9314-f4d849908089-scripts\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438672 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8xr7\" (UniqueName: \"kubernetes.io/projected/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-kube-api-access-k8xr7\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438693 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-config\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438724 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.438752 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-ovn-controller-tls-certs\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.439397 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-log\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.439799 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-lib\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.439835 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-log-ovn\") pod 
\"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.439976 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run-ovn\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.440076 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.440095 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-run\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.444293 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-etc-ovs\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.444351 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e8d42-c35f-4ea0-9314-f4d849908089-scripts\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.444949 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d016daf3-054a-4914-8711-fc82edab9f88-scripts\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.449912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-ovn-controller-tls-certs\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.450043 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-combined-ca-bundle\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.454141 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nf9\" (UniqueName: \"kubernetes.io/projected/a53e8d42-c35f-4ea0-9314-f4d849908089-kube-api-access-88nf9\") pod \"ovn-controller-ovs-94qm4\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.458177 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5tj\" (UniqueName: \"kubernetes.io/projected/d016daf3-054a-4914-8711-fc82edab9f88-kube-api-access-8g5tj\") pod \"ovn-controller-js7lh\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.467811 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-js7lh" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.485933 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.540276 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8xr7\" (UniqueName: \"kubernetes.io/projected/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-kube-api-access-k8xr7\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.540323 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-config\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.540343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.540415 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.540480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.540553 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.540587 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.540642 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.541929 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.542404 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.543385 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-config\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.543846 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.547274 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.549391 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.557441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.559566 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8xr7\" (UniqueName: \"kubernetes.io/projected/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-kube-api-access-k8xr7\") pod 
\"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.579549 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:10 crc kubenswrapper[4771]: I0219 21:47:10.875055 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.411703 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.414093 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.423601 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.423741 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.424268 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-l9kkt" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.424490 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.470500 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.581764 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-config\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.581946 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.582027 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.582117 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.582148 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tmwv\" (UniqueName: \"kubernetes.io/projected/9d59b620-a9f9-4539-98e2-a4ad4d97d442-kube-api-access-9tmwv\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.582207 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.582229 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.582301 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.683693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.683739 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.683769 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.683784 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tmwv\" (UniqueName: \"kubernetes.io/projected/9d59b620-a9f9-4539-98e2-a4ad4d97d442-kube-api-access-9tmwv\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.683818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.683841 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.683876 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.683923 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-config\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.684395 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.684701 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-config\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.685399 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.685882 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.689824 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.689863 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " 
pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.690627 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.707664 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.718127 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tmwv\" (UniqueName: \"kubernetes.io/projected/9d59b620-a9f9-4539-98e2-a4ad4d97d442-kube-api-access-9tmwv\") pod \"ovsdbserver-sb-0\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:12 crc kubenswrapper[4771]: I0219 21:47:12.788133 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:18 crc kubenswrapper[4771]: E0219 21:47:18.679405 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76" Feb 19 21:47:18 crc kubenswrapper[4771]: E0219 21:47:18.680192 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnr6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(c671dcf6-b1eb-4c4e-ba71-ae115ce811da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:47:18 crc 
kubenswrapper[4771]: E0219 21:47:18.681452 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.490218 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.490359 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6rjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-szhgs_openstack(8ee7e295-142a-45e9-8a2e-6681c5bb9b17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.491758 4771 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-szhgs" podUID="8ee7e295-142a-45e9-8a2e-6681c5bb9b17" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.524194 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.524363 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vlv29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-6br42_openstack(d71cb1ec-1059-4221-9eb5-731aa8429e67): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.524605 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76\\\"\"" pod="openstack/rabbitmq-server-0" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.525544 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-855cbc58c5-6br42" podUID="d71cb1ec-1059-4221-9eb5-731aa8429e67" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.599081 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.600348 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qlxcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-567sg_openstack(05eda1d1-93d5-457e-a6ba-cbee2151a8ed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.602162 4771 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-567sg" podUID="05eda1d1-93d5-457e-a6ba-cbee2151a8ed" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.653368 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.653683 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vgc5z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f54874ffc-qm262_openstack(6772348f-9ea9-4977-9a44-cd81be89e13a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:47:19 crc kubenswrapper[4771]: E0219 21:47:19.654999 4771 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-f54874ffc-qm262" podUID="6772348f-9ea9-4977-9a44-cd81be89e13a" Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.191788 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.214122 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.245715 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.418099 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:47:20 crc kubenswrapper[4771]: W0219 21:47:20.420966 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fbc55cb_d769_4fa4_a18e_a8ba6234fc0d.slice/crio-4f68625c29b2b8b745321c052292050f23abb03d1b28fba389eca7332250cc3c WatchSource:0}: Error finding container 4f68625c29b2b8b745321c052292050f23abb03d1b28fba389eca7332250cc3c: Status 404 returned error can't find the container with id 4f68625c29b2b8b745321c052292050f23abb03d1b28fba389eca7332250cc3c Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.485033 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.485283 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-js7lh"] Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.488233 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-94qm4"] Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.499858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/memcached-0" event={"ID":"1c812f8b-6ad1-4873-8999-e649acd07d91","Type":"ContainerStarted","Data":"3dd228c80bb649fb6014c610a3c0ae5edc562d347d6ed009d5224811b3a7967b"} Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.504742 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh" event={"ID":"d016daf3-054a-4914-8711-fc82edab9f88","Type":"ContainerStarted","Data":"a9f5f6c807a52b1ed4077a6997a9fd7cd8e3aca97fcd66b27da51717fbcad3cd"} Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.504768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4368f372-87b3-4f85-95f9-72c2046b0cc7","Type":"ContainerStarted","Data":"26afb60f64e505e9dcd57b481fa42cfb8da3ba5693b405f2a7d85720dc87b465"} Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.514207 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2aa7dd4a-4295-4120-bff7-fc2c82f76aed","Type":"ContainerStarted","Data":"f5593b65aa4e9e23a17ff553a791eebc4181c912f7a11d4093a4c8d42397715e"} Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.515448 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b8ee33e7-850d-465e-9add-481bfcbdd6b4","Type":"ContainerStarted","Data":"2192e5a5bc14fa5d1baeecb5533857cd4e28b57a9357bf67150f8409c7bb91ea"} Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.520289 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d","Type":"ContainerStarted","Data":"4f68625c29b2b8b745321c052292050f23abb03d1b28fba389eca7332250cc3c"} Feb 19 21:47:20 crc kubenswrapper[4771]: E0219 21:47:20.530383 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-szhgs" podUID="8ee7e295-142a-45e9-8a2e-6681c5bb9b17" Feb 19 21:47:20 crc kubenswrapper[4771]: E0219 21:47:20.530628 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-f54874ffc-qm262" podUID="6772348f-9ea9-4977-9a44-cd81be89e13a" Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.887893 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-567sg" Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.914030 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-6br42" Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.951594 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlxcp\" (UniqueName: \"kubernetes.io/projected/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-kube-api-access-qlxcp\") pod \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.951666 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cb1ec-1059-4221-9eb5-731aa8429e67-config\") pod \"d71cb1ec-1059-4221-9eb5-731aa8429e67\" (UID: \"d71cb1ec-1059-4221-9eb5-731aa8429e67\") " Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.951695 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlv29\" (UniqueName: 
\"kubernetes.io/projected/d71cb1ec-1059-4221-9eb5-731aa8429e67-kube-api-access-vlv29\") pod \"d71cb1ec-1059-4221-9eb5-731aa8429e67\" (UID: \"d71cb1ec-1059-4221-9eb5-731aa8429e67\") " Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.951775 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-dns-svc\") pod \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.951793 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-config\") pod \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\" (UID: \"05eda1d1-93d5-457e-a6ba-cbee2151a8ed\") " Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.952578 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-config" (OuterVolumeSpecName: "config") pod "05eda1d1-93d5-457e-a6ba-cbee2151a8ed" (UID: "05eda1d1-93d5-457e-a6ba-cbee2151a8ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.953480 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05eda1d1-93d5-457e-a6ba-cbee2151a8ed" (UID: "05eda1d1-93d5-457e-a6ba-cbee2151a8ed"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.953513 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d71cb1ec-1059-4221-9eb5-731aa8429e67-config" (OuterVolumeSpecName: "config") pod "d71cb1ec-1059-4221-9eb5-731aa8429e67" (UID: "d71cb1ec-1059-4221-9eb5-731aa8429e67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.960193 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-kube-api-access-qlxcp" (OuterVolumeSpecName: "kube-api-access-qlxcp") pod "05eda1d1-93d5-457e-a6ba-cbee2151a8ed" (UID: "05eda1d1-93d5-457e-a6ba-cbee2151a8ed"). InnerVolumeSpecName "kube-api-access-qlxcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:20 crc kubenswrapper[4771]: I0219 21:47:20.960316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71cb1ec-1059-4221-9eb5-731aa8429e67-kube-api-access-vlv29" (OuterVolumeSpecName: "kube-api-access-vlv29") pod "d71cb1ec-1059-4221-9eb5-731aa8429e67" (UID: "d71cb1ec-1059-4221-9eb5-731aa8429e67"). InnerVolumeSpecName "kube-api-access-vlv29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.054088 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlv29\" (UniqueName: \"kubernetes.io/projected/d71cb1ec-1059-4221-9eb5-731aa8429e67-kube-api-access-vlv29\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.054125 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.054138 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.054150 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlxcp\" (UniqueName: \"kubernetes.io/projected/05eda1d1-93d5-457e-a6ba-cbee2151a8ed-kube-api-access-qlxcp\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.054165 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cb1ec-1059-4221-9eb5-731aa8429e67-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.534085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-567sg" event={"ID":"05eda1d1-93d5-457e-a6ba-cbee2151a8ed","Type":"ContainerDied","Data":"efb9cbc5b8c0806c7191b23cd5d76b6e5ae9d8c5a9d7d51ac74b478e772af35d"} Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.539302 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-94qm4" 
event={"ID":"a53e8d42-c35f-4ea0-9314-f4d849908089","Type":"ContainerStarted","Data":"b468a81d180a3db9ea5135228c181090cc908a8d5b06793a384acb7c67cc1639"} Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.539341 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-6br42" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.539379 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-6br42" event={"ID":"d71cb1ec-1059-4221-9eb5-731aa8429e67","Type":"ContainerDied","Data":"ee7bd2258339835b6f3138c9628e473d0563ffa96c970a47874c207bb5e530b4"} Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.539463 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-567sg" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.607060 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:47:21 crc kubenswrapper[4771]: W0219 21:47:21.611652 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d59b620_a9f9_4539_98e2_a4ad4d97d442.slice/crio-7addba05b8e742f3985c3a196710e220c7dbb9b61feaf9e7ae1906d6b18daf76 WatchSource:0}: Error finding container 7addba05b8e742f3985c3a196710e220c7dbb9b61feaf9e7ae1906d6b18daf76: Status 404 returned error can't find the container with id 7addba05b8e742f3985c3a196710e220c7dbb9b61feaf9e7ae1906d6b18daf76 Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.633768 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-567sg"] Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.637506 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-567sg"] Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.678801 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-855cbc58c5-6br42"] Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.699060 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-6br42"] Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.952144 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cbm2n"] Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.954011 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:47:21 crc kubenswrapper[4771]: I0219 21:47:21.964854 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.018898 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cbm2n"] Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.082032 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.082206 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovn-rundir\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.082302 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qft\" (UniqueName: 
\"kubernetes.io/projected/5e98e55e-96cb-4936-8220-18db4047873e-kube-api-access-x9qft\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.082440 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-combined-ca-bundle\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.082573 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e98e55e-96cb-4936-8220-18db4047873e-config\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.082594 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovs-rundir\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.110472 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-szhgs"] Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.133808 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-h4fkt"] Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.135065 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.142281 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.181420 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-h4fkt"]
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.183779 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-ovsdbserver-nb\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.183852 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qft\" (UniqueName: \"kubernetes.io/projected/5e98e55e-96cb-4936-8220-18db4047873e-kube-api-access-x9qft\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.183922 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-combined-ca-bundle\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.183970 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-config\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.183991 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4tlj\" (UniqueName: \"kubernetes.io/projected/73b6b19d-83db-492c-9c23-a435daaad2be-kube-api-access-d4tlj\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.184011 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-dns-svc\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.184055 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e98e55e-96cb-4936-8220-18db4047873e-config\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.184073 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovs-rundir\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.184101 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.184143 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovn-rundir\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.184570 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovn-rundir\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.187225 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovs-rundir\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.188513 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e98e55e-96cb-4936-8220-18db4047873e-config\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.191304 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.196490 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-combined-ca-bundle\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.211081 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qft\" (UniqueName: \"kubernetes.io/projected/5e98e55e-96cb-4936-8220-18db4047873e-kube-api-access-x9qft\") pod \"ovn-controller-metrics-cbm2n\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.280088 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-qm262"]
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.285481 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-config\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.285529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4tlj\" (UniqueName: \"kubernetes.io/projected/73b6b19d-83db-492c-9c23-a435daaad2be-kube-api-access-d4tlj\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.285550 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-dns-svc\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.285599 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-ovsdbserver-nb\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.286363 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-ovsdbserver-nb\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.286943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-config\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.287273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-dns-svc\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.288052 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cbm2n"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.307655 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4tlj\" (UniqueName: \"kubernetes.io/projected/73b6b19d-83db-492c-9c23-a435daaad2be-kube-api-access-d4tlj\") pod \"dnsmasq-dns-64f7f48db9-h4fkt\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.358211 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-nzwmx"]
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.360949 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.364302 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.367132 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-nzwmx"]
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.446604 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.448388 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05eda1d1-93d5-457e-a6ba-cbee2151a8ed" path="/var/lib/kubelet/pods/05eda1d1-93d5-457e-a6ba-cbee2151a8ed/volumes"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.448832 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71cb1ec-1059-4221-9eb5-731aa8429e67" path="/var/lib/kubelet/pods/d71cb1ec-1059-4221-9eb5-731aa8429e67/volumes"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.462917 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.487632 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6rjw\" (UniqueName: \"kubernetes.io/projected/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-kube-api-access-m6rjw\") pod \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") "
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.487710 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-dns-svc\") pod \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") "
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.487775 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-config\") pod \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\" (UID: \"8ee7e295-142a-45e9-8a2e-6681c5bb9b17\") "
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.487993 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-config\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.488032 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-dns-svc\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.488093 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-nb\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.488132 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-sb\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.488158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7c5h\" (UniqueName: \"kubernetes.io/projected/f908c73c-f1fb-4e81-9441-172192216b2d-kube-api-access-n7c5h\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.490376 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-kube-api-access-m6rjw" (OuterVolumeSpecName: "kube-api-access-m6rjw") pod "8ee7e295-142a-45e9-8a2e-6681c5bb9b17" (UID: "8ee7e295-142a-45e9-8a2e-6681c5bb9b17"). InnerVolumeSpecName "kube-api-access-m6rjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.501531 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-config" (OuterVolumeSpecName: "config") pod "8ee7e295-142a-45e9-8a2e-6681c5bb9b17" (UID: "8ee7e295-142a-45e9-8a2e-6681c5bb9b17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.501525 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8ee7e295-142a-45e9-8a2e-6681c5bb9b17" (UID: "8ee7e295-142a-45e9-8a2e-6681c5bb9b17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.548182 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-szhgs" event={"ID":"8ee7e295-142a-45e9-8a2e-6681c5bb9b17","Type":"ContainerDied","Data":"0f39c33433eb00776cdf9c5286a12415670467de0b4ea3605df2f436802680e0"}
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.548285 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-szhgs"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.563900 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d59b620-a9f9-4539-98e2-a4ad4d97d442","Type":"ContainerStarted","Data":"7addba05b8e742f3985c3a196710e220c7dbb9b61feaf9e7ae1906d6b18daf76"}
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.596539 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-config\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.596588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-dns-svc\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.596666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-nb\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.596716 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-sb\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.596749 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7c5h\" (UniqueName: \"kubernetes.io/projected/f908c73c-f1fb-4e81-9441-172192216b2d-kube-api-access-n7c5h\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.596832 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6rjw\" (UniqueName: \"kubernetes.io/projected/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-kube-api-access-m6rjw\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.596848 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.596860 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ee7e295-142a-45e9-8a2e-6681c5bb9b17-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.598348 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-config\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.599403 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-dns-svc\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.600119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-nb\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.600679 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-sb\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.604052 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-szhgs"]
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.607202 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-szhgs"]
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.612699 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7c5h\" (UniqueName: \"kubernetes.io/projected/f908c73c-f1fb-4e81-9441-172192216b2d-kube-api-access-n7c5h\") pod \"dnsmasq-dns-56df986d9c-nzwmx\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.646056 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.698470 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-config\") pod \"6772348f-9ea9-4977-9a44-cd81be89e13a\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") "
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.698633 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgc5z\" (UniqueName: \"kubernetes.io/projected/6772348f-9ea9-4977-9a44-cd81be89e13a-kube-api-access-vgc5z\") pod \"6772348f-9ea9-4977-9a44-cd81be89e13a\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") "
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.698736 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-dns-svc\") pod \"6772348f-9ea9-4977-9a44-cd81be89e13a\" (UID: \"6772348f-9ea9-4977-9a44-cd81be89e13a\") "
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.700116 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-config" (OuterVolumeSpecName: "config") pod "6772348f-9ea9-4977-9a44-cd81be89e13a" (UID: "6772348f-9ea9-4977-9a44-cd81be89e13a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.700326 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6772348f-9ea9-4977-9a44-cd81be89e13a" (UID: "6772348f-9ea9-4977-9a44-cd81be89e13a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.703772 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6772348f-9ea9-4977-9a44-cd81be89e13a-kube-api-access-vgc5z" (OuterVolumeSpecName: "kube-api-access-vgc5z") pod "6772348f-9ea9-4977-9a44-cd81be89e13a" (UID: "6772348f-9ea9-4977-9a44-cd81be89e13a"). InnerVolumeSpecName "kube-api-access-vgc5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.713655 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.801810 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgc5z\" (UniqueName: \"kubernetes.io/projected/6772348f-9ea9-4977-9a44-cd81be89e13a-kube-api-access-vgc5z\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.801845 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.801855 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6772348f-9ea9-4977-9a44-cd81be89e13a-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.806330 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cbm2n"]
Feb 19 21:47:22 crc kubenswrapper[4771]: I0219 21:47:22.945908 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-h4fkt"]
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.198430 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-nzwmx"]
Feb 19 21:47:23 crc kubenswrapper[4771]: W0219 21:47:23.206748 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf908c73c_f1fb_4e81_9441_172192216b2d.slice/crio-9455936ca07baee2d9224a2bb198b9a00060c562b1d758e76d9aa68f4eebd483 WatchSource:0}: Error finding container 9455936ca07baee2d9224a2bb198b9a00060c562b1d758e76d9aa68f4eebd483: Status 404 returned error can't find the container with id 9455936ca07baee2d9224a2bb198b9a00060c562b1d758e76d9aa68f4eebd483
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.572791 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-qm262"
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.572788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-qm262" event={"ID":"6772348f-9ea9-4977-9a44-cd81be89e13a","Type":"ContainerDied","Data":"f7ba4353e9414a71ace07be115ec49c49de06b1aed93aea7357cee034402ad0c"}
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.573798 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" event={"ID":"f908c73c-f1fb-4e81-9441-172192216b2d","Type":"ContainerStarted","Data":"9455936ca07baee2d9224a2bb198b9a00060c562b1d758e76d9aa68f4eebd483"}
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.575350 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" event={"ID":"73b6b19d-83db-492c-9c23-a435daaad2be","Type":"ContainerStarted","Data":"cce919bbb18d10feca230b624fb2260b7bc05dcb4a8563e039bc607a4e54b410"}
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.576807 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cbm2n" event={"ID":"5e98e55e-96cb-4936-8220-18db4047873e","Type":"ContainerStarted","Data":"f864a1f2e7e627f7cad94478d02f1b0dfde0bba863d56d5cc3ee767858da1331"}
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.579076 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17cd62c3-f3af-4144-b6b1-ec2cafb424ad","Type":"ContainerStarted","Data":"dcc43673db211834491a0c5c33660484520c0820b9253defee749bead7148c83"}
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.680118 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-qm262"]
Feb 19 21:47:23 crc kubenswrapper[4771]: I0219 21:47:23.687263 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-qm262"]
Feb 19 21:47:24 crc kubenswrapper[4771]: I0219 21:47:24.454818 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6772348f-9ea9-4977-9a44-cd81be89e13a" path="/var/lib/kubelet/pods/6772348f-9ea9-4977-9a44-cd81be89e13a/volumes"
Feb 19 21:47:24 crc kubenswrapper[4771]: I0219 21:47:24.455910 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee7e295-142a-45e9-8a2e-6681c5bb9b17" path="/var/lib/kubelet/pods/8ee7e295-142a-45e9-8a2e-6681c5bb9b17/volumes"
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.663212 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d59b620-a9f9-4539-98e2-a4ad4d97d442","Type":"ContainerStarted","Data":"44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.664652 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-94qm4" event={"ID":"a53e8d42-c35f-4ea0-9314-f4d849908089","Type":"ContainerStarted","Data":"3fa5c85f92f572849f551738e161a886712697c5a2743885746274fbacd4eff4"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.667107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b8ee33e7-850d-465e-9add-481bfcbdd6b4","Type":"ContainerStarted","Data":"902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.667504 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.669072 4771 generic.go:334] "Generic (PLEG): container finished" podID="73b6b19d-83db-492c-9c23-a435daaad2be" containerID="46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149" exitCode=0
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.669111 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" event={"ID":"73b6b19d-83db-492c-9c23-a435daaad2be","Type":"ContainerDied","Data":"46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.670679 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d","Type":"ContainerStarted","Data":"714fd91309191f6caf2b8f7b6db8af29e9a164f11470713344c8212bb188178d"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.672483 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cbm2n" event={"ID":"5e98e55e-96cb-4936-8220-18db4047873e","Type":"ContainerStarted","Data":"f081b25aaea300f6e060b204c908ec660e0941d0cadd8d054f4c1cb78c13b8e8"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.674146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2aa7dd4a-4295-4120-bff7-fc2c82f76aed","Type":"ContainerStarted","Data":"989780e22c1c2d03db82d8a1c9b2f4054eb65906251a7cbf79134c1b65f185a6"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.676404 4771 generic.go:334] "Generic (PLEG): container finished" podID="f908c73c-f1fb-4e81-9441-172192216b2d" containerID="4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c" exitCode=0
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.676547 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" event={"ID":"f908c73c-f1fb-4e81-9441-172192216b2d","Type":"ContainerDied","Data":"4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.678117 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1c812f8b-6ad1-4873-8999-e649acd07d91","Type":"ContainerStarted","Data":"afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.678573 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.679937 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh" event={"ID":"d016daf3-054a-4914-8711-fc82edab9f88","Type":"ContainerStarted","Data":"eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.680511 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-js7lh"
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.681704 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4368f372-87b3-4f85-95f9-72c2046b0cc7","Type":"ContainerStarted","Data":"ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7"}
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.839854 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.613390712 podStartE2EDuration="30.839837446s" podCreationTimestamp="2026-02-19 21:47:03 +0000 UTC" firstStartedPulling="2026-02-19 21:47:20.255194821 +0000 UTC m=+1140.526637291" lastFinishedPulling="2026-02-19 21:47:32.481641555 +0000 UTC m=+1152.753084025" observedRunningTime="2026-02-19 21:47:33.83808396 +0000 UTC m=+1154.109526440" watchObservedRunningTime="2026-02-19 21:47:33.839837446 +0000 UTC m=+1154.111279916"
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.866172 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.996517426 podStartE2EDuration="28.866154462s" podCreationTimestamp="2026-02-19 21:47:05 +0000 UTC" firstStartedPulling="2026-02-19 21:47:20.241966131 +0000 UTC m=+1140.513408601" lastFinishedPulling="2026-02-19 21:47:33.111603167 +0000 UTC m=+1153.383045637" observedRunningTime="2026-02-19 21:47:33.864889348 +0000 UTC m=+1154.136331818" watchObservedRunningTime="2026-02-19 21:47:33.866154462 +0000 UTC m=+1154.137596932"
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.884623 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-js7lh" podStartSLOduration=11.367749427 podStartE2EDuration="23.884605739s" podCreationTimestamp="2026-02-19 21:47:10 +0000 UTC" firstStartedPulling="2026-02-19 21:47:20.492552874 +0000 UTC m=+1140.763995344" lastFinishedPulling="2026-02-19 21:47:33.009409186 +0000 UTC m=+1153.280851656" observedRunningTime="2026-02-19 21:47:33.881950349 +0000 UTC m=+1154.153392819" watchObservedRunningTime="2026-02-19 21:47:33.884605739 +0000 UTC m=+1154.156048199"
Feb 19 21:47:33 crc kubenswrapper[4771]: I0219 21:47:33.962928 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cbm2n" podStartSLOduration=2.772505862 podStartE2EDuration="12.962910529s" podCreationTimestamp="2026-02-19 21:47:21 +0000 UTC" firstStartedPulling="2026-02-19 21:47:22.816113772 +0000 UTC m=+1143.087556252" lastFinishedPulling="2026-02-19 21:47:33.006518439 +0000 UTC m=+1153.277960919" observedRunningTime="2026-02-19 21:47:33.92017869 +0000 UTC m=+1154.191621170" watchObservedRunningTime="2026-02-19 21:47:33.962910529 +0000 UTC m=+1154.234352999"
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.693693 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" event={"ID":"73b6b19d-83db-492c-9c23-a435daaad2be","Type":"ContainerStarted","Data":"d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c"}
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.693960 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt"
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.696509 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d","Type":"ContainerStarted","Data":"a6a3ea0a05c4cc921bbd48b56d16a1efbe4f22bdc52aaa4b06df43fbc2c61851"}
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.699488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d59b620-a9f9-4539-98e2-a4ad4d97d442","Type":"ContainerStarted","Data":"a0ca6423b8cc74730d8f188d3660eb639df435604c4bb4ca1789f192ebeca7cd"}
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.701709 4771 generic.go:334] "Generic (PLEG): container finished" podID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerID="3fa5c85f92f572849f551738e161a886712697c5a2743885746274fbacd4eff4" exitCode=0
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.701866 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-94qm4" event={"ID":"a53e8d42-c35f-4ea0-9314-f4d849908089","Type":"ContainerDied","Data":"3fa5c85f92f572849f551738e161a886712697c5a2743885746274fbacd4eff4"}
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.705901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" event={"ID":"f908c73c-f1fb-4e81-9441-172192216b2d","Type":"ContainerStarted","Data":"7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686"}
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.705950 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx"
Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.757536 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.683911098 podStartE2EDuration="25.757515843s" podCreationTimestamp="2026-02-19 21:47:09 +0000 UTC" firstStartedPulling="2026-02-19 21:47:20.433887794 +0000 UTC m=+1140.705330254" lastFinishedPulling="2026-02-19 21:47:32.507492529 +0000 UTC m=+1152.778934999"
observedRunningTime="2026-02-19 21:47:34.74905869 +0000 UTC m=+1155.020501160" watchObservedRunningTime="2026-02-19 21:47:34.757515843 +0000 UTC m=+1155.028958323" Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.757987 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" podStartSLOduration=3.096016225 podStartE2EDuration="12.757980675s" podCreationTimestamp="2026-02-19 21:47:22 +0000 UTC" firstStartedPulling="2026-02-19 21:47:22.947894035 +0000 UTC m=+1143.219336545" lastFinishedPulling="2026-02-19 21:47:32.609858495 +0000 UTC m=+1152.881300995" observedRunningTime="2026-02-19 21:47:34.722475697 +0000 UTC m=+1154.993918217" watchObservedRunningTime="2026-02-19 21:47:34.757980675 +0000 UTC m=+1155.029423155" Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.789164 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" podStartSLOduration=2.992142389 podStartE2EDuration="12.789135608s" podCreationTimestamp="2026-02-19 21:47:22 +0000 UTC" firstStartedPulling="2026-02-19 21:47:23.209537591 +0000 UTC m=+1143.480980061" lastFinishedPulling="2026-02-19 21:47:33.00653081 +0000 UTC m=+1153.277973280" observedRunningTime="2026-02-19 21:47:34.77481743 +0000 UTC m=+1155.046259960" watchObservedRunningTime="2026-02-19 21:47:34.789135608 +0000 UTC m=+1155.060578098" Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.861256 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.477776763 podStartE2EDuration="23.861234925s" podCreationTimestamp="2026-02-19 21:47:11 +0000 UTC" firstStartedPulling="2026-02-19 21:47:21.625614105 +0000 UTC m=+1141.897056585" lastFinishedPulling="2026-02-19 21:47:33.009072277 +0000 UTC m=+1153.280514747" observedRunningTime="2026-02-19 21:47:34.844359259 +0000 UTC m=+1155.115801739" watchObservedRunningTime="2026-02-19 
21:47:34.861234925 +0000 UTC m=+1155.132677415" Feb 19 21:47:34 crc kubenswrapper[4771]: I0219 21:47:34.875496 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:35 crc kubenswrapper[4771]: I0219 21:47:35.717975 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-94qm4" event={"ID":"a53e8d42-c35f-4ea0-9314-f4d849908089","Type":"ContainerStarted","Data":"daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05"} Feb 19 21:47:35 crc kubenswrapper[4771]: I0219 21:47:35.718681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-94qm4" event={"ID":"a53e8d42-c35f-4ea0-9314-f4d849908089","Type":"ContainerStarted","Data":"e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f"} Feb 19 21:47:35 crc kubenswrapper[4771]: I0219 21:47:35.718712 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:35 crc kubenswrapper[4771]: I0219 21:47:35.718733 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:47:35 crc kubenswrapper[4771]: I0219 21:47:35.720744 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c671dcf6-b1eb-4c4e-ba71-ae115ce811da","Type":"ContainerStarted","Data":"7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9"} Feb 19 21:47:35 crc kubenswrapper[4771]: I0219 21:47:35.760679 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-94qm4" podStartSLOduration=13.850086193 podStartE2EDuration="25.760614747s" podCreationTimestamp="2026-02-19 21:47:10 +0000 UTC" firstStartedPulling="2026-02-19 21:47:20.571137282 +0000 UTC m=+1140.842579752" lastFinishedPulling="2026-02-19 21:47:32.481665836 +0000 UTC m=+1152.753108306" observedRunningTime="2026-02-19 
21:47:35.752520033 +0000 UTC m=+1156.023962523" watchObservedRunningTime="2026-02-19 21:47:35.760614747 +0000 UTC m=+1156.032057257" Feb 19 21:47:35 crc kubenswrapper[4771]: I0219 21:47:35.876146 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:36 crc kubenswrapper[4771]: I0219 21:47:36.789471 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:36 crc kubenswrapper[4771]: I0219 21:47:36.847779 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:37 crc kubenswrapper[4771]: I0219 21:47:37.739343 4771 generic.go:334] "Generic (PLEG): container finished" podID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" containerID="989780e22c1c2d03db82d8a1c9b2f4054eb65906251a7cbf79134c1b65f185a6" exitCode=0 Feb 19 21:47:37 crc kubenswrapper[4771]: I0219 21:47:37.739424 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2aa7dd4a-4295-4120-bff7-fc2c82f76aed","Type":"ContainerDied","Data":"989780e22c1c2d03db82d8a1c9b2f4054eb65906251a7cbf79134c1b65f185a6"} Feb 19 21:47:37 crc kubenswrapper[4771]: I0219 21:47:37.742122 4771 generic.go:334] "Generic (PLEG): container finished" podID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerID="ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7" exitCode=0 Feb 19 21:47:37 crc kubenswrapper[4771]: I0219 21:47:37.742275 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4368f372-87b3-4f85-95f9-72c2046b0cc7","Type":"ContainerDied","Data":"ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7"} Feb 19 21:47:37 crc kubenswrapper[4771]: I0219 21:47:37.742879 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:37 crc kubenswrapper[4771]: I0219 21:47:37.935703 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:38 crc kubenswrapper[4771]: I0219 21:47:38.569437 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 21:47:38 crc kubenswrapper[4771]: I0219 21:47:38.754282 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2aa7dd4a-4295-4120-bff7-fc2c82f76aed","Type":"ContainerStarted","Data":"8bb34e090626c241dae7a24235616f7cf7f7c5d4a8bd8d21f038334a18fdd9f1"} Feb 19 21:47:38 crc kubenswrapper[4771]: I0219 21:47:38.757252 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4368f372-87b3-4f85-95f9-72c2046b0cc7","Type":"ContainerStarted","Data":"c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9"} Feb 19 21:47:38 crc kubenswrapper[4771]: I0219 21:47:38.788098 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.99766037 podStartE2EDuration="37.788078681s" podCreationTimestamp="2026-02-19 21:47:01 +0000 UTC" firstStartedPulling="2026-02-19 21:47:20.218667506 +0000 UTC m=+1140.490109976" lastFinishedPulling="2026-02-19 21:47:33.009085807 +0000 UTC m=+1153.280528287" observedRunningTime="2026-02-19 21:47:38.78462301 +0000 UTC m=+1159.056065480" watchObservedRunningTime="2026-02-19 21:47:38.788078681 +0000 UTC m=+1159.059521161" Feb 19 21:47:38 crc kubenswrapper[4771]: I0219 21:47:38.806209 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 21:47:38 crc kubenswrapper[4771]: I0219 21:47:38.809464 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.237627072 podStartE2EDuration="38.809452276s" podCreationTimestamp="2026-02-19 21:47:00 +0000 UTC" firstStartedPulling="2026-02-19 
21:47:20.435452166 +0000 UTC m=+1140.706894636" lastFinishedPulling="2026-02-19 21:47:33.00727736 +0000 UTC m=+1153.278719840" observedRunningTime="2026-02-19 21:47:38.806432596 +0000 UTC m=+1159.077875076" watchObservedRunningTime="2026-02-19 21:47:38.809452276 +0000 UTC m=+1159.080894756" Feb 19 21:47:38 crc kubenswrapper[4771]: I0219 21:47:38.824984 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.228164 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.230259 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.233622 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gngb9" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.233749 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.233946 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.241221 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.247817 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.390035 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc 
kubenswrapper[4771]: I0219 21:47:39.390078 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm77m\" (UniqueName: \"kubernetes.io/projected/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-kube-api-access-xm77m\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.390102 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-config\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.390143 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.390165 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-scripts\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.390198 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.390216 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.491993 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.492112 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-scripts\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.492180 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.492222 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.492323 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") 
" pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.492361 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm77m\" (UniqueName: \"kubernetes.io/projected/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-kube-api-access-xm77m\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.492396 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-config\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.492437 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.493141 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-scripts\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.493675 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-config\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.499660 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.500437 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.503643 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.518450 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm77m\" (UniqueName: \"kubernetes.io/projected/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-kube-api-access-xm77m\") pod \"ovn-northd-0\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") " pod="openstack/ovn-northd-0" Feb 19 21:47:39 crc kubenswrapper[4771]: I0219 21:47:39.626444 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:47:40 crc kubenswrapper[4771]: I0219 21:47:40.082770 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:47:40 crc kubenswrapper[4771]: I0219 21:47:40.787127 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa","Type":"ContainerStarted","Data":"4dd0f23e8016717430615b1264d0540e72cfdc1b1f77df4ef1637e8a7645fbed"} Feb 19 21:47:41 crc kubenswrapper[4771]: I0219 21:47:41.799413 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa","Type":"ContainerStarted","Data":"8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292"} Feb 19 21:47:41 crc kubenswrapper[4771]: I0219 21:47:41.800085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa","Type":"ContainerStarted","Data":"afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d"} Feb 19 21:47:41 crc kubenswrapper[4771]: I0219 21:47:41.800119 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 21:47:41 crc kubenswrapper[4771]: I0219 21:47:41.842785 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.620849335 podStartE2EDuration="2.842766534s" podCreationTimestamp="2026-02-19 21:47:39 +0000 UTC" firstStartedPulling="2026-02-19 21:47:40.089777408 +0000 UTC m=+1160.361219908" lastFinishedPulling="2026-02-19 21:47:41.311694607 +0000 UTC m=+1161.583137107" observedRunningTime="2026-02-19 21:47:41.831300661 +0000 UTC m=+1162.102743141" watchObservedRunningTime="2026-02-19 21:47:41.842766534 +0000 UTC m=+1162.114209014" Feb 19 21:47:41 crc kubenswrapper[4771]: I0219 21:47:41.893346 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/openstack-galera-0" Feb 19 21:47:41 crc kubenswrapper[4771]: I0219 21:47:41.893468 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 19 21:47:42 crc kubenswrapper[4771]: I0219 21:47:42.473428 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" Feb 19 21:47:42 crc kubenswrapper[4771]: I0219 21:47:42.715139 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" Feb 19 21:47:42 crc kubenswrapper[4771]: I0219 21:47:42.764373 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-h4fkt"] Feb 19 21:47:42 crc kubenswrapper[4771]: I0219 21:47:42.809227 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" podUID="73b6b19d-83db-492c-9c23-a435daaad2be" containerName="dnsmasq-dns" containerID="cri-o://d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c" gracePeriod=10 Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.278454 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.278806 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.321951 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.465922 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tlj\" (UniqueName: \"kubernetes.io/projected/73b6b19d-83db-492c-9c23-a435daaad2be-kube-api-access-d4tlj\") pod \"73b6b19d-83db-492c-9c23-a435daaad2be\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.466208 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-dns-svc\") pod \"73b6b19d-83db-492c-9c23-a435daaad2be\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.466263 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-ovsdbserver-nb\") pod \"73b6b19d-83db-492c-9c23-a435daaad2be\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.466362 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-config\") pod \"73b6b19d-83db-492c-9c23-a435daaad2be\" (UID: \"73b6b19d-83db-492c-9c23-a435daaad2be\") " Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.471717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b6b19d-83db-492c-9c23-a435daaad2be-kube-api-access-d4tlj" (OuterVolumeSpecName: "kube-api-access-d4tlj") pod "73b6b19d-83db-492c-9c23-a435daaad2be" (UID: "73b6b19d-83db-492c-9c23-a435daaad2be"). InnerVolumeSpecName "kube-api-access-d4tlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.510762 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73b6b19d-83db-492c-9c23-a435daaad2be" (UID: "73b6b19d-83db-492c-9c23-a435daaad2be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.532744 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-config" (OuterVolumeSpecName: "config") pod "73b6b19d-83db-492c-9c23-a435daaad2be" (UID: "73b6b19d-83db-492c-9c23-a435daaad2be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.533736 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73b6b19d-83db-492c-9c23-a435daaad2be" (UID: "73b6b19d-83db-492c-9c23-a435daaad2be"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.569091 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.569285 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.569348 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73b6b19d-83db-492c-9c23-a435daaad2be-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.569402 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4tlj\" (UniqueName: \"kubernetes.io/projected/73b6b19d-83db-492c-9c23-a435daaad2be-kube-api-access-d4tlj\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.828624 4771 generic.go:334] "Generic (PLEG): container finished" podID="73b6b19d-83db-492c-9c23-a435daaad2be" containerID="d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c" exitCode=0 Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.828674 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" event={"ID":"73b6b19d-83db-492c-9c23-a435daaad2be","Type":"ContainerDied","Data":"d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c"} Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.828696 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.828722 4771 scope.go:117] "RemoveContainer" containerID="d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.828707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f7f48db9-h4fkt" event={"ID":"73b6b19d-83db-492c-9c23-a435daaad2be","Type":"ContainerDied","Data":"cce919bbb18d10feca230b624fb2260b7bc05dcb4a8563e039bc607a4e54b410"} Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.857815 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-h4fkt"] Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.863648 4771 scope.go:117] "RemoveContainer" containerID="46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.867594 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-h4fkt"] Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.903656 4771 scope.go:117] "RemoveContainer" containerID="d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c" Feb 19 21:47:43 crc kubenswrapper[4771]: E0219 21:47:43.904253 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c\": container with ID starting with d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c not found: ID does not exist" containerID="d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.904288 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c"} err="failed to get container status 
\"d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c\": rpc error: code = NotFound desc = could not find container \"d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c\": container with ID starting with d1ebea68d62044a638ec83c4291f617aea82e4d0febbe7b7ce9ef26da690199c not found: ID does not exist" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.904309 4771 scope.go:117] "RemoveContainer" containerID="46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149" Feb 19 21:47:43 crc kubenswrapper[4771]: E0219 21:47:43.904821 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149\": container with ID starting with 46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149 not found: ID does not exist" containerID="46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149" Feb 19 21:47:43 crc kubenswrapper[4771]: I0219 21:47:43.904874 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149"} err="failed to get container status \"46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149\": rpc error: code = NotFound desc = could not find container \"46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149\": container with ID starting with 46fcf24d74503612da4e3e0a636360bff86db3e96aaa8a7c7b44fb188e9ba149 not found: ID does not exist" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.305078 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.449966 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b6b19d-83db-492c-9c23-a435daaad2be" path="/var/lib/kubelet/pods/73b6b19d-83db-492c-9c23-a435daaad2be/volumes" Feb 19 
21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.451400 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.798468 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8fb6-account-create-update-pqq7x"] Feb 19 21:47:44 crc kubenswrapper[4771]: E0219 21:47:44.798993 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b6b19d-83db-492c-9c23-a435daaad2be" containerName="dnsmasq-dns" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.799045 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b6b19d-83db-492c-9c23-a435daaad2be" containerName="dnsmasq-dns" Feb 19 21:47:44 crc kubenswrapper[4771]: E0219 21:47:44.799101 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b6b19d-83db-492c-9c23-a435daaad2be" containerName="init" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.799115 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b6b19d-83db-492c-9c23-a435daaad2be" containerName="init" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.799425 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b6b19d-83db-492c-9c23-a435daaad2be" containerName="dnsmasq-dns" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.800278 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8fb6-account-create-update-pqq7x" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.807910 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8fb6-account-create-update-pqq7x"] Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.811163 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.840770 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-pckmk"] Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.841720 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pckmk" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.870101 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pckmk"] Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.893412 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61c8262a-4240-4885-9d99-cdd8e0ed4c44-operator-scripts\") pod \"placement-8fb6-account-create-update-pqq7x\" (UID: \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\") " pod="openstack/placement-8fb6-account-create-update-pqq7x" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.893451 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdlh7\" (UniqueName: \"kubernetes.io/projected/61c8262a-4240-4885-9d99-cdd8e0ed4c44-kube-api-access-tdlh7\") pod \"placement-8fb6-account-create-update-pqq7x\" (UID: \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\") " pod="openstack/placement-8fb6-account-create-update-pqq7x" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.995594 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lqqj9\" (UniqueName: \"kubernetes.io/projected/bc00d259-80b7-4598-93f5-00f2ac227c87-kube-api-access-lqqj9\") pod \"placement-db-create-pckmk\" (UID: \"bc00d259-80b7-4598-93f5-00f2ac227c87\") " pod="openstack/placement-db-create-pckmk" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.995827 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61c8262a-4240-4885-9d99-cdd8e0ed4c44-operator-scripts\") pod \"placement-8fb6-account-create-update-pqq7x\" (UID: \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\") " pod="openstack/placement-8fb6-account-create-update-pqq7x" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.995884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdlh7\" (UniqueName: \"kubernetes.io/projected/61c8262a-4240-4885-9d99-cdd8e0ed4c44-kube-api-access-tdlh7\") pod \"placement-8fb6-account-create-update-pqq7x\" (UID: \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\") " pod="openstack/placement-8fb6-account-create-update-pqq7x" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.996034 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc00d259-80b7-4598-93f5-00f2ac227c87-operator-scripts\") pod \"placement-db-create-pckmk\" (UID: \"bc00d259-80b7-4598-93f5-00f2ac227c87\") " pod="openstack/placement-db-create-pckmk" Feb 19 21:47:44 crc kubenswrapper[4771]: I0219 21:47:44.997080 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61c8262a-4240-4885-9d99-cdd8e0ed4c44-operator-scripts\") pod \"placement-8fb6-account-create-update-pqq7x\" (UID: \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\") " pod="openstack/placement-8fb6-account-create-update-pqq7x" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.026562 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdlh7\" (UniqueName: \"kubernetes.io/projected/61c8262a-4240-4885-9d99-cdd8e0ed4c44-kube-api-access-tdlh7\") pod \"placement-8fb6-account-create-update-pqq7x\" (UID: \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\") " pod="openstack/placement-8fb6-account-create-update-pqq7x" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.097214 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc00d259-80b7-4598-93f5-00f2ac227c87-operator-scripts\") pod \"placement-db-create-pckmk\" (UID: \"bc00d259-80b7-4598-93f5-00f2ac227c87\") " pod="openstack/placement-db-create-pckmk" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.097577 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqj9\" (UniqueName: \"kubernetes.io/projected/bc00d259-80b7-4598-93f5-00f2ac227c87-kube-api-access-lqqj9\") pod \"placement-db-create-pckmk\" (UID: \"bc00d259-80b7-4598-93f5-00f2ac227c87\") " pod="openstack/placement-db-create-pckmk" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.098470 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc00d259-80b7-4598-93f5-00f2ac227c87-operator-scripts\") pod \"placement-db-create-pckmk\" (UID: \"bc00d259-80b7-4598-93f5-00f2ac227c87\") " pod="openstack/placement-db-create-pckmk" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.125091 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqj9\" (UniqueName: \"kubernetes.io/projected/bc00d259-80b7-4598-93f5-00f2ac227c87-kube-api-access-lqqj9\") pod \"placement-db-create-pckmk\" (UID: \"bc00d259-80b7-4598-93f5-00f2ac227c87\") " pod="openstack/placement-db-create-pckmk" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.164083 4771 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/placement-8fb6-account-create-update-pqq7x" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.171816 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pckmk" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.650204 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-blfh9"] Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.652084 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.652215 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.667319 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-blfh9"] Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.695241 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8fb6-account-create-update-pqq7x"] Feb 19 21:47:45 crc kubenswrapper[4771]: W0219 21:47:45.699697 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61c8262a_4240_4885_9d99_cdd8e0ed4c44.slice/crio-320353ebaaa03d35fab6cbf6dcdd8522f816edc87ff8aeb8b971f3c1486cb426 WatchSource:0}: Error finding container 320353ebaaa03d35fab6cbf6dcdd8522f816edc87ff8aeb8b971f3c1486cb426: Status 404 returned error can't find the container with id 320353ebaaa03d35fab6cbf6dcdd8522f816edc87ff8aeb8b971f3c1486cb426 Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.712419 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-config\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: 
\"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.712467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.712490 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zbdj\" (UniqueName: \"kubernetes.io/projected/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-kube-api-access-5zbdj\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.712517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-dns-svc\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.712536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.758291 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pckmk"] Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.823645 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-config\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.823918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.824432 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-config\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.824543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.823945 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zbdj\" (UniqueName: \"kubernetes.io/projected/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-kube-api-access-5zbdj\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.830207 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-dns-svc\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.830232 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.830846 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-dns-svc\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.831626 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.848884 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zbdj\" (UniqueName: \"kubernetes.io/projected/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-kube-api-access-5zbdj\") pod \"dnsmasq-dns-66b577f8c-blfh9\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.852557 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pckmk" 
event={"ID":"bc00d259-80b7-4598-93f5-00f2ac227c87","Type":"ContainerStarted","Data":"d9464638fc12b622f370bacecbf3c1d92a856fc23c28f2eef9cc873c70b3f4ed"} Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.853586 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8fb6-account-create-update-pqq7x" event={"ID":"61c8262a-4240-4885-9d99-cdd8e0ed4c44","Type":"ContainerStarted","Data":"320353ebaaa03d35fab6cbf6dcdd8522f816edc87ff8aeb8b971f3c1486cb426"} Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.905178 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.982157 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:45 crc kubenswrapper[4771]: I0219 21:47:45.990970 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.424841 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-blfh9"] Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.791950 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.798199 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.800528 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-cf724" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.801248 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.801269 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.801445 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.821898 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.863256 4771 generic.go:334] "Generic (PLEG): container finished" podID="bc00d259-80b7-4598-93f5-00f2ac227c87" containerID="14d0c01548938f03e42260d3bf1044d84ba51bfc1bc8242f357fa6c92a03e723" exitCode=0 Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.863304 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pckmk" event={"ID":"bc00d259-80b7-4598-93f5-00f2ac227c87","Type":"ContainerDied","Data":"14d0c01548938f03e42260d3bf1044d84ba51bfc1bc8242f357fa6c92a03e723"} Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.866708 4771 generic.go:334] "Generic (PLEG): container finished" podID="61c8262a-4240-4885-9d99-cdd8e0ed4c44" containerID="dbdb1ec746d8ae8c73b0b46b35a801715fcbe6f5e2ae5147a3b70af6f2352619" exitCode=0 Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.866767 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8fb6-account-create-update-pqq7x" 
event={"ID":"61c8262a-4240-4885-9d99-cdd8e0ed4c44","Type":"ContainerDied","Data":"dbdb1ec746d8ae8c73b0b46b35a801715fcbe6f5e2ae5147a3b70af6f2352619"} Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.868496 4771 generic.go:334] "Generic (PLEG): container finished" podID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" containerID="44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f" exitCode=0 Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.869527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" event={"ID":"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4","Type":"ContainerDied","Data":"44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f"} Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.869564 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" event={"ID":"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4","Type":"ContainerStarted","Data":"e75b986a9d31b7a35be02bb51214df630f334a7ea5ae70e8bff5436b97f5bc83"} Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.948299 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.948367 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2sh6\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-kube-api-access-s2sh6\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.948462 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-lock\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.948631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-cache\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.948695 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:46 crc kubenswrapper[4771]: I0219 21:47:46.948732 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.050463 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.050509 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " 
pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.050541 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.050569 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2sh6\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-kube-api-access-s2sh6\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.050597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-lock\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.050648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-cache\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.051028 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-cache\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: E0219 21:47:47.051509 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 21:47:47 crc kubenswrapper[4771]: E0219 
21:47:47.051523 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 21:47:47 crc kubenswrapper[4771]: E0219 21:47:47.051561 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift podName:5580e95c-81dc-4c90-bb0c-9b27a4a8c971 nodeName:}" failed. No retries permitted until 2026-02-19 21:47:47.551546353 +0000 UTC m=+1167.822988823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift") pod "swift-storage-0" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971") : configmap "swift-ring-files" not found Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.051802 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.052135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-lock\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.056408 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.079280 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.084921 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2sh6\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-kube-api-access-s2sh6\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.297516 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-j7wbb"] Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.302507 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j7wbb" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.326781 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j7wbb"] Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.331817 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.332196 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.335236 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.359696 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2vd\" (UniqueName: \"kubernetes.io/projected/afd70791-ecce-405a-ba45-968a1e967ef3-kube-api-access-mz2vd\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb" Feb 
19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.359739 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-scripts\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.359809 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-dispersionconf\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.359866 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-ring-data-devices\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.359894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afd70791-ecce-405a-ba45-968a1e967ef3-etc-swift\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.359911 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-combined-ca-bundle\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.359930 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-swiftconf\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.461559 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-ring-data-devices\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.462012 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afd70791-ecce-405a-ba45-968a1e967ef3-etc-swift\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.462074 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-combined-ca-bundle\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.462116 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-swiftconf\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.462190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2vd\" (UniqueName: \"kubernetes.io/projected/afd70791-ecce-405a-ba45-968a1e967ef3-kube-api-access-mz2vd\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.462230 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-scripts\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.462348 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-dispersionconf\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.464232 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-scripts\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.464469 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-ring-data-devices\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.464472 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afd70791-ecce-405a-ba45-968a1e967ef3-etc-swift\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.468230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-dispersionconf\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.468801 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-swiftconf\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.469502 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-combined-ca-bundle\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.485159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2vd\" (UniqueName: \"kubernetes.io/projected/afd70791-ecce-405a-ba45-968a1e967ef3-kube-api-access-mz2vd\") pod \"swift-ring-rebalance-j7wbb\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.563364 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0"
Feb 19 21:47:47 crc kubenswrapper[4771]: E0219 21:47:47.563585 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 21:47:47 crc kubenswrapper[4771]: E0219 21:47:47.563625 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 21:47:47 crc kubenswrapper[4771]: E0219 21:47:47.563722 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift podName:5580e95c-81dc-4c90-bb0c-9b27a4a8c971 nodeName:}" failed. No retries permitted until 2026-02-19 21:47:48.563688454 +0000 UTC m=+1168.835130954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift") pod "swift-storage-0" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971") : configmap "swift-ring-files" not found
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.660906 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j7wbb"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.885695 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" event={"ID":"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4","Type":"ContainerStarted","Data":"eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c"}
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.886155 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66b577f8c-blfh9"
Feb 19 21:47:47 crc kubenswrapper[4771]: I0219 21:47:47.913254 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" podStartSLOduration=2.913231778 podStartE2EDuration="2.913231778s" podCreationTimestamp="2026-02-19 21:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:47.911334107 +0000 UTC m=+1168.182776587" watchObservedRunningTime="2026-02-19 21:47:47.913231778 +0000 UTC m=+1168.184674268"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.142320 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j7wbb"]
Feb 19 21:47:48 crc kubenswrapper[4771]: W0219 21:47:48.153318 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd70791_ecce_405a_ba45_968a1e967ef3.slice/crio-2864b33bf9ba6c3199bc3cb27986dc5781155ded9d36bf01ef5af9dcbc1bdc20 WatchSource:0}: Error finding container 2864b33bf9ba6c3199bc3cb27986dc5781155ded9d36bf01ef5af9dcbc1bdc20: Status 404 returned error can't find the container with id 2864b33bf9ba6c3199bc3cb27986dc5781155ded9d36bf01ef5af9dcbc1bdc20
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.364613 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pckmk"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.370443 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8fb6-account-create-update-pqq7x"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.487525 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61c8262a-4240-4885-9d99-cdd8e0ed4c44-operator-scripts\") pod \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\" (UID: \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\") "
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.487632 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc00d259-80b7-4598-93f5-00f2ac227c87-operator-scripts\") pod \"bc00d259-80b7-4598-93f5-00f2ac227c87\" (UID: \"bc00d259-80b7-4598-93f5-00f2ac227c87\") "
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.487814 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqqj9\" (UniqueName: \"kubernetes.io/projected/bc00d259-80b7-4598-93f5-00f2ac227c87-kube-api-access-lqqj9\") pod \"bc00d259-80b7-4598-93f5-00f2ac227c87\" (UID: \"bc00d259-80b7-4598-93f5-00f2ac227c87\") "
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.487849 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdlh7\" (UniqueName: \"kubernetes.io/projected/61c8262a-4240-4885-9d99-cdd8e0ed4c44-kube-api-access-tdlh7\") pod \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\" (UID: \"61c8262a-4240-4885-9d99-cdd8e0ed4c44\") "
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.488484 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61c8262a-4240-4885-9d99-cdd8e0ed4c44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61c8262a-4240-4885-9d99-cdd8e0ed4c44" (UID: "61c8262a-4240-4885-9d99-cdd8e0ed4c44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.488580 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc00d259-80b7-4598-93f5-00f2ac227c87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc00d259-80b7-4598-93f5-00f2ac227c87" (UID: "bc00d259-80b7-4598-93f5-00f2ac227c87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.494207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc00d259-80b7-4598-93f5-00f2ac227c87-kube-api-access-lqqj9" (OuterVolumeSpecName: "kube-api-access-lqqj9") pod "bc00d259-80b7-4598-93f5-00f2ac227c87" (UID: "bc00d259-80b7-4598-93f5-00f2ac227c87"). InnerVolumeSpecName "kube-api-access-lqqj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.494492 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61c8262a-4240-4885-9d99-cdd8e0ed4c44-kube-api-access-tdlh7" (OuterVolumeSpecName: "kube-api-access-tdlh7") pod "61c8262a-4240-4885-9d99-cdd8e0ed4c44" (UID: "61c8262a-4240-4885-9d99-cdd8e0ed4c44"). InnerVolumeSpecName "kube-api-access-tdlh7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.589817 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0"
Feb 19 21:47:48 crc kubenswrapper[4771]: E0219 21:47:48.590140 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 21:47:48 crc kubenswrapper[4771]: E0219 21:47:48.590178 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.590202 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqqj9\" (UniqueName: \"kubernetes.io/projected/bc00d259-80b7-4598-93f5-00f2ac227c87-kube-api-access-lqqj9\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.590239 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdlh7\" (UniqueName: \"kubernetes.io/projected/61c8262a-4240-4885-9d99-cdd8e0ed4c44-kube-api-access-tdlh7\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:48 crc kubenswrapper[4771]: E0219 21:47:48.590279 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift podName:5580e95c-81dc-4c90-bb0c-9b27a4a8c971 nodeName:}" failed. No retries permitted until 2026-02-19 21:47:50.590248167 +0000 UTC m=+1170.861690677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift") pod "swift-storage-0" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971") : configmap "swift-ring-files" not found
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.590320 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61c8262a-4240-4885-9d99-cdd8e0ed4c44-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.590350 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc00d259-80b7-4598-93f5-00f2ac227c87-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.816923 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lbm9h"]
Feb 19 21:47:48 crc kubenswrapper[4771]: E0219 21:47:48.817369 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c8262a-4240-4885-9d99-cdd8e0ed4c44" containerName="mariadb-account-create-update"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.817393 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c8262a-4240-4885-9d99-cdd8e0ed4c44" containerName="mariadb-account-create-update"
Feb 19 21:47:48 crc kubenswrapper[4771]: E0219 21:47:48.817453 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc00d259-80b7-4598-93f5-00f2ac227c87" containerName="mariadb-database-create"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.817463 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc00d259-80b7-4598-93f5-00f2ac227c87" containerName="mariadb-database-create"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.817661 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="61c8262a-4240-4885-9d99-cdd8e0ed4c44" containerName="mariadb-account-create-update"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.817680 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc00d259-80b7-4598-93f5-00f2ac227c87" containerName="mariadb-database-create"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.818408 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.827016 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lbm9h"]
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.901754 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxk2\" (UniqueName: \"kubernetes.io/projected/af945941-8fd6-4601-a954-899b8fb66625-kube-api-access-zcxk2\") pod \"glance-db-create-lbm9h\" (UID: \"af945941-8fd6-4601-a954-899b8fb66625\") " pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.901816 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af945941-8fd6-4601-a954-899b8fb66625-operator-scripts\") pod \"glance-db-create-lbm9h\" (UID: \"af945941-8fd6-4601-a954-899b8fb66625\") " pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.906526 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j7wbb" event={"ID":"afd70791-ecce-405a-ba45-968a1e967ef3","Type":"ContainerStarted","Data":"2864b33bf9ba6c3199bc3cb27986dc5781155ded9d36bf01ef5af9dcbc1bdc20"}
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.911192 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pckmk" event={"ID":"bc00d259-80b7-4598-93f5-00f2ac227c87","Type":"ContainerDied","Data":"d9464638fc12b622f370bacecbf3c1d92a856fc23c28f2eef9cc873c70b3f4ed"}
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.911225 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9464638fc12b622f370bacecbf3c1d92a856fc23c28f2eef9cc873c70b3f4ed"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.911289 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pckmk"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.913978 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8fb6-account-create-update-pqq7x"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.913978 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8fb6-account-create-update-pqq7x" event={"ID":"61c8262a-4240-4885-9d99-cdd8e0ed4c44","Type":"ContainerDied","Data":"320353ebaaa03d35fab6cbf6dcdd8522f816edc87ff8aeb8b971f3c1486cb426"}
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.914037 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="320353ebaaa03d35fab6cbf6dcdd8522f816edc87ff8aeb8b971f3c1486cb426"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.914984 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3163-account-create-update-qmkbx"]
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.916220 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3163-account-create-update-qmkbx"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.919669 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 19 21:47:48 crc kubenswrapper[4771]: I0219 21:47:48.925625 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3163-account-create-update-qmkbx"]
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.003550 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7135669-613c-45c3-90fa-f8eee17faa38-operator-scripts\") pod \"glance-3163-account-create-update-qmkbx\" (UID: \"d7135669-613c-45c3-90fa-f8eee17faa38\") " pod="openstack/glance-3163-account-create-update-qmkbx"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.003676 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxk2\" (UniqueName: \"kubernetes.io/projected/af945941-8fd6-4601-a954-899b8fb66625-kube-api-access-zcxk2\") pod \"glance-db-create-lbm9h\" (UID: \"af945941-8fd6-4601-a954-899b8fb66625\") " pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.003723 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af945941-8fd6-4601-a954-899b8fb66625-operator-scripts\") pod \"glance-db-create-lbm9h\" (UID: \"af945941-8fd6-4601-a954-899b8fb66625\") " pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.003882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plmmr\" (UniqueName: \"kubernetes.io/projected/d7135669-613c-45c3-90fa-f8eee17faa38-kube-api-access-plmmr\") pod \"glance-3163-account-create-update-qmkbx\" (UID: \"d7135669-613c-45c3-90fa-f8eee17faa38\") " pod="openstack/glance-3163-account-create-update-qmkbx"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.004712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af945941-8fd6-4601-a954-899b8fb66625-operator-scripts\") pod \"glance-db-create-lbm9h\" (UID: \"af945941-8fd6-4601-a954-899b8fb66625\") " pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:49 crc kubenswrapper[4771]: E0219 21:47:49.012428 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc00d259_80b7_4598_93f5_00f2ac227c87.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc00d259_80b7_4598_93f5_00f2ac227c87.slice/crio-d9464638fc12b622f370bacecbf3c1d92a856fc23c28f2eef9cc873c70b3f4ed\": RecentStats: unable to find data in memory cache]"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.020190 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxk2\" (UniqueName: \"kubernetes.io/projected/af945941-8fd6-4601-a954-899b8fb66625-kube-api-access-zcxk2\") pod \"glance-db-create-lbm9h\" (UID: \"af945941-8fd6-4601-a954-899b8fb66625\") " pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.104794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plmmr\" (UniqueName: \"kubernetes.io/projected/d7135669-613c-45c3-90fa-f8eee17faa38-kube-api-access-plmmr\") pod \"glance-3163-account-create-update-qmkbx\" (UID: \"d7135669-613c-45c3-90fa-f8eee17faa38\") " pod="openstack/glance-3163-account-create-update-qmkbx"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.105190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7135669-613c-45c3-90fa-f8eee17faa38-operator-scripts\") pod \"glance-3163-account-create-update-qmkbx\" (UID: \"d7135669-613c-45c3-90fa-f8eee17faa38\") " pod="openstack/glance-3163-account-create-update-qmkbx"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.105916 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7135669-613c-45c3-90fa-f8eee17faa38-operator-scripts\") pod \"glance-3163-account-create-update-qmkbx\" (UID: \"d7135669-613c-45c3-90fa-f8eee17faa38\") " pod="openstack/glance-3163-account-create-update-qmkbx"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.120819 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plmmr\" (UniqueName: \"kubernetes.io/projected/d7135669-613c-45c3-90fa-f8eee17faa38-kube-api-access-plmmr\") pod \"glance-3163-account-create-update-qmkbx\" (UID: \"d7135669-613c-45c3-90fa-f8eee17faa38\") " pod="openstack/glance-3163-account-create-update-qmkbx"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.158824 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.249336 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3163-account-create-update-qmkbx"
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.612955 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lbm9h"]
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.849906 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3163-account-create-update-qmkbx"]
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.934556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lbm9h" event={"ID":"af945941-8fd6-4601-a954-899b8fb66625","Type":"ContainerStarted","Data":"4e14f018cde901d4da996f236082091e2c86ad393c05bc554a9a701ee38321a3"}
Feb 19 21:47:49 crc kubenswrapper[4771]: I0219 21:47:49.934605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lbm9h" event={"ID":"af945941-8fd6-4601-a954-899b8fb66625","Type":"ContainerStarted","Data":"1950c1648a3e5db552e8311a0f217d768fdb94e7097bcec7052162e72aa7cb3a"}
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.515069 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xz5sl"]
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.516497 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xz5sl"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.520145 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.529081 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xz5sl"]
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.644992 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft29d\" (UniqueName: \"kubernetes.io/projected/a7989728-10af-4f46-9b05-946ab44d7145-kube-api-access-ft29d\") pod \"root-account-create-update-xz5sl\" (UID: \"a7989728-10af-4f46-9b05-946ab44d7145\") " pod="openstack/root-account-create-update-xz5sl"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.645294 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7989728-10af-4f46-9b05-946ab44d7145-operator-scripts\") pod \"root-account-create-update-xz5sl\" (UID: \"a7989728-10af-4f46-9b05-946ab44d7145\") " pod="openstack/root-account-create-update-xz5sl"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.645437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0"
Feb 19 21:47:50 crc kubenswrapper[4771]: E0219 21:47:50.645589 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 21:47:50 crc kubenswrapper[4771]: E0219 21:47:50.645625 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 21:47:50 crc kubenswrapper[4771]: E0219 21:47:50.645683 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift podName:5580e95c-81dc-4c90-bb0c-9b27a4a8c971 nodeName:}" failed. No retries permitted until 2026-02-19 21:47:54.645664953 +0000 UTC m=+1174.917107423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift") pod "swift-storage-0" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971") : configmap "swift-ring-files" not found
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.747134 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft29d\" (UniqueName: \"kubernetes.io/projected/a7989728-10af-4f46-9b05-946ab44d7145-kube-api-access-ft29d\") pod \"root-account-create-update-xz5sl\" (UID: \"a7989728-10af-4f46-9b05-946ab44d7145\") " pod="openstack/root-account-create-update-xz5sl"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.747207 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7989728-10af-4f46-9b05-946ab44d7145-operator-scripts\") pod \"root-account-create-update-xz5sl\" (UID: \"a7989728-10af-4f46-9b05-946ab44d7145\") " pod="openstack/root-account-create-update-xz5sl"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.748557 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7989728-10af-4f46-9b05-946ab44d7145-operator-scripts\") pod \"root-account-create-update-xz5sl\" (UID: \"a7989728-10af-4f46-9b05-946ab44d7145\") " pod="openstack/root-account-create-update-xz5sl"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.787536 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft29d\" (UniqueName: \"kubernetes.io/projected/a7989728-10af-4f46-9b05-946ab44d7145-kube-api-access-ft29d\") pod \"root-account-create-update-xz5sl\" (UID: \"a7989728-10af-4f46-9b05-946ab44d7145\") " pod="openstack/root-account-create-update-xz5sl"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.844330 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xz5sl"
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.945433 4771 generic.go:334] "Generic (PLEG): container finished" podID="af945941-8fd6-4601-a954-899b8fb66625" containerID="4e14f018cde901d4da996f236082091e2c86ad393c05bc554a9a701ee38321a3" exitCode=0
Feb 19 21:47:50 crc kubenswrapper[4771]: I0219 21:47:50.945477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lbm9h" event={"ID":"af945941-8fd6-4601-a954-899b8fb66625","Type":"ContainerDied","Data":"4e14f018cde901d4da996f236082091e2c86ad393c05bc554a9a701ee38321a3"}
Feb 19 21:47:51 crc kubenswrapper[4771]: W0219 21:47:51.963838 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7135669_613c_45c3_90fa_f8eee17faa38.slice/crio-c6b3aff0f798c0eabf5512a603ee00b1efd68fa13156e09aee18727d36edbeeb WatchSource:0}: Error finding container c6b3aff0f798c0eabf5512a603ee00b1efd68fa13156e09aee18727d36edbeeb: Status 404 returned error can't find the container with id c6b3aff0f798c0eabf5512a603ee00b1efd68fa13156e09aee18727d36edbeeb
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.169569 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.271891 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af945941-8fd6-4601-a954-899b8fb66625-operator-scripts\") pod \"af945941-8fd6-4601-a954-899b8fb66625\" (UID: \"af945941-8fd6-4601-a954-899b8fb66625\") "
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.272240 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcxk2\" (UniqueName: \"kubernetes.io/projected/af945941-8fd6-4601-a954-899b8fb66625-kube-api-access-zcxk2\") pod \"af945941-8fd6-4601-a954-899b8fb66625\" (UID: \"af945941-8fd6-4601-a954-899b8fb66625\") "
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.272569 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af945941-8fd6-4601-a954-899b8fb66625-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af945941-8fd6-4601-a954-899b8fb66625" (UID: "af945941-8fd6-4601-a954-899b8fb66625"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.272771 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af945941-8fd6-4601-a954-899b8fb66625-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.284192 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af945941-8fd6-4601-a954-899b8fb66625-kube-api-access-zcxk2" (OuterVolumeSpecName: "kube-api-access-zcxk2") pod "af945941-8fd6-4601-a954-899b8fb66625" (UID: "af945941-8fd6-4601-a954-899b8fb66625"). InnerVolumeSpecName "kube-api-access-zcxk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.374750 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcxk2\" (UniqueName: \"kubernetes.io/projected/af945941-8fd6-4601-a954-899b8fb66625-kube-api-access-zcxk2\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.495205 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xz5sl"]
Feb 19 21:47:52 crc kubenswrapper[4771]: W0219 21:47:52.495750 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7989728_10af_4f46_9b05_946ab44d7145.slice/crio-e9e78a5cbaa79e6b2966beb196e7b36d9fd38d461de6f23457f81e7dfb77d64a WatchSource:0}: Error finding container e9e78a5cbaa79e6b2966beb196e7b36d9fd38d461de6f23457f81e7dfb77d64a: Status 404 returned error can't find the container with id e9e78a5cbaa79e6b2966beb196e7b36d9fd38d461de6f23457f81e7dfb77d64a
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.965514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lbm9h" event={"ID":"af945941-8fd6-4601-a954-899b8fb66625","Type":"ContainerDied","Data":"1950c1648a3e5db552e8311a0f217d768fdb94e7097bcec7052162e72aa7cb3a"}
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.966665 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1950c1648a3e5db552e8311a0f217d768fdb94e7097bcec7052162e72aa7cb3a"
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.965570 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lbm9h"
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.968377 4771 generic.go:334] "Generic (PLEG): container finished" podID="d7135669-613c-45c3-90fa-f8eee17faa38" containerID="4473d10264a5ab1810bbb85934a6cba54c8cf4d08a9c56acd1cb434e928da82e" exitCode=0
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.968467 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3163-account-create-update-qmkbx" event={"ID":"d7135669-613c-45c3-90fa-f8eee17faa38","Type":"ContainerDied","Data":"4473d10264a5ab1810bbb85934a6cba54c8cf4d08a9c56acd1cb434e928da82e"}
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.968499 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3163-account-create-update-qmkbx" event={"ID":"d7135669-613c-45c3-90fa-f8eee17faa38","Type":"ContainerStarted","Data":"c6b3aff0f798c0eabf5512a603ee00b1efd68fa13156e09aee18727d36edbeeb"}
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.971408 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j7wbb" event={"ID":"afd70791-ecce-405a-ba45-968a1e967ef3","Type":"ContainerStarted","Data":"ad68261083ec7a599ecf4fd0bb7914ee14b7d216c3177a543f672928b2ec7a8a"}
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.981319 4771 generic.go:334] "Generic (PLEG): container finished" podID="a7989728-10af-4f46-9b05-946ab44d7145" containerID="179eaa3bb6001bb07dcf706e87adb665dfcbf4bf3d1b7de103517856791e3149" exitCode=0
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.981351 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xz5sl" event={"ID":"a7989728-10af-4f46-9b05-946ab44d7145","Type":"ContainerDied","Data":"179eaa3bb6001bb07dcf706e87adb665dfcbf4bf3d1b7de103517856791e3149"}
Feb 19 21:47:52 crc kubenswrapper[4771]: I0219 21:47:52.981384 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/root-account-create-update-xz5sl" event={"ID":"a7989728-10af-4f46-9b05-946ab44d7145","Type":"ContainerStarted","Data":"e9e78a5cbaa79e6b2966beb196e7b36d9fd38d461de6f23457f81e7dfb77d64a"} Feb 19 21:47:53 crc kubenswrapper[4771]: I0219 21:47:53.036715 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-j7wbb" podStartSLOduration=2.163507874 podStartE2EDuration="6.036694722s" podCreationTimestamp="2026-02-19 21:47:47 +0000 UTC" firstStartedPulling="2026-02-19 21:47:48.155553787 +0000 UTC m=+1168.426996257" lastFinishedPulling="2026-02-19 21:47:52.028740625 +0000 UTC m=+1172.300183105" observedRunningTime="2026-02-19 21:47:53.030664161 +0000 UTC m=+1173.302106641" watchObservedRunningTime="2026-02-19 21:47:53.036694722 +0000 UTC m=+1173.308137202" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.532490 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7jgvs"] Feb 19 21:47:54 crc kubenswrapper[4771]: E0219 21:47:54.536153 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af945941-8fd6-4601-a954-899b8fb66625" containerName="mariadb-database-create" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.536500 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="af945941-8fd6-4601-a954-899b8fb66625" containerName="mariadb-database-create" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.537104 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="af945941-8fd6-4601-a954-899b8fb66625" containerName="mariadb-database-create" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.538491 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.543889 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7jgvs"] Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.549616 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xz5sl" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.558085 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3163-account-create-update-qmkbx" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.632101 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9a3b-account-create-update-w4lvm"] Feb 19 21:47:54 crc kubenswrapper[4771]: E0219 21:47:54.632433 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7135669-613c-45c3-90fa-f8eee17faa38" containerName="mariadb-account-create-update" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.632445 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7135669-613c-45c3-90fa-f8eee17faa38" containerName="mariadb-account-create-update" Feb 19 21:47:54 crc kubenswrapper[4771]: E0219 21:47:54.632454 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7989728-10af-4f46-9b05-946ab44d7145" containerName="mariadb-account-create-update" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.632461 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7989728-10af-4f46-9b05-946ab44d7145" containerName="mariadb-account-create-update" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.632643 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7135669-613c-45c3-90fa-f8eee17faa38" containerName="mariadb-account-create-update" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.632663 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a7989728-10af-4f46-9b05-946ab44d7145" containerName="mariadb-account-create-update" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.633218 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.635952 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.639926 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plmmr\" (UniqueName: \"kubernetes.io/projected/d7135669-613c-45c3-90fa-f8eee17faa38-kube-api-access-plmmr\") pod \"d7135669-613c-45c3-90fa-f8eee17faa38\" (UID: \"d7135669-613c-45c3-90fa-f8eee17faa38\") " Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.639984 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7135669-613c-45c3-90fa-f8eee17faa38-operator-scripts\") pod \"d7135669-613c-45c3-90fa-f8eee17faa38\" (UID: \"d7135669-613c-45c3-90fa-f8eee17faa38\") " Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.640050 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft29d\" (UniqueName: \"kubernetes.io/projected/a7989728-10af-4f46-9b05-946ab44d7145-kube-api-access-ft29d\") pod \"a7989728-10af-4f46-9b05-946ab44d7145\" (UID: \"a7989728-10af-4f46-9b05-946ab44d7145\") " Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.640072 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7989728-10af-4f46-9b05-946ab44d7145-operator-scripts\") pod \"a7989728-10af-4f46-9b05-946ab44d7145\" (UID: \"a7989728-10af-4f46-9b05-946ab44d7145\") " Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.640341 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf8wh\" (UniqueName: \"kubernetes.io/projected/5fea7660-6ec1-4456-b730-44b1e3ad5941-kube-api-access-zf8wh\") pod \"keystone-db-create-7jgvs\" (UID: \"5fea7660-6ec1-4456-b730-44b1e3ad5941\") " pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.640561 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fea7660-6ec1-4456-b730-44b1e3ad5941-operator-scripts\") pod \"keystone-db-create-7jgvs\" (UID: \"5fea7660-6ec1-4456-b730-44b1e3ad5941\") " pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.640578 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9a3b-account-create-update-w4lvm"] Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.641508 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7135669-613c-45c3-90fa-f8eee17faa38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7135669-613c-45c3-90fa-f8eee17faa38" (UID: "d7135669-613c-45c3-90fa-f8eee17faa38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.641591 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7989728-10af-4f46-9b05-946ab44d7145-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7989728-10af-4f46-9b05-946ab44d7145" (UID: "a7989728-10af-4f46-9b05-946ab44d7145"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.646292 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7989728-10af-4f46-9b05-946ab44d7145-kube-api-access-ft29d" (OuterVolumeSpecName: "kube-api-access-ft29d") pod "a7989728-10af-4f46-9b05-946ab44d7145" (UID: "a7989728-10af-4f46-9b05-946ab44d7145"). InnerVolumeSpecName "kube-api-access-ft29d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.650084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7135669-613c-45c3-90fa-f8eee17faa38-kube-api-access-plmmr" (OuterVolumeSpecName: "kube-api-access-plmmr") pod "d7135669-613c-45c3-90fa-f8eee17faa38" (UID: "d7135669-613c-45c3-90fa-f8eee17faa38"). InnerVolumeSpecName "kube-api-access-plmmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.742040 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-operator-scripts\") pod \"keystone-9a3b-account-create-update-w4lvm\" (UID: \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\") " pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.742346 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf8wh\" (UniqueName: \"kubernetes.io/projected/5fea7660-6ec1-4456-b730-44b1e3ad5941-kube-api-access-zf8wh\") pod \"keystone-db-create-7jgvs\" (UID: \"5fea7660-6ec1-4456-b730-44b1e3ad5941\") " pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.742491 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.742651 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fea7660-6ec1-4456-b730-44b1e3ad5941-operator-scripts\") pod \"keystone-db-create-7jgvs\" (UID: \"5fea7660-6ec1-4456-b730-44b1e3ad5941\") " pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.742786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5b4k\" (UniqueName: \"kubernetes.io/projected/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-kube-api-access-p5b4k\") pod \"keystone-9a3b-account-create-update-w4lvm\" (UID: \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\") " pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:54 crc kubenswrapper[4771]: E0219 21:47:54.742875 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 21:47:54 crc kubenswrapper[4771]: E0219 21:47:54.742913 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 21:47:54 crc kubenswrapper[4771]: E0219 21:47:54.742979 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift podName:5580e95c-81dc-4c90-bb0c-9b27a4a8c971 nodeName:}" failed. No retries permitted until 2026-02-19 21:48:02.742958485 +0000 UTC m=+1183.014400975 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift") pod "swift-storage-0" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971") : configmap "swift-ring-files" not found Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.743147 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7135669-613c-45c3-90fa-f8eee17faa38-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.743249 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft29d\" (UniqueName: \"kubernetes.io/projected/a7989728-10af-4f46-9b05-946ab44d7145-kube-api-access-ft29d\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.743346 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7989728-10af-4f46-9b05-946ab44d7145-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.743427 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plmmr\" (UniqueName: \"kubernetes.io/projected/d7135669-613c-45c3-90fa-f8eee17faa38-kube-api-access-plmmr\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.744656 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fea7660-6ec1-4456-b730-44b1e3ad5941-operator-scripts\") pod \"keystone-db-create-7jgvs\" (UID: \"5fea7660-6ec1-4456-b730-44b1e3ad5941\") " pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.762566 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf8wh\" (UniqueName: \"kubernetes.io/projected/5fea7660-6ec1-4456-b730-44b1e3ad5941-kube-api-access-zf8wh\") 
pod \"keystone-db-create-7jgvs\" (UID: \"5fea7660-6ec1-4456-b730-44b1e3ad5941\") " pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.844534 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5b4k\" (UniqueName: \"kubernetes.io/projected/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-kube-api-access-p5b4k\") pod \"keystone-9a3b-account-create-update-w4lvm\" (UID: \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\") " pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.844619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-operator-scripts\") pod \"keystone-9a3b-account-create-update-w4lvm\" (UID: \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\") " pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.845321 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-operator-scripts\") pod \"keystone-9a3b-account-create-update-w4lvm\" (UID: \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\") " pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.871765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5b4k\" (UniqueName: \"kubernetes.io/projected/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-kube-api-access-p5b4k\") pod \"keystone-9a3b-account-create-update-w4lvm\" (UID: \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\") " pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:54 crc kubenswrapper[4771]: I0219 21:47:54.878768 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.010176 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.016110 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3163-account-create-update-qmkbx" event={"ID":"d7135669-613c-45c3-90fa-f8eee17faa38","Type":"ContainerDied","Data":"c6b3aff0f798c0eabf5512a603ee00b1efd68fa13156e09aee18727d36edbeeb"} Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.016150 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6b3aff0f798c0eabf5512a603ee00b1efd68fa13156e09aee18727d36edbeeb" Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.016250 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3163-account-create-update-qmkbx" Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.027896 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xz5sl" event={"ID":"a7989728-10af-4f46-9b05-946ab44d7145","Type":"ContainerDied","Data":"e9e78a5cbaa79e6b2966beb196e7b36d9fd38d461de6f23457f81e7dfb77d64a"} Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.027940 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e78a5cbaa79e6b2966beb196e7b36d9fd38d461de6f23457f81e7dfb77d64a" Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.027994 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xz5sl" Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.194857 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7jgvs"] Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.327090 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9a3b-account-create-update-w4lvm"] Feb 19 21:47:55 crc kubenswrapper[4771]: I0219 21:47:55.984265 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.040661 4771 generic.go:334] "Generic (PLEG): container finished" podID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerID="dcc43673db211834491a0c5c33660484520c0820b9253defee749bead7148c83" exitCode=0 Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.040792 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17cd62c3-f3af-4144-b6b1-ec2cafb424ad","Type":"ContainerDied","Data":"dcc43673db211834491a0c5c33660484520c0820b9253defee749bead7148c83"} Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.043604 4771 generic.go:334] "Generic (PLEG): container finished" podID="40643c92-54ba-4c6b-bad6-c8d0f7f8529b" containerID="5c65d1c446aced68f3ad290c01c85b2c1b2009f812ff2b51fc255014187d7cd9" exitCode=0 Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.043711 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a3b-account-create-update-w4lvm" event={"ID":"40643c92-54ba-4c6b-bad6-c8d0f7f8529b","Type":"ContainerDied","Data":"5c65d1c446aced68f3ad290c01c85b2c1b2009f812ff2b51fc255014187d7cd9"} Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.043756 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a3b-account-create-update-w4lvm" 
event={"ID":"40643c92-54ba-4c6b-bad6-c8d0f7f8529b","Type":"ContainerStarted","Data":"1efbdc1b45e83487dd3d6770fd8eaf1c43e10e6d770193799afeb5b034426bc4"} Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.046380 4771 generic.go:334] "Generic (PLEG): container finished" podID="5fea7660-6ec1-4456-b730-44b1e3ad5941" containerID="3c74c9c0e0a6ff63a98ee8134cab58ee1ebb426589db225fc216652b578d5a5e" exitCode=0 Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.046452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7jgvs" event={"ID":"5fea7660-6ec1-4456-b730-44b1e3ad5941","Type":"ContainerDied","Data":"3c74c9c0e0a6ff63a98ee8134cab58ee1ebb426589db225fc216652b578d5a5e"} Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.046490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7jgvs" event={"ID":"5fea7660-6ec1-4456-b730-44b1e3ad5941","Type":"ContainerStarted","Data":"0ad374ab44b8c96b67c530690aaea00b5a15356b9916d4c200a38da5100e1cd3"} Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.072875 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-nzwmx"] Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.073389 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" podUID="f908c73c-f1fb-4e81-9441-172192216b2d" containerName="dnsmasq-dns" containerID="cri-o://7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686" gracePeriod=10 Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.539716 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.681161 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-config\") pod \"f908c73c-f1fb-4e81-9441-172192216b2d\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.681222 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-nb\") pod \"f908c73c-f1fb-4e81-9441-172192216b2d\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.681283 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-dns-svc\") pod \"f908c73c-f1fb-4e81-9441-172192216b2d\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.681310 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7c5h\" (UniqueName: \"kubernetes.io/projected/f908c73c-f1fb-4e81-9441-172192216b2d-kube-api-access-n7c5h\") pod \"f908c73c-f1fb-4e81-9441-172192216b2d\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.681388 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-sb\") pod \"f908c73c-f1fb-4e81-9441-172192216b2d\" (UID: \"f908c73c-f1fb-4e81-9441-172192216b2d\") " Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.687231 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f908c73c-f1fb-4e81-9441-172192216b2d-kube-api-access-n7c5h" (OuterVolumeSpecName: "kube-api-access-n7c5h") pod "f908c73c-f1fb-4e81-9441-172192216b2d" (UID: "f908c73c-f1fb-4e81-9441-172192216b2d"). InnerVolumeSpecName "kube-api-access-n7c5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.729885 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f908c73c-f1fb-4e81-9441-172192216b2d" (UID: "f908c73c-f1fb-4e81-9441-172192216b2d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.736580 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f908c73c-f1fb-4e81-9441-172192216b2d" (UID: "f908c73c-f1fb-4e81-9441-172192216b2d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.737437 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-config" (OuterVolumeSpecName: "config") pod "f908c73c-f1fb-4e81-9441-172192216b2d" (UID: "f908c73c-f1fb-4e81-9441-172192216b2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.738958 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f908c73c-f1fb-4e81-9441-172192216b2d" (UID: "f908c73c-f1fb-4e81-9441-172192216b2d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.782916 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.782945 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.782953 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.782962 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f908c73c-f1fb-4e81-9441-172192216b2d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.782971 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7c5h\" (UniqueName: \"kubernetes.io/projected/f908c73c-f1fb-4e81-9441-172192216b2d-kube-api-access-n7c5h\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.855305 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xz5sl"] Feb 19 21:47:56 crc kubenswrapper[4771]: I0219 21:47:56.860499 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xz5sl"] Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.055848 4771 generic.go:334] "Generic (PLEG): container finished" podID="f908c73c-f1fb-4e81-9441-172192216b2d" containerID="7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686" exitCode=0 Feb 19 21:47:57 crc 
kubenswrapper[4771]: I0219 21:47:57.055911 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.055919 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" event={"ID":"f908c73c-f1fb-4e81-9441-172192216b2d","Type":"ContainerDied","Data":"7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686"} Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.055965 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-nzwmx" event={"ID":"f908c73c-f1fb-4e81-9441-172192216b2d","Type":"ContainerDied","Data":"9455936ca07baee2d9224a2bb198b9a00060c562b1d758e76d9aa68f4eebd483"} Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.055985 4771 scope.go:117] "RemoveContainer" containerID="7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.058330 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17cd62c3-f3af-4144-b6b1-ec2cafb424ad","Type":"ContainerStarted","Data":"9649db40b00ac2459bb1317904d312d05a0f8a16cff8030457aea979837d6541"} Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.059106 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.085082 4771 scope.go:117] "RemoveContainer" containerID="4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.106086 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.585924187 podStartE2EDuration="58.106065461s" podCreationTimestamp="2026-02-19 21:46:59 +0000 UTC" firstStartedPulling="2026-02-19 21:47:01.12269669 +0000 UTC 
m=+1121.394139160" lastFinishedPulling="2026-02-19 21:47:19.642837944 +0000 UTC m=+1139.914280434" observedRunningTime="2026-02-19 21:47:57.1030356 +0000 UTC m=+1177.374478090" watchObservedRunningTime="2026-02-19 21:47:57.106065461 +0000 UTC m=+1177.377507931" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.123877 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-nzwmx"] Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.130958 4771 scope.go:117] "RemoveContainer" containerID="7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686" Feb 19 21:47:57 crc kubenswrapper[4771]: E0219 21:47:57.131339 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686\": container with ID starting with 7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686 not found: ID does not exist" containerID="7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.131367 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686"} err="failed to get container status \"7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686\": rpc error: code = NotFound desc = could not find container \"7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686\": container with ID starting with 7420d3be3a653c2103528d0b89ad1318a8aefa4d1b6d376180bcf9ba48101686 not found: ID does not exist" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.131387 4771 scope.go:117] "RemoveContainer" containerID="4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c" Feb 19 21:47:57 crc kubenswrapper[4771]: E0219 21:47:57.131545 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c\": container with ID starting with 4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c not found: ID does not exist" containerID="4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.131562 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c"} err="failed to get container status \"4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c\": rpc error: code = NotFound desc = could not find container \"4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c\": container with ID starting with 4eacda3b43d44a8d92c62453f48435bd83455784890ea37a52e82daf1fcfb33c not found: ID does not exist" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.135740 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-nzwmx"] Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.382790 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.495555 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.497128 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-operator-scripts\") pod \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\" (UID: \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\") " Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.497534 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5b4k\" (UniqueName: \"kubernetes.io/projected/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-kube-api-access-p5b4k\") pod \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\" (UID: \"40643c92-54ba-4c6b-bad6-c8d0f7f8529b\") " Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.497599 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40643c92-54ba-4c6b-bad6-c8d0f7f8529b" (UID: "40643c92-54ba-4c6b-bad6-c8d0f7f8529b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.498100 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.501718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-kube-api-access-p5b4k" (OuterVolumeSpecName: "kube-api-access-p5b4k") pod "40643c92-54ba-4c6b-bad6-c8d0f7f8529b" (UID: "40643c92-54ba-4c6b-bad6-c8d0f7f8529b"). InnerVolumeSpecName "kube-api-access-p5b4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.599600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf8wh\" (UniqueName: \"kubernetes.io/projected/5fea7660-6ec1-4456-b730-44b1e3ad5941-kube-api-access-zf8wh\") pod \"5fea7660-6ec1-4456-b730-44b1e3ad5941\" (UID: \"5fea7660-6ec1-4456-b730-44b1e3ad5941\") " Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.599776 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fea7660-6ec1-4456-b730-44b1e3ad5941-operator-scripts\") pod \"5fea7660-6ec1-4456-b730-44b1e3ad5941\" (UID: \"5fea7660-6ec1-4456-b730-44b1e3ad5941\") " Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.600207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fea7660-6ec1-4456-b730-44b1e3ad5941-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fea7660-6ec1-4456-b730-44b1e3ad5941" (UID: "5fea7660-6ec1-4456-b730-44b1e3ad5941"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.600243 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5b4k\" (UniqueName: \"kubernetes.io/projected/40643c92-54ba-4c6b-bad6-c8d0f7f8529b-kube-api-access-p5b4k\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.603449 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fea7660-6ec1-4456-b730-44b1e3ad5941-kube-api-access-zf8wh" (OuterVolumeSpecName: "kube-api-access-zf8wh") pod "5fea7660-6ec1-4456-b730-44b1e3ad5941" (UID: "5fea7660-6ec1-4456-b730-44b1e3ad5941"). InnerVolumeSpecName "kube-api-access-zf8wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.702284 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fea7660-6ec1-4456-b730-44b1e3ad5941-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:57 crc kubenswrapper[4771]: I0219 21:47:57.702339 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf8wh\" (UniqueName: \"kubernetes.io/projected/5fea7660-6ec1-4456-b730-44b1e3ad5941-kube-api-access-zf8wh\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:58 crc kubenswrapper[4771]: I0219 21:47:58.073428 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a3b-account-create-update-w4lvm" event={"ID":"40643c92-54ba-4c6b-bad6-c8d0f7f8529b","Type":"ContainerDied","Data":"1efbdc1b45e83487dd3d6770fd8eaf1c43e10e6d770193799afeb5b034426bc4"} Feb 19 21:47:58 crc kubenswrapper[4771]: I0219 21:47:58.073499 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1efbdc1b45e83487dd3d6770fd8eaf1c43e10e6d770193799afeb5b034426bc4" Feb 19 21:47:58 crc kubenswrapper[4771]: I0219 21:47:58.073596 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a3b-account-create-update-w4lvm" Feb 19 21:47:58 crc kubenswrapper[4771]: I0219 21:47:58.093239 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7jgvs" Feb 19 21:47:58 crc kubenswrapper[4771]: I0219 21:47:58.093227 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7jgvs" event={"ID":"5fea7660-6ec1-4456-b730-44b1e3ad5941","Type":"ContainerDied","Data":"0ad374ab44b8c96b67c530690aaea00b5a15356b9916d4c200a38da5100e1cd3"} Feb 19 21:47:58 crc kubenswrapper[4771]: I0219 21:47:58.093500 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad374ab44b8c96b67c530690aaea00b5a15356b9916d4c200a38da5100e1cd3" Feb 19 21:47:58 crc kubenswrapper[4771]: I0219 21:47:58.456189 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7989728-10af-4f46-9b05-946ab44d7145" path="/var/lib/kubelet/pods/a7989728-10af-4f46-9b05-946ab44d7145/volumes" Feb 19 21:47:58 crc kubenswrapper[4771]: I0219 21:47:58.457259 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f908c73c-f1fb-4e81-9441-172192216b2d" path="/var/lib/kubelet/pods/f908c73c-f1fb-4e81-9441-172192216b2d/volumes" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.107488 4771 generic.go:334] "Generic (PLEG): container finished" podID="afd70791-ecce-405a-ba45-968a1e967ef3" containerID="ad68261083ec7a599ecf4fd0bb7914ee14b7d216c3177a543f672928b2ec7a8a" exitCode=0 Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.107578 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j7wbb" event={"ID":"afd70791-ecce-405a-ba45-968a1e967ef3","Type":"ContainerDied","Data":"ad68261083ec7a599ecf4fd0bb7914ee14b7d216c3177a543f672928b2ec7a8a"} Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.162788 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-n8v7z"] Feb 19 21:47:59 crc kubenswrapper[4771]: E0219 21:47:59.163162 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fea7660-6ec1-4456-b730-44b1e3ad5941" 
containerName="mariadb-database-create" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.163180 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fea7660-6ec1-4456-b730-44b1e3ad5941" containerName="mariadb-database-create" Feb 19 21:47:59 crc kubenswrapper[4771]: E0219 21:47:59.163205 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f908c73c-f1fb-4e81-9441-172192216b2d" containerName="dnsmasq-dns" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.163211 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f908c73c-f1fb-4e81-9441-172192216b2d" containerName="dnsmasq-dns" Feb 19 21:47:59 crc kubenswrapper[4771]: E0219 21:47:59.163220 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f908c73c-f1fb-4e81-9441-172192216b2d" containerName="init" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.163227 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f908c73c-f1fb-4e81-9441-172192216b2d" containerName="init" Feb 19 21:47:59 crc kubenswrapper[4771]: E0219 21:47:59.163245 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40643c92-54ba-4c6b-bad6-c8d0f7f8529b" containerName="mariadb-account-create-update" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.163251 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="40643c92-54ba-4c6b-bad6-c8d0f7f8529b" containerName="mariadb-account-create-update" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.163404 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="40643c92-54ba-4c6b-bad6-c8d0f7f8529b" containerName="mariadb-account-create-update" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.163414 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fea7660-6ec1-4456-b730-44b1e3ad5941" containerName="mariadb-database-create" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.163431 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f908c73c-f1fb-4e81-9441-172192216b2d" containerName="dnsmasq-dns" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.167338 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.170993 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.171268 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6wk49" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.180208 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n8v7z"] Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.235052 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-config-data\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.235096 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-db-sync-config-data\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.235116 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-combined-ca-bundle\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.235131 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wmv\" (UniqueName: \"kubernetes.io/projected/cb551a27-bdd4-433e-9d91-48ce01e7592c-kube-api-access-t2wmv\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.336645 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-config-data\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.336689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-db-sync-config-data\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.336706 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-combined-ca-bundle\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.336723 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wmv\" (UniqueName: \"kubernetes.io/projected/cb551a27-bdd4-433e-9d91-48ce01e7592c-kube-api-access-t2wmv\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.342692 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-db-sync-config-data\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.343254 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-combined-ca-bundle\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.345096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-config-data\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.353848 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wmv\" (UniqueName: \"kubernetes.io/projected/cb551a27-bdd4-433e-9d91-48ce01e7592c-kube-api-access-t2wmv\") pod \"glance-db-sync-n8v7z\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.488721 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n8v7z" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.714104 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 19 21:47:59 crc kubenswrapper[4771]: I0219 21:47:59.885480 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n8v7z"] Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.117450 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n8v7z" event={"ID":"cb551a27-bdd4-433e-9d91-48ce01e7592c","Type":"ContainerStarted","Data":"b1f0030ec85f235f4da9dd2fec8c651dc4d85b79573fe9a0aa9aa9de12f2341c"} Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.519387 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j7wbb" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.545032 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zgrgw"] Feb 19 21:48:00 crc kubenswrapper[4771]: E0219 21:48:00.545410 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd70791-ecce-405a-ba45-968a1e967ef3" containerName="swift-ring-rebalance" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.545426 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd70791-ecce-405a-ba45-968a1e967ef3" containerName="swift-ring-rebalance" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.545573 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd70791-ecce-405a-ba45-968a1e967ef3" containerName="swift-ring-rebalance" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.546123 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.548043 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.562810 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zgrgw"] Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.658575 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-swiftconf\") pod \"afd70791-ecce-405a-ba45-968a1e967ef3\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.658653 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-dispersionconf\") pod \"afd70791-ecce-405a-ba45-968a1e967ef3\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.659179 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-combined-ca-bundle\") pod \"afd70791-ecce-405a-ba45-968a1e967ef3\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.659284 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-scripts\") pod \"afd70791-ecce-405a-ba45-968a1e967ef3\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.659307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/afd70791-ecce-405a-ba45-968a1e967ef3-etc-swift\") pod \"afd70791-ecce-405a-ba45-968a1e967ef3\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.659390 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-ring-data-devices\") pod \"afd70791-ecce-405a-ba45-968a1e967ef3\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.659466 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz2vd\" (UniqueName: \"kubernetes.io/projected/afd70791-ecce-405a-ba45-968a1e967ef3-kube-api-access-mz2vd\") pod \"afd70791-ecce-405a-ba45-968a1e967ef3\" (UID: \"afd70791-ecce-405a-ba45-968a1e967ef3\") " Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.660204 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4233ac-21ce-423a-9f15-50f7a8062161-operator-scripts\") pod \"root-account-create-update-zgrgw\" (UID: \"da4233ac-21ce-423a-9f15-50f7a8062161\") " pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.660547 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afd70791-ecce-405a-ba45-968a1e967ef3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "afd70791-ecce-405a-ba45-968a1e967ef3" (UID: "afd70791-ecce-405a-ba45-968a1e967ef3"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.660760 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sqqv\" (UniqueName: \"kubernetes.io/projected/da4233ac-21ce-423a-9f15-50f7a8062161-kube-api-access-7sqqv\") pod \"root-account-create-update-zgrgw\" (UID: \"da4233ac-21ce-423a-9f15-50f7a8062161\") " pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.660846 4771 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afd70791-ecce-405a-ba45-968a1e967ef3-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.660873 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "afd70791-ecce-405a-ba45-968a1e967ef3" (UID: "afd70791-ecce-405a-ba45-968a1e967ef3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.664264 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd70791-ecce-405a-ba45-968a1e967ef3-kube-api-access-mz2vd" (OuterVolumeSpecName: "kube-api-access-mz2vd") pod "afd70791-ecce-405a-ba45-968a1e967ef3" (UID: "afd70791-ecce-405a-ba45-968a1e967ef3"). InnerVolumeSpecName "kube-api-access-mz2vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.666536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "afd70791-ecce-405a-ba45-968a1e967ef3" (UID: "afd70791-ecce-405a-ba45-968a1e967ef3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.678557 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-scripts" (OuterVolumeSpecName: "scripts") pod "afd70791-ecce-405a-ba45-968a1e967ef3" (UID: "afd70791-ecce-405a-ba45-968a1e967ef3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.680402 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afd70791-ecce-405a-ba45-968a1e967ef3" (UID: "afd70791-ecce-405a-ba45-968a1e967ef3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.690088 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "afd70791-ecce-405a-ba45-968a1e967ef3" (UID: "afd70791-ecce-405a-ba45-968a1e967ef3"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.762187 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4233ac-21ce-423a-9f15-50f7a8062161-operator-scripts\") pod \"root-account-create-update-zgrgw\" (UID: \"da4233ac-21ce-423a-9f15-50f7a8062161\") " pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.762247 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sqqv\" (UniqueName: \"kubernetes.io/projected/da4233ac-21ce-423a-9f15-50f7a8062161-kube-api-access-7sqqv\") pod \"root-account-create-update-zgrgw\" (UID: \"da4233ac-21ce-423a-9f15-50f7a8062161\") " pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.762305 4771 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.762316 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz2vd\" (UniqueName: \"kubernetes.io/projected/afd70791-ecce-405a-ba45-968a1e967ef3-kube-api-access-mz2vd\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.762328 4771 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.762336 4771 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 
21:48:00.762345 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd70791-ecce-405a-ba45-968a1e967ef3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.762354 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd70791-ecce-405a-ba45-968a1e967ef3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.763248 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4233ac-21ce-423a-9f15-50f7a8062161-operator-scripts\") pod \"root-account-create-update-zgrgw\" (UID: \"da4233ac-21ce-423a-9f15-50f7a8062161\") " pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.781335 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sqqv\" (UniqueName: \"kubernetes.io/projected/da4233ac-21ce-423a-9f15-50f7a8062161-kube-api-access-7sqqv\") pod \"root-account-create-update-zgrgw\" (UID: \"da4233ac-21ce-423a-9f15-50f7a8062161\") " pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:00 crc kubenswrapper[4771]: I0219 21:48:00.859717 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:01 crc kubenswrapper[4771]: I0219 21:48:01.125824 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j7wbb" event={"ID":"afd70791-ecce-405a-ba45-968a1e967ef3","Type":"ContainerDied","Data":"2864b33bf9ba6c3199bc3cb27986dc5781155ded9d36bf01ef5af9dcbc1bdc20"} Feb 19 21:48:01 crc kubenswrapper[4771]: I0219 21:48:01.126236 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2864b33bf9ba6c3199bc3cb27986dc5781155ded9d36bf01ef5af9dcbc1bdc20" Feb 19 21:48:01 crc kubenswrapper[4771]: I0219 21:48:01.125890 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j7wbb" Feb 19 21:48:01 crc kubenswrapper[4771]: I0219 21:48:01.323739 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zgrgw"] Feb 19 21:48:02 crc kubenswrapper[4771]: I0219 21:48:02.133869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgrgw" event={"ID":"da4233ac-21ce-423a-9f15-50f7a8062161","Type":"ContainerStarted","Data":"8cca09843dd3644e93ed1e8cc9ecc581a4f9dd9ee78bc9019acbe6a06ad7313a"} Feb 19 21:48:02 crc kubenswrapper[4771]: I0219 21:48:02.134201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgrgw" event={"ID":"da4233ac-21ce-423a-9f15-50f7a8062161","Type":"ContainerStarted","Data":"94c67a9b907db71cdd3afc09197bb5b49cc7ffbbc4e6db6e13e1ca55d15a247e"} Feb 19 21:48:02 crc kubenswrapper[4771]: I0219 21:48:02.156148 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-zgrgw" podStartSLOduration=2.156126113 podStartE2EDuration="2.156126113s" podCreationTimestamp="2026-02-19 21:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-19 21:48:02.151845209 +0000 UTC m=+1182.423287699" watchObservedRunningTime="2026-02-19 21:48:02.156126113 +0000 UTC m=+1182.427568613" Feb 19 21:48:02 crc kubenswrapper[4771]: I0219 21:48:02.793978 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:48:02 crc kubenswrapper[4771]: I0219 21:48:02.801520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"swift-storage-0\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " pod="openstack/swift-storage-0" Feb 19 21:48:03 crc kubenswrapper[4771]: I0219 21:48:03.019031 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:48:03 crc kubenswrapper[4771]: I0219 21:48:03.141636 4771 generic.go:334] "Generic (PLEG): container finished" podID="da4233ac-21ce-423a-9f15-50f7a8062161" containerID="8cca09843dd3644e93ed1e8cc9ecc581a4f9dd9ee78bc9019acbe6a06ad7313a" exitCode=0 Feb 19 21:48:03 crc kubenswrapper[4771]: I0219 21:48:03.141685 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgrgw" event={"ID":"da4233ac-21ce-423a-9f15-50f7a8062161","Type":"ContainerDied","Data":"8cca09843dd3644e93ed1e8cc9ecc581a4f9dd9ee78bc9019acbe6a06ad7313a"} Feb 19 21:48:03 crc kubenswrapper[4771]: I0219 21:48:03.615548 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:48:03 crc kubenswrapper[4771]: W0219 21:48:03.625362 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-91f546d67d148f31cb9294b4c3aa8ef7221cd85a19519fec2d7f4f58bdcc573a WatchSource:0}: Error finding container 91f546d67d148f31cb9294b4c3aa8ef7221cd85a19519fec2d7f4f58bdcc573a: Status 404 returned error can't find the container with id 91f546d67d148f31cb9294b4c3aa8ef7221cd85a19519fec2d7f4f58bdcc573a Feb 19 21:48:04 crc kubenswrapper[4771]: I0219 21:48:04.163149 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"91f546d67d148f31cb9294b4c3aa8ef7221cd85a19519fec2d7f4f58bdcc573a"} Feb 19 21:48:04 crc kubenswrapper[4771]: I0219 21:48:04.570822 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:04 crc kubenswrapper[4771]: I0219 21:48:04.725754 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4233ac-21ce-423a-9f15-50f7a8062161-operator-scripts\") pod \"da4233ac-21ce-423a-9f15-50f7a8062161\" (UID: \"da4233ac-21ce-423a-9f15-50f7a8062161\") " Feb 19 21:48:04 crc kubenswrapper[4771]: I0219 21:48:04.726004 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sqqv\" (UniqueName: \"kubernetes.io/projected/da4233ac-21ce-423a-9f15-50f7a8062161-kube-api-access-7sqqv\") pod \"da4233ac-21ce-423a-9f15-50f7a8062161\" (UID: \"da4233ac-21ce-423a-9f15-50f7a8062161\") " Feb 19 21:48:04 crc kubenswrapper[4771]: I0219 21:48:04.728174 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da4233ac-21ce-423a-9f15-50f7a8062161-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da4233ac-21ce-423a-9f15-50f7a8062161" (UID: "da4233ac-21ce-423a-9f15-50f7a8062161"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:04 crc kubenswrapper[4771]: I0219 21:48:04.736353 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4233ac-21ce-423a-9f15-50f7a8062161-kube-api-access-7sqqv" (OuterVolumeSpecName: "kube-api-access-7sqqv") pod "da4233ac-21ce-423a-9f15-50f7a8062161" (UID: "da4233ac-21ce-423a-9f15-50f7a8062161"). InnerVolumeSpecName "kube-api-access-7sqqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:04 crc kubenswrapper[4771]: I0219 21:48:04.828954 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4233ac-21ce-423a-9f15-50f7a8062161-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:04 crc kubenswrapper[4771]: I0219 21:48:04.829078 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sqqv\" (UniqueName: \"kubernetes.io/projected/da4233ac-21ce-423a-9f15-50f7a8062161-kube-api-access-7sqqv\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:05 crc kubenswrapper[4771]: I0219 21:48:05.179817 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgrgw" event={"ID":"da4233ac-21ce-423a-9f15-50f7a8062161","Type":"ContainerDied","Data":"94c67a9b907db71cdd3afc09197bb5b49cc7ffbbc4e6db6e13e1ca55d15a247e"} Feb 19 21:48:05 crc kubenswrapper[4771]: I0219 21:48:05.180387 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94c67a9b907db71cdd3afc09197bb5b49cc7ffbbc4e6db6e13e1ca55d15a247e" Feb 19 21:48:05 crc kubenswrapper[4771]: I0219 21:48:05.180206 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgrgw" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.519561 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-js7lh" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:48:06 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 21:48:06 crc kubenswrapper[4771]: > Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.556470 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.561822 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.817844 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-js7lh-config-9q84n"] Feb 19 21:48:06 crc kubenswrapper[4771]: E0219 21:48:05.818416 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4233ac-21ce-423a-9f15-50f7a8062161" containerName="mariadb-account-create-update" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.818439 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4233ac-21ce-423a-9f15-50f7a8062161" containerName="mariadb-account-create-update" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.818744 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4233ac-21ce-423a-9f15-50f7a8062161" containerName="mariadb-account-create-update" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.819558 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.824618 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.839251 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-js7lh-config-9q84n"] Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.948335 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-log-ovn\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.948396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.948494 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5tpf\" (UniqueName: \"kubernetes.io/projected/5de3aba5-1018-4a1e-a626-3241efd0fbc8-kube-api-access-l5tpf\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.948601 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-additional-scripts\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: 
\"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.948680 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run-ovn\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:05.948747 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-scripts\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050256 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5tpf\" (UniqueName: \"kubernetes.io/projected/5de3aba5-1018-4a1e-a626-3241efd0fbc8-kube-api-access-l5tpf\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050349 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-additional-scripts\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050413 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run-ovn\") pod 
\"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050443 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-scripts\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050547 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-log-ovn\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050578 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050750 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run-ovn\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050770 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: 
\"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.050758 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-log-ovn\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.051330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-additional-scripts\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.052619 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-scripts\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.074780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5tpf\" (UniqueName: \"kubernetes.io/projected/5de3aba5-1018-4a1e-a626-3241efd0fbc8-kube-api-access-l5tpf\") pod \"ovn-controller-js7lh-config-9q84n\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.139512 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.634594 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-js7lh-config-9q84n"] Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.873188 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zgrgw"] Feb 19 21:48:06 crc kubenswrapper[4771]: I0219 21:48:06.881660 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zgrgw"] Feb 19 21:48:08 crc kubenswrapper[4771]: I0219 21:48:08.211716 4771 generic.go:334] "Generic (PLEG): container finished" podID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerID="7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9" exitCode=0 Feb 19 21:48:08 crc kubenswrapper[4771]: I0219 21:48:08.211810 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c671dcf6-b1eb-4c4e-ba71-ae115ce811da","Type":"ContainerDied","Data":"7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9"} Feb 19 21:48:08 crc kubenswrapper[4771]: I0219 21:48:08.451103 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da4233ac-21ce-423a-9f15-50f7a8062161" path="/var/lib/kubelet/pods/da4233ac-21ce-423a-9f15-50f7a8062161/volumes" Feb 19 21:48:10 crc kubenswrapper[4771]: I0219 21:48:10.493616 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:48:10 crc kubenswrapper[4771]: I0219 21:48:10.512171 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-js7lh" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:48:10 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 21:48:10 crc 
kubenswrapper[4771]: > Feb 19 21:48:11 crc kubenswrapper[4771]: I0219 21:48:11.878869 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-x6lqr"] Feb 19 21:48:11 crc kubenswrapper[4771]: I0219 21:48:11.880719 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:11 crc kubenswrapper[4771]: I0219 21:48:11.889681 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 21:48:11 crc kubenswrapper[4771]: I0219 21:48:11.894194 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x6lqr"] Feb 19 21:48:11 crc kubenswrapper[4771]: I0219 21:48:11.973249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6rs8\" (UniqueName: \"kubernetes.io/projected/138981ec-af45-4888-a757-b666d60513d3-kube-api-access-b6rs8\") pod \"root-account-create-update-x6lqr\" (UID: \"138981ec-af45-4888-a757-b666d60513d3\") " pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:11 crc kubenswrapper[4771]: I0219 21:48:11.973377 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138981ec-af45-4888-a757-b666d60513d3-operator-scripts\") pod \"root-account-create-update-x6lqr\" (UID: \"138981ec-af45-4888-a757-b666d60513d3\") " pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:12 crc kubenswrapper[4771]: I0219 21:48:12.074595 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138981ec-af45-4888-a757-b666d60513d3-operator-scripts\") pod \"root-account-create-update-x6lqr\" (UID: \"138981ec-af45-4888-a757-b666d60513d3\") " pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:12 crc 
kubenswrapper[4771]: I0219 21:48:12.074773 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6rs8\" (UniqueName: \"kubernetes.io/projected/138981ec-af45-4888-a757-b666d60513d3-kube-api-access-b6rs8\") pod \"root-account-create-update-x6lqr\" (UID: \"138981ec-af45-4888-a757-b666d60513d3\") " pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:12 crc kubenswrapper[4771]: I0219 21:48:12.075353 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138981ec-af45-4888-a757-b666d60513d3-operator-scripts\") pod \"root-account-create-update-x6lqr\" (UID: \"138981ec-af45-4888-a757-b666d60513d3\") " pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:12 crc kubenswrapper[4771]: I0219 21:48:12.117174 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6rs8\" (UniqueName: \"kubernetes.io/projected/138981ec-af45-4888-a757-b666d60513d3-kube-api-access-b6rs8\") pod \"root-account-create-update-x6lqr\" (UID: \"138981ec-af45-4888-a757-b666d60513d3\") " pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:12 crc kubenswrapper[4771]: I0219 21:48:12.213087 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:13 crc kubenswrapper[4771]: E0219 21:48:13.788249 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2a146cb0eb1a819e7b367354687fa3eeb3894fa4a03eadd0dc2e2c849345cbf0" Feb 19 21:48:13 crc kubenswrapper[4771]: E0219 21:48:13.789112 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2a146cb0eb1a819e7b367354687fa3eeb3894fa4a03eadd0dc2e2c849345cbf0,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2wmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePu
llPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-n8v7z_openstack(cb551a27-bdd4-433e-9d91-48ce01e7592c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:48:13 crc kubenswrapper[4771]: E0219 21:48:13.790861 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-n8v7z" podUID="cb551a27-bdd4-433e-9d91-48ce01e7592c" Feb 19 21:48:14 crc kubenswrapper[4771]: I0219 21:48:14.232185 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x6lqr"] Feb 19 21:48:14 crc kubenswrapper[4771]: I0219 21:48:14.281903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh-config-9q84n" event={"ID":"5de3aba5-1018-4a1e-a626-3241efd0fbc8","Type":"ContainerStarted","Data":"ad4613573e9fc8ef5d2dd08b879f2f2aafd896bfe9a23fbca9764b89917681da"} Feb 19 21:48:14 crc kubenswrapper[4771]: I0219 21:48:14.290081 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x6lqr" event={"ID":"138981ec-af45-4888-a757-b666d60513d3","Type":"ContainerStarted","Data":"21c7cdb7d3ce794b0cf203704151935a87844b26961b9e1939aa7a4bb1731a35"} Feb 19 21:48:14 crc kubenswrapper[4771]: E0219 21:48:14.291759 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2a146cb0eb1a819e7b367354687fa3eeb3894fa4a03eadd0dc2e2c849345cbf0\\\"\"" pod="openstack/glance-db-sync-n8v7z" podUID="cb551a27-bdd4-433e-9d91-48ce01e7592c" Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.300421 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"f711f4133d60e80e95d60ebf53d43892d134a7d9eb5c4126f1c2d737d2fa0c70"} Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.301076 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"0d8112d0d11bb87f970617cd7d38234c852cb777a0ea50e9330bd9eec7a4c5b5"} Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.301099 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"b55b4a1bb4d3ccc6512f177b1e2310a9648853f7b9b1181b71f4d9e34527c221"} Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.301116 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"25638182c5db50cf9a8ae3158a1f97e75098b4e1dd04c4bf5390d848f2cf8116"} Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.303633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c671dcf6-b1eb-4c4e-ba71-ae115ce811da","Type":"ContainerStarted","Data":"2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0"} Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.303869 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.306241 4771 generic.go:334] "Generic (PLEG): container finished" podID="138981ec-af45-4888-a757-b666d60513d3" containerID="b644a611fb45629cd8753f4162c8679d62729f768396dc3a438befa376dfbcc8" exitCode=0 Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.306361 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x6lqr" event={"ID":"138981ec-af45-4888-a757-b666d60513d3","Type":"ContainerDied","Data":"b644a611fb45629cd8753f4162c8679d62729f768396dc3a438befa376dfbcc8"} Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.308689 4771 generic.go:334] "Generic (PLEG): container finished" podID="5de3aba5-1018-4a1e-a626-3241efd0fbc8" containerID="cdda98f4356ab69d51d2afb39a09470445a49137d8bf71e2e8719ca4e104b16a" exitCode=0 Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.308743 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh-config-9q84n" event={"ID":"5de3aba5-1018-4a1e-a626-3241efd0fbc8","Type":"ContainerDied","Data":"cdda98f4356ab69d51d2afb39a09470445a49137d8bf71e2e8719ca4e104b16a"} Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.352166 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371959.502636 podStartE2EDuration="1m17.352139614s" podCreationTimestamp="2026-02-19 21:46:58 +0000 UTC" firstStartedPulling="2026-02-19 21:47:00.239109284 +0000 UTC m=+1120.510551754" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:15.341157472 +0000 UTC m=+1195.612600012" watchObservedRunningTime="2026-02-19 21:48:15.352139614 +0000 UTC m=+1195.623582094" Feb 19 21:48:15 crc kubenswrapper[4771]: I0219 21:48:15.519398 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-js7lh" Feb 19 21:48:16 crc 
kubenswrapper[4771]: I0219 21:48:16.635015 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.645738 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.756837 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-scripts\") pod \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.756894 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5tpf\" (UniqueName: \"kubernetes.io/projected/5de3aba5-1018-4a1e-a626-3241efd0fbc8-kube-api-access-l5tpf\") pod \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.756956 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run-ovn\") pod \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757043 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138981ec-af45-4888-a757-b666d60513d3-operator-scripts\") pod \"138981ec-af45-4888-a757-b666d60513d3\" (UID: \"138981ec-af45-4888-a757-b666d60513d3\") " Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757098 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run\") pod \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757127 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-additional-scripts\") pod \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757192 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-log-ovn\") pod \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\" (UID: \"5de3aba5-1018-4a1e-a626-3241efd0fbc8\") " Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757228 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6rs8\" (UniqueName: \"kubernetes.io/projected/138981ec-af45-4888-a757-b666d60513d3-kube-api-access-b6rs8\") pod \"138981ec-af45-4888-a757-b666d60513d3\" (UID: \"138981ec-af45-4888-a757-b666d60513d3\") " Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run" (OuterVolumeSpecName: "var-run") pod "5de3aba5-1018-4a1e-a626-3241efd0fbc8" (UID: "5de3aba5-1018-4a1e-a626-3241efd0fbc8"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757345 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5de3aba5-1018-4a1e-a626-3241efd0fbc8" (UID: "5de3aba5-1018-4a1e-a626-3241efd0fbc8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757383 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5de3aba5-1018-4a1e-a626-3241efd0fbc8" (UID: "5de3aba5-1018-4a1e-a626-3241efd0fbc8"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138981ec-af45-4888-a757-b666d60513d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "138981ec-af45-4888-a757-b666d60513d3" (UID: "138981ec-af45-4888-a757-b666d60513d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.757917 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5de3aba5-1018-4a1e-a626-3241efd0fbc8" (UID: "5de3aba5-1018-4a1e-a626-3241efd0fbc8"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.758182 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-scripts" (OuterVolumeSpecName: "scripts") pod "5de3aba5-1018-4a1e-a626-3241efd0fbc8" (UID: "5de3aba5-1018-4a1e-a626-3241efd0fbc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.758287 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.758305 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.758317 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.758331 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138981ec-af45-4888-a757-b666d60513d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.758344 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5de3aba5-1018-4a1e-a626-3241efd0fbc8-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.758355 4771 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5de3aba5-1018-4a1e-a626-3241efd0fbc8-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.763717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138981ec-af45-4888-a757-b666d60513d3-kube-api-access-b6rs8" (OuterVolumeSpecName: "kube-api-access-b6rs8") pod "138981ec-af45-4888-a757-b666d60513d3" (UID: "138981ec-af45-4888-a757-b666d60513d3"). InnerVolumeSpecName "kube-api-access-b6rs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.764594 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de3aba5-1018-4a1e-a626-3241efd0fbc8-kube-api-access-l5tpf" (OuterVolumeSpecName: "kube-api-access-l5tpf") pod "5de3aba5-1018-4a1e-a626-3241efd0fbc8" (UID: "5de3aba5-1018-4a1e-a626-3241efd0fbc8"). InnerVolumeSpecName "kube-api-access-l5tpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.859356 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5tpf\" (UniqueName: \"kubernetes.io/projected/5de3aba5-1018-4a1e-a626-3241efd0fbc8-kube-api-access-l5tpf\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:16 crc kubenswrapper[4771]: I0219 21:48:16.859381 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6rs8\" (UniqueName: \"kubernetes.io/projected/138981ec-af45-4888-a757-b666d60513d3-kube-api-access-b6rs8\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.327619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x6lqr" event={"ID":"138981ec-af45-4888-a757-b666d60513d3","Type":"ContainerDied","Data":"21c7cdb7d3ce794b0cf203704151935a87844b26961b9e1939aa7a4bb1731a35"} Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.327914 4771 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21c7cdb7d3ce794b0cf203704151935a87844b26961b9e1939aa7a4bb1731a35" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.327861 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x6lqr" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.330631 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-js7lh-config-9q84n" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.331115 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh-config-9q84n" event={"ID":"5de3aba5-1018-4a1e-a626-3241efd0fbc8","Type":"ContainerDied","Data":"ad4613573e9fc8ef5d2dd08b879f2f2aafd896bfe9a23fbca9764b89917681da"} Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.331179 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad4613573e9fc8ef5d2dd08b879f2f2aafd896bfe9a23fbca9764b89917681da" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.335620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"53da6e944f47c7e6208414ec28ec1aa3e842f3fe992ab24148a4550284b94cd8"} Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.335661 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"793312b4189c234b06ff32b07e8e0d5dfc74266fc0a1de9300ef68cf6768b6c2"} Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.335696 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"3e4e004099ba936950340e5e238183d77b40a39f4a97474d8ddb30dfaefb94a9"} Feb 19 21:48:17 crc 
kubenswrapper[4771]: I0219 21:48:17.335718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"eb415e751672ce67a387f56913bb8398561b2f551eec4c660857fbfeb0059682"} Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.761009 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-js7lh-config-9q84n"] Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.774182 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-js7lh-config-9q84n"] Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.894852 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-js7lh-config-d6w4q"] Feb 19 21:48:17 crc kubenswrapper[4771]: E0219 21:48:17.895260 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de3aba5-1018-4a1e-a626-3241efd0fbc8" containerName="ovn-config" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.895281 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de3aba5-1018-4a1e-a626-3241efd0fbc8" containerName="ovn-config" Feb 19 21:48:17 crc kubenswrapper[4771]: E0219 21:48:17.895313 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138981ec-af45-4888-a757-b666d60513d3" containerName="mariadb-account-create-update" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.895321 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="138981ec-af45-4888-a757-b666d60513d3" containerName="mariadb-account-create-update" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.895516 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="138981ec-af45-4888-a757-b666d60513d3" containerName="mariadb-account-create-update" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.895547 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de3aba5-1018-4a1e-a626-3241efd0fbc8" 
containerName="ovn-config" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.897286 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.899924 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.923047 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-js7lh-config-d6w4q"] Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.980117 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-log-ovn\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.980174 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-scripts\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.980200 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run-ovn\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.980229 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ctjf\" (UniqueName: 
\"kubernetes.io/projected/113b41f4-7bad-44b1-885f-2b673d53ad18-kube-api-access-8ctjf\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.980259 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-additional-scripts\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:17 crc kubenswrapper[4771]: I0219 21:48:17.980304 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.083003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.083239 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-log-ovn\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.083270 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-scripts\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.083297 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run-ovn\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.083320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ctjf\" (UniqueName: \"kubernetes.io/projected/113b41f4-7bad-44b1-885f-2b673d53ad18-kube-api-access-8ctjf\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.083359 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-additional-scripts\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.084286 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-additional-scripts\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.084320 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.084320 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run-ovn\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.084382 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-log-ovn\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.085888 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-scripts\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.115717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ctjf\" (UniqueName: \"kubernetes.io/projected/113b41f4-7bad-44b1-885f-2b673d53ad18-kube-api-access-8ctjf\") pod \"ovn-controller-js7lh-config-d6w4q\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.224599 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.460373 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de3aba5-1018-4a1e-a626-3241efd0fbc8" path="/var/lib/kubelet/pods/5de3aba5-1018-4a1e-a626-3241efd0fbc8/volumes" Feb 19 21:48:18 crc kubenswrapper[4771]: I0219 21:48:18.896644 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-js7lh-config-d6w4q"] Feb 19 21:48:18 crc kubenswrapper[4771]: W0219 21:48:18.900747 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod113b41f4_7bad_44b1_885f_2b673d53ad18.slice/crio-6408a1c5fd926ac84477e44a7cf8609de6ad795a351f6b32b11117f0d4fdaa60 WatchSource:0}: Error finding container 6408a1c5fd926ac84477e44a7cf8609de6ad795a351f6b32b11117f0d4fdaa60: Status 404 returned error can't find the container with id 6408a1c5fd926ac84477e44a7cf8609de6ad795a351f6b32b11117f0d4fdaa60 Feb 19 21:48:19 crc kubenswrapper[4771]: I0219 21:48:19.356494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"d59b28bcc6dd2c89348367ae07b2b6652ecabd2e3eb7e906cb4ce9dfbdfd12d9"} Feb 19 21:48:19 crc kubenswrapper[4771]: I0219 21:48:19.356903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"89041a4e9d9ebc9c0d58f88cef337d9f987ef35bcc4c225ef45ad3a183a71e6a"} Feb 19 21:48:19 crc kubenswrapper[4771]: I0219 21:48:19.356916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"56daf9ee26ebf4c3d4c6900b71838aba932c5f271a2526fe5d21871ad34fdb76"} Feb 19 21:48:19 crc kubenswrapper[4771]: I0219 
21:48:19.357615 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh-config-d6w4q" event={"ID":"113b41f4-7bad-44b1-885f-2b673d53ad18","Type":"ContainerStarted","Data":"a48ec811c0a5e4d47be5eb264a30dddf34cda4b99f616321d34ff066b2ee7c3e"} Feb 19 21:48:19 crc kubenswrapper[4771]: I0219 21:48:19.357658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh-config-d6w4q" event={"ID":"113b41f4-7bad-44b1-885f-2b673d53ad18","Type":"ContainerStarted","Data":"6408a1c5fd926ac84477e44a7cf8609de6ad795a351f6b32b11117f0d4fdaa60"} Feb 19 21:48:19 crc kubenswrapper[4771]: I0219 21:48:19.371521 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-js7lh-config-d6w4q" podStartSLOduration=2.371502831 podStartE2EDuration="2.371502831s" podCreationTimestamp="2026-02-19 21:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:19.370565157 +0000 UTC m=+1199.642007697" watchObservedRunningTime="2026-02-19 21:48:19.371502831 +0000 UTC m=+1199.642945301" Feb 19 21:48:20 crc kubenswrapper[4771]: I0219 21:48:20.371696 4771 generic.go:334] "Generic (PLEG): container finished" podID="113b41f4-7bad-44b1-885f-2b673d53ad18" containerID="a48ec811c0a5e4d47be5eb264a30dddf34cda4b99f616321d34ff066b2ee7c3e" exitCode=0 Feb 19 21:48:20 crc kubenswrapper[4771]: I0219 21:48:20.373175 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh-config-d6w4q" event={"ID":"113b41f4-7bad-44b1-885f-2b673d53ad18","Type":"ContainerDied","Data":"a48ec811c0a5e4d47be5eb264a30dddf34cda4b99f616321d34ff066b2ee7c3e"} Feb 19 21:48:20 crc kubenswrapper[4771]: I0219 21:48:20.384511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"37b58896182f7320a318dedbc51a682c252143cf1f5f7dc1cc5a4d0d41fd6a9e"} Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.148286 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168064 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-log-ovn\") pod \"113b41f4-7bad-44b1-885f-2b673d53ad18\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168192 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-scripts\") pod \"113b41f4-7bad-44b1-885f-2b673d53ad18\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168225 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "113b41f4-7bad-44b1-885f-2b673d53ad18" (UID: "113b41f4-7bad-44b1-885f-2b673d53ad18"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168317 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ctjf\" (UniqueName: \"kubernetes.io/projected/113b41f4-7bad-44b1-885f-2b673d53ad18-kube-api-access-8ctjf\") pod \"113b41f4-7bad-44b1-885f-2b673d53ad18\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168446 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run\") pod \"113b41f4-7bad-44b1-885f-2b673d53ad18\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168527 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run-ovn\") pod \"113b41f4-7bad-44b1-885f-2b673d53ad18\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168565 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-additional-scripts\") pod \"113b41f4-7bad-44b1-885f-2b673d53ad18\" (UID: \"113b41f4-7bad-44b1-885f-2b673d53ad18\") " Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168637 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run" (OuterVolumeSpecName: "var-run") pod "113b41f4-7bad-44b1-885f-2b673d53ad18" (UID: "113b41f4-7bad-44b1-885f-2b673d53ad18"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.168702 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "113b41f4-7bad-44b1-885f-2b673d53ad18" (UID: "113b41f4-7bad-44b1-885f-2b673d53ad18"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.169533 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.169558 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "113b41f4-7bad-44b1-885f-2b673d53ad18" (UID: "113b41f4-7bad-44b1-885f-2b673d53ad18"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.169576 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.169605 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/113b41f4-7bad-44b1-885f-2b673d53ad18-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.171410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-scripts" (OuterVolumeSpecName: "scripts") pod "113b41f4-7bad-44b1-885f-2b673d53ad18" (UID: "113b41f4-7bad-44b1-885f-2b673d53ad18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.204147 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113b41f4-7bad-44b1-885f-2b673d53ad18-kube-api-access-8ctjf" (OuterVolumeSpecName: "kube-api-access-8ctjf") pod "113b41f4-7bad-44b1-885f-2b673d53ad18" (UID: "113b41f4-7bad-44b1-885f-2b673d53ad18"). InnerVolumeSpecName "kube-api-access-8ctjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.270945 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.270986 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ctjf\" (UniqueName: \"kubernetes.io/projected/113b41f4-7bad-44b1-885f-2b673d53ad18-kube-api-access-8ctjf\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.270999 4771 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/113b41f4-7bad-44b1-885f-2b673d53ad18-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.419710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh-config-d6w4q" event={"ID":"113b41f4-7bad-44b1-885f-2b673d53ad18","Type":"ContainerDied","Data":"6408a1c5fd926ac84477e44a7cf8609de6ad795a351f6b32b11117f0d4fdaa60"} Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.420176 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6408a1c5fd926ac84477e44a7cf8609de6ad795a351f6b32b11117f0d4fdaa60" Feb 19 21:48:23 crc kubenswrapper[4771]: I0219 21:48:23.419776 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-js7lh-config-d6w4q" Feb 19 21:48:24 crc kubenswrapper[4771]: I0219 21:48:24.265168 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-js7lh-config-d6w4q"] Feb 19 21:48:24 crc kubenswrapper[4771]: I0219 21:48:24.307187 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-js7lh-config-d6w4q"] Feb 19 21:48:24 crc kubenswrapper[4771]: I0219 21:48:24.447771 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="113b41f4-7bad-44b1-885f-2b673d53ad18" path="/var/lib/kubelet/pods/113b41f4-7bad-44b1-885f-2b673d53ad18/volumes" Feb 19 21:48:24 crc kubenswrapper[4771]: I0219 21:48:24.448554 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"f2531aed0329bf883f8ab6b773f55605090d3db1851e36c0cc40d24deefd3f69"} Feb 19 21:48:24 crc kubenswrapper[4771]: I0219 21:48:24.448583 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"e632fa06b132d46b46064debfe1c40044370bb1e249759e09ae099922fe08194"} Feb 19 21:48:25 crc kubenswrapper[4771]: I0219 21:48:25.466530 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerStarted","Data":"50d7f0fe603dcd477e42bf6d5cd23293366abcc57115b1583128c070088d06e9"} Feb 19 21:48:25 crc kubenswrapper[4771]: I0219 21:48:25.539877 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=25.481360315 podStartE2EDuration="40.539855087s" podCreationTimestamp="2026-02-19 21:47:45 +0000 UTC" firstStartedPulling="2026-02-19 21:48:03.627368341 +0000 UTC m=+1183.898810811" lastFinishedPulling="2026-02-19 21:48:18.685863113 +0000 UTC 
m=+1198.957305583" observedRunningTime="2026-02-19 21:48:25.535650585 +0000 UTC m=+1205.807093135" watchObservedRunningTime="2026-02-19 21:48:25.539855087 +0000 UTC m=+1205.811297587" Feb 19 21:48:25 crc kubenswrapper[4771]: I0219 21:48:25.853586 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-778944759-v7skj"] Feb 19 21:48:25 crc kubenswrapper[4771]: E0219 21:48:25.853906 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113b41f4-7bad-44b1-885f-2b673d53ad18" containerName="ovn-config" Feb 19 21:48:25 crc kubenswrapper[4771]: I0219 21:48:25.853921 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="113b41f4-7bad-44b1-885f-2b673d53ad18" containerName="ovn-config" Feb 19 21:48:25 crc kubenswrapper[4771]: I0219 21:48:25.854074 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="113b41f4-7bad-44b1-885f-2b673d53ad18" containerName="ovn-config" Feb 19 21:48:25 crc kubenswrapper[4771]: I0219 21:48:25.854800 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:25 crc kubenswrapper[4771]: I0219 21:48:25.856594 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 19 21:48:25 crc kubenswrapper[4771]: I0219 21:48:25.875167 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778944759-v7skj"] Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.026691 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-config\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.026738 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9x72\" (UniqueName: \"kubernetes.io/projected/4aca3142-bb0a-424d-95e6-5819e352d96d-kube-api-access-x9x72\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.026786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-swift-storage-0\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.026802 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-nb\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " 
pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.026824 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-svc\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.026852 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-sb\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.128143 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-config\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.128213 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9x72\" (UniqueName: \"kubernetes.io/projected/4aca3142-bb0a-424d-95e6-5819e352d96d-kube-api-access-x9x72\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.128279 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-swift-storage-0\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " 
pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.128303 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-nb\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.128339 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-svc\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.128379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-sb\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.129480 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-nb\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.129630 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-sb\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 
21:48:26.129770 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-config\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.129994 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-swift-storage-0\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.130823 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-svc\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.150331 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9x72\" (UniqueName: \"kubernetes.io/projected/4aca3142-bb0a-424d-95e6-5819e352d96d-kube-api-access-x9x72\") pod \"dnsmasq-dns-778944759-v7skj\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.172337 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.424105 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778944759-v7skj"] Feb 19 21:48:26 crc kubenswrapper[4771]: I0219 21:48:26.489549 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-v7skj" event={"ID":"4aca3142-bb0a-424d-95e6-5819e352d96d","Type":"ContainerStarted","Data":"56c110f682a30d113bf3505366a1ab2fcb9e6191b42a614396a88a5ff097514e"} Feb 19 21:48:27 crc kubenswrapper[4771]: I0219 21:48:27.503398 4771 generic.go:334] "Generic (PLEG): container finished" podID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerID="9c9311be0e9f613b3ce79a463316a523997b3a2aad17a9f59ce2814154b53a0c" exitCode=0 Feb 19 21:48:27 crc kubenswrapper[4771]: I0219 21:48:27.503517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-v7skj" event={"ID":"4aca3142-bb0a-424d-95e6-5819e352d96d","Type":"ContainerDied","Data":"9c9311be0e9f613b3ce79a463316a523997b3a2aad17a9f59ce2814154b53a0c"} Feb 19 21:48:28 crc kubenswrapper[4771]: I0219 21:48:28.521254 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-v7skj" event={"ID":"4aca3142-bb0a-424d-95e6-5819e352d96d","Type":"ContainerStarted","Data":"bdea3e5cad7287c3601f8d85ab79ea379fb9829b7f8b529b698f94c74c14672d"} Feb 19 21:48:28 crc kubenswrapper[4771]: I0219 21:48:28.521661 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:28 crc kubenswrapper[4771]: I0219 21:48:28.553462 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-778944759-v7skj" podStartSLOduration=3.5534316649999997 podStartE2EDuration="3.553431665s" podCreationTimestamp="2026-02-19 21:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:28.544709473 +0000 UTC m=+1208.816151963" watchObservedRunningTime="2026-02-19 21:48:28.553431665 +0000 UTC m=+1208.824874205" Feb 19 21:48:29 crc kubenswrapper[4771]: I0219 21:48:29.533123 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n8v7z" event={"ID":"cb551a27-bdd4-433e-9d91-48ce01e7592c","Type":"ContainerStarted","Data":"4cd6b0ceee64a599a8e1a3fcddded6e00c86fe15dc3839c5337074fb5721bc3e"} Feb 19 21:48:29 crc kubenswrapper[4771]: I0219 21:48:29.564309 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-n8v7z" podStartSLOduration=2.588801422 podStartE2EDuration="30.564286349s" podCreationTimestamp="2026-02-19 21:47:59 +0000 UTC" firstStartedPulling="2026-02-19 21:47:59.886509494 +0000 UTC m=+1180.157951954" lastFinishedPulling="2026-02-19 21:48:27.861994381 +0000 UTC m=+1208.133436881" observedRunningTime="2026-02-19 21:48:29.551596052 +0000 UTC m=+1209.823038592" watchObservedRunningTime="2026-02-19 21:48:29.564286349 +0000 UTC m=+1209.835728849" Feb 19 21:48:29 crc kubenswrapper[4771]: I0219 21:48:29.762536 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.167464 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5p7sb"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.168352 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.176422 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5p7sb"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.281534 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6938-account-create-update-jnb4q"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.282436 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.286002 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.304223 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6938-account-create-update-jnb4q"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.311244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np5fj\" (UniqueName: \"kubernetes.io/projected/80696dba-d7cc-41e6-ba67-4299cdd675ff-kube-api-access-np5fj\") pod \"cinder-db-create-5p7sb\" (UID: \"80696dba-d7cc-41e6-ba67-4299cdd675ff\") " pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.311338 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80696dba-d7cc-41e6-ba67-4299cdd675ff-operator-scripts\") pod \"cinder-db-create-5p7sb\" (UID: \"80696dba-d7cc-41e6-ba67-4299cdd675ff\") " pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.367192 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dgk76"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.368348 4771 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.386127 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8302-account-create-update-gjj56"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.387305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.389072 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.394222 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dgk76"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.416768 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80696dba-d7cc-41e6-ba67-4299cdd675ff-operator-scripts\") pod \"cinder-db-create-5p7sb\" (UID: \"80696dba-d7cc-41e6-ba67-4299cdd675ff\") " pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.416881 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33435064-417d-458a-ba36-b343066cfbed-operator-scripts\") pod \"barbican-6938-account-create-update-jnb4q\" (UID: \"33435064-417d-458a-ba36-b343066cfbed\") " pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.416958 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klkmh\" (UniqueName: \"kubernetes.io/projected/33435064-417d-458a-ba36-b343066cfbed-kube-api-access-klkmh\") pod \"barbican-6938-account-create-update-jnb4q\" (UID: \"33435064-417d-458a-ba36-b343066cfbed\") " 
pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.417032 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np5fj\" (UniqueName: \"kubernetes.io/projected/80696dba-d7cc-41e6-ba67-4299cdd675ff-kube-api-access-np5fj\") pod \"cinder-db-create-5p7sb\" (UID: \"80696dba-d7cc-41e6-ba67-4299cdd675ff\") " pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.418143 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80696dba-d7cc-41e6-ba67-4299cdd675ff-operator-scripts\") pod \"cinder-db-create-5p7sb\" (UID: \"80696dba-d7cc-41e6-ba67-4299cdd675ff\") " pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.420762 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8302-account-create-update-gjj56"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.451795 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-c4gbj"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.453465 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.459631 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.459764 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c4gbj"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.459847 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-59frt" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.459902 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.459992 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.468649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np5fj\" (UniqueName: \"kubernetes.io/projected/80696dba-d7cc-41e6-ba67-4299cdd675ff-kube-api-access-np5fj\") pod \"cinder-db-create-5p7sb\" (UID: \"80696dba-d7cc-41e6-ba67-4299cdd675ff\") " pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.485214 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.492822 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-56s9g"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.494225 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.499868 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f977-account-create-update-n8dq9"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.501176 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.505543 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.519389 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679ad423-9540-41da-8fc6-02004a615642-operator-scripts\") pod \"barbican-db-create-dgk76\" (UID: \"679ad423-9540-41da-8fc6-02004a615642\") " pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.519470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-operator-scripts\") pod \"cinder-8302-account-create-update-gjj56\" (UID: \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\") " pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.519554 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x4vr\" (UniqueName: \"kubernetes.io/projected/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-kube-api-access-8x4vr\") pod \"cinder-8302-account-create-update-gjj56\" (UID: \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\") " pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.519615 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbb28\" (UniqueName: \"kubernetes.io/projected/679ad423-9540-41da-8fc6-02004a615642-kube-api-access-fbb28\") pod \"barbican-db-create-dgk76\" (UID: \"679ad423-9540-41da-8fc6-02004a615642\") " pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.519654 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33435064-417d-458a-ba36-b343066cfbed-operator-scripts\") pod \"barbican-6938-account-create-update-jnb4q\" (UID: \"33435064-417d-458a-ba36-b343066cfbed\") " pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.519831 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klkmh\" (UniqueName: \"kubernetes.io/projected/33435064-417d-458a-ba36-b343066cfbed-kube-api-access-klkmh\") pod \"barbican-6938-account-create-update-jnb4q\" (UID: \"33435064-417d-458a-ba36-b343066cfbed\") " pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.520564 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33435064-417d-458a-ba36-b343066cfbed-operator-scripts\") pod \"barbican-6938-account-create-update-jnb4q\" (UID: \"33435064-417d-458a-ba36-b343066cfbed\") " pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.529476 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-56s9g"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.534947 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f977-account-create-update-n8dq9"] Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.566886 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klkmh\" (UniqueName: \"kubernetes.io/projected/33435064-417d-458a-ba36-b343066cfbed-kube-api-access-klkmh\") pod \"barbican-6938-account-create-update-jnb4q\" (UID: \"33435064-417d-458a-ba36-b343066cfbed\") " pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:30 crc kubenswrapper[4771]: I0219 21:48:30.596574 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.620822 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-operator-scripts\") pod \"neutron-f977-account-create-update-n8dq9\" (UID: \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\") " pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.620915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679ad423-9540-41da-8fc6-02004a615642-operator-scripts\") pod \"barbican-db-create-dgk76\" (UID: \"679ad423-9540-41da-8fc6-02004a615642\") " pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.620935 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a50dde-d805-4a5a-8340-1d5b5ede00eb-operator-scripts\") pod \"neutron-db-create-56s9g\" (UID: \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\") " pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621011 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-config-data\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621050 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-operator-scripts\") pod \"cinder-8302-account-create-update-gjj56\" (UID: \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\") " pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621091 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94jz\" (UniqueName: \"kubernetes.io/projected/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-kube-api-access-k94jz\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276q8\" (UniqueName: \"kubernetes.io/projected/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-kube-api-access-276q8\") pod \"neutron-f977-account-create-update-n8dq9\" (UID: \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\") " pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621153 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-combined-ca-bundle\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621175 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8x4vr\" (UniqueName: \"kubernetes.io/projected/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-kube-api-access-8x4vr\") pod \"cinder-8302-account-create-update-gjj56\" (UID: \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\") " pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621206 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4zc\" (UniqueName: \"kubernetes.io/projected/11a50dde-d805-4a5a-8340-1d5b5ede00eb-kube-api-access-7b4zc\") pod \"neutron-db-create-56s9g\" (UID: \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\") " pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbb28\" (UniqueName: \"kubernetes.io/projected/679ad423-9540-41da-8fc6-02004a615642-kube-api-access-fbb28\") pod \"barbican-db-create-dgk76\" (UID: \"679ad423-9540-41da-8fc6-02004a615642\") " pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.621712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679ad423-9540-41da-8fc6-02004a615642-operator-scripts\") pod \"barbican-db-create-dgk76\" (UID: \"679ad423-9540-41da-8fc6-02004a615642\") " pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.622880 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-operator-scripts\") pod \"cinder-8302-account-create-update-gjj56\" (UID: \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\") " pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.643947 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8x4vr\" (UniqueName: \"kubernetes.io/projected/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-kube-api-access-8x4vr\") pod \"cinder-8302-account-create-update-gjj56\" (UID: \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\") " pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.645568 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbb28\" (UniqueName: \"kubernetes.io/projected/679ad423-9540-41da-8fc6-02004a615642-kube-api-access-fbb28\") pod \"barbican-db-create-dgk76\" (UID: \"679ad423-9540-41da-8fc6-02004a615642\") " pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.683329 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.708574 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.723084 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4zc\" (UniqueName: \"kubernetes.io/projected/11a50dde-d805-4a5a-8340-1d5b5ede00eb-kube-api-access-7b4zc\") pod \"neutron-db-create-56s9g\" (UID: \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\") " pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.723153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-operator-scripts\") pod \"neutron-f977-account-create-update-n8dq9\" (UID: \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\") " pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.723199 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a50dde-d805-4a5a-8340-1d5b5ede00eb-operator-scripts\") pod \"neutron-db-create-56s9g\" (UID: \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\") " pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.723240 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-config-data\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.723279 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94jz\" (UniqueName: \"kubernetes.io/projected/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-kube-api-access-k94jz\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.723302 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276q8\" (UniqueName: \"kubernetes.io/projected/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-kube-api-access-276q8\") pod \"neutron-f977-account-create-update-n8dq9\" (UID: \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\") " pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.723355 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-combined-ca-bundle\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.724964 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-operator-scripts\") pod \"neutron-f977-account-create-update-n8dq9\" (UID: \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\") " pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.725426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a50dde-d805-4a5a-8340-1d5b5ede00eb-operator-scripts\") pod \"neutron-db-create-56s9g\" (UID: \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\") " pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.726627 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-combined-ca-bundle\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.737929 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-config-data\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.742397 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276q8\" (UniqueName: \"kubernetes.io/projected/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-kube-api-access-276q8\") pod \"neutron-f977-account-create-update-n8dq9\" (UID: \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\") " pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.744895 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94jz\" (UniqueName: 
\"kubernetes.io/projected/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-kube-api-access-k94jz\") pod \"keystone-db-sync-c4gbj\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.747964 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4zc\" (UniqueName: \"kubernetes.io/projected/11a50dde-d805-4a5a-8340-1d5b5ede00eb-kube-api-access-7b4zc\") pod \"neutron-db-create-56s9g\" (UID: \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\") " pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.804140 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.824130 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:30.997601 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.367539 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8302-account-create-update-gjj56"] Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.386065 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6938-account-create-update-jnb4q"] Feb 19 21:48:31 crc kubenswrapper[4771]: W0219 21:48:31.394430 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2cc086f_16e1_41b6_b21b_a18c032ea3f3.slice/crio-c28337b62ad5885b45ea8a0a1b8a72c43339c97d7b3616150d758c2064a9a050 WatchSource:0}: Error finding container c28337b62ad5885b45ea8a0a1b8a72c43339c97d7b3616150d758c2064a9a050: Status 404 returned error can't find the container with id c28337b62ad5885b45ea8a0a1b8a72c43339c97d7b3616150d758c2064a9a050 Feb 19 21:48:31 crc kubenswrapper[4771]: W0219 21:48:31.401778 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80696dba_d7cc_41e6_ba67_4299cdd675ff.slice/crio-65d05d3eeb5092191d4688e21fb4ba17f9fd52fe0ec3526b8185c72dbe44d520 WatchSource:0}: Error finding container 65d05d3eeb5092191d4688e21fb4ba17f9fd52fe0ec3526b8185c72dbe44d520: Status 404 returned error can't find the container with id 65d05d3eeb5092191d4688e21fb4ba17f9fd52fe0ec3526b8185c72dbe44d520 Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.416962 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c4gbj"] Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.425830 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5p7sb"] Feb 19 21:48:31 crc kubenswrapper[4771]: W0219 21:48:31.429890 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod679ad423_9540_41da_8fc6_02004a615642.slice/crio-0dbe600cee00a8f72abf62ac03901db05d2634fa83a2c066e4f38cbb3dbb034e WatchSource:0}: Error finding container 0dbe600cee00a8f72abf62ac03901db05d2634fa83a2c066e4f38cbb3dbb034e: Status 404 returned error can't find the container with id 0dbe600cee00a8f72abf62ac03901db05d2634fa83a2c066e4f38cbb3dbb034e Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.432717 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dgk76"] Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.545411 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-56s9g"] Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.590323 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8302-account-create-update-gjj56" event={"ID":"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b","Type":"ContainerStarted","Data":"43a14b47eadf04d7189a4a09253a5f1068e9334b228120d809281717ccf3879d"} Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.597253 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f977-account-create-update-n8dq9"] Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.602902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dgk76" event={"ID":"679ad423-9540-41da-8fc6-02004a615642","Type":"ContainerStarted","Data":"0dbe600cee00a8f72abf62ac03901db05d2634fa83a2c066e4f38cbb3dbb034e"} Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.607058 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c4gbj" event={"ID":"e2cc086f-16e1-41b6-b21b-a18c032ea3f3","Type":"ContainerStarted","Data":"c28337b62ad5885b45ea8a0a1b8a72c43339c97d7b3616150d758c2064a9a050"} Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.609146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-6938-account-create-update-jnb4q" event={"ID":"33435064-417d-458a-ba36-b343066cfbed","Type":"ContainerStarted","Data":"f59c222b4032c5b8024f9b437756cf9bfb384ca41457fa59038ee9a0089a8b35"} Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.610380 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-56s9g" event={"ID":"11a50dde-d805-4a5a-8340-1d5b5ede00eb","Type":"ContainerStarted","Data":"0b1c04dc631f0a764a4b1aa71fa9e9f841a69ad18765a596f4b949dd4ca06686"} Feb 19 21:48:31 crc kubenswrapper[4771]: I0219 21:48:31.611606 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5p7sb" event={"ID":"80696dba-d7cc-41e6-ba67-4299cdd675ff","Type":"ContainerStarted","Data":"65d05d3eeb5092191d4688e21fb4ba17f9fd52fe0ec3526b8185c72dbe44d520"} Feb 19 21:48:31 crc kubenswrapper[4771]: W0219 21:48:31.612584 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd938b633_a2fc_4b3a_a2d1_6e080af4a3aa.slice/crio-b476f8374cfa3a1396e9f75843ec14caa359ec1b604220c553bfe7c98a6ec589 WatchSource:0}: Error finding container b476f8374cfa3a1396e9f75843ec14caa359ec1b604220c553bfe7c98a6ec589: Status 404 returned error can't find the container with id b476f8374cfa3a1396e9f75843ec14caa359ec1b604220c553bfe7c98a6ec589 Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.620540 4771 generic.go:334] "Generic (PLEG): container finished" podID="d938b633-a2fc-4b3a-a2d1-6e080af4a3aa" containerID="7e29e380a4d40a5bef1636c27f62822a8ea13ed037c70a4a10bf1a1c0caab247" exitCode=0 Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.620635 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f977-account-create-update-n8dq9" event={"ID":"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa","Type":"ContainerDied","Data":"7e29e380a4d40a5bef1636c27f62822a8ea13ed037c70a4a10bf1a1c0caab247"} Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 
21:48:32.620920 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f977-account-create-update-n8dq9" event={"ID":"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa","Type":"ContainerStarted","Data":"b476f8374cfa3a1396e9f75843ec14caa359ec1b604220c553bfe7c98a6ec589"} Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.623576 4771 generic.go:334] "Generic (PLEG): container finished" podID="11a50dde-d805-4a5a-8340-1d5b5ede00eb" containerID="26444679d7cec06fdc9c103258262da0f05d898c62474cf9b60ff923c1258255" exitCode=0 Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.623642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-56s9g" event={"ID":"11a50dde-d805-4a5a-8340-1d5b5ede00eb","Type":"ContainerDied","Data":"26444679d7cec06fdc9c103258262da0f05d898c62474cf9b60ff923c1258255"} Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.625220 4771 generic.go:334] "Generic (PLEG): container finished" podID="80696dba-d7cc-41e6-ba67-4299cdd675ff" containerID="c39227b1f1a133079941597205bef527ddb978dee2a720fe0152b89dacaddbf8" exitCode=0 Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.625292 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5p7sb" event={"ID":"80696dba-d7cc-41e6-ba67-4299cdd675ff","Type":"ContainerDied","Data":"c39227b1f1a133079941597205bef527ddb978dee2a720fe0152b89dacaddbf8"} Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.626682 4771 generic.go:334] "Generic (PLEG): container finished" podID="ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b" containerID="ea658d77089dbfc69308dbad9dd2e54d36caee337d295277c26f16fb2c64e832" exitCode=0 Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.626717 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8302-account-create-update-gjj56" event={"ID":"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b","Type":"ContainerDied","Data":"ea658d77089dbfc69308dbad9dd2e54d36caee337d295277c26f16fb2c64e832"} Feb 19 21:48:32 
crc kubenswrapper[4771]: I0219 21:48:32.628489 4771 generic.go:334] "Generic (PLEG): container finished" podID="679ad423-9540-41da-8fc6-02004a615642" containerID="1adfb4b311f6a1160416ef7fc218352891df01330cf223d05202cbfbf7986ce3" exitCode=0 Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.628566 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dgk76" event={"ID":"679ad423-9540-41da-8fc6-02004a615642","Type":"ContainerDied","Data":"1adfb4b311f6a1160416ef7fc218352891df01330cf223d05202cbfbf7986ce3"} Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.629971 4771 generic.go:334] "Generic (PLEG): container finished" podID="33435064-417d-458a-ba36-b343066cfbed" containerID="59e8690151de99e0e2f8e5b6aa921e4b3e247c672687ea7a8b2e61fa6b3f510b" exitCode=0 Feb 19 21:48:32 crc kubenswrapper[4771]: I0219 21:48:32.630007 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6938-account-create-update-jnb4q" event={"ID":"33435064-417d-458a-ba36-b343066cfbed","Type":"ContainerDied","Data":"59e8690151de99e0e2f8e5b6aa921e4b3e247c672687ea7a8b2e61fa6b3f510b"} Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.661157 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.668270 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.668674 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-56s9g" event={"ID":"11a50dde-d805-4a5a-8340-1d5b5ede00eb","Type":"ContainerDied","Data":"0b1c04dc631f0a764a4b1aa71fa9e9f841a69ad18765a596f4b949dd4ca06686"} Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.668722 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b1c04dc631f0a764a4b1aa71fa9e9f841a69ad18765a596f4b949dd4ca06686" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.671841 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5p7sb" event={"ID":"80696dba-d7cc-41e6-ba67-4299cdd675ff","Type":"ContainerDied","Data":"65d05d3eeb5092191d4688e21fb4ba17f9fd52fe0ec3526b8185c72dbe44d520"} Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.671864 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5p7sb" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.671901 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65d05d3eeb5092191d4688e21fb4ba17f9fd52fe0ec3526b8185c72dbe44d520" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.673156 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.677301 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8302-account-create-update-gjj56" event={"ID":"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b","Type":"ContainerDied","Data":"43a14b47eadf04d7189a4a09253a5f1068e9334b228120d809281717ccf3879d"} Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.677410 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43a14b47eadf04d7189a4a09253a5f1068e9334b228120d809281717ccf3879d" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.680273 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dgk76" event={"ID":"679ad423-9540-41da-8fc6-02004a615642","Type":"ContainerDied","Data":"0dbe600cee00a8f72abf62ac03901db05d2634fa83a2c066e4f38cbb3dbb034e"} Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.680324 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dbe600cee00a8f72abf62ac03901db05d2634fa83a2c066e4f38cbb3dbb034e" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.682381 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6938-account-create-update-jnb4q" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.682373 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6938-account-create-update-jnb4q" event={"ID":"33435064-417d-458a-ba36-b343066cfbed","Type":"ContainerDied","Data":"f59c222b4032c5b8024f9b437756cf9bfb384ca41457fa59038ee9a0089a8b35"} Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.682524 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f59c222b4032c5b8024f9b437756cf9bfb384ca41457fa59038ee9a0089a8b35" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.684281 4771 generic.go:334] "Generic (PLEG): container finished" podID="cb551a27-bdd4-433e-9d91-48ce01e7592c" containerID="4cd6b0ceee64a599a8e1a3fcddded6e00c86fe15dc3839c5337074fb5721bc3e" exitCode=0 Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.684355 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n8v7z" event={"ID":"cb551a27-bdd4-433e-9d91-48ce01e7592c","Type":"ContainerDied","Data":"4cd6b0ceee64a599a8e1a3fcddded6e00c86fe15dc3839c5337074fb5721bc3e"} Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.685800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f977-account-create-update-n8dq9" event={"ID":"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa","Type":"ContainerDied","Data":"b476f8374cfa3a1396e9f75843ec14caa359ec1b604220c553bfe7c98a6ec589"} Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.685832 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b476f8374cfa3a1396e9f75843ec14caa359ec1b604220c553bfe7c98a6ec589" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.739983 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.779824 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.784936 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824285 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276q8\" (UniqueName: \"kubernetes.io/projected/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-kube-api-access-276q8\") pod \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\" (UID: \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824335 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np5fj\" (UniqueName: \"kubernetes.io/projected/80696dba-d7cc-41e6-ba67-4299cdd675ff-kube-api-access-np5fj\") pod \"80696dba-d7cc-41e6-ba67-4299cdd675ff\" (UID: \"80696dba-d7cc-41e6-ba67-4299cdd675ff\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824360 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-operator-scripts\") pod \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\" (UID: \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824386 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbb28\" (UniqueName: \"kubernetes.io/projected/679ad423-9540-41da-8fc6-02004a615642-kube-api-access-fbb28\") pod \"679ad423-9540-41da-8fc6-02004a615642\" (UID: \"679ad423-9540-41da-8fc6-02004a615642\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824424 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679ad423-9540-41da-8fc6-02004a615642-operator-scripts\") pod \"679ad423-9540-41da-8fc6-02004a615642\" (UID: \"679ad423-9540-41da-8fc6-02004a615642\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824440 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x4vr\" (UniqueName: \"kubernetes.io/projected/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-kube-api-access-8x4vr\") pod \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\" (UID: \"ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824458 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33435064-417d-458a-ba36-b343066cfbed-operator-scripts\") pod \"33435064-417d-458a-ba36-b343066cfbed\" (UID: \"33435064-417d-458a-ba36-b343066cfbed\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b4zc\" (UniqueName: \"kubernetes.io/projected/11a50dde-d805-4a5a-8340-1d5b5ede00eb-kube-api-access-7b4zc\") pod \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\" (UID: \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824517 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klkmh\" (UniqueName: \"kubernetes.io/projected/33435064-417d-458a-ba36-b343066cfbed-kube-api-access-klkmh\") pod \"33435064-417d-458a-ba36-b343066cfbed\" (UID: \"33435064-417d-458a-ba36-b343066cfbed\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824537 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/11a50dde-d805-4a5a-8340-1d5b5ede00eb-operator-scripts\") pod \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\" (UID: \"11a50dde-d805-4a5a-8340-1d5b5ede00eb\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824554 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80696dba-d7cc-41e6-ba67-4299cdd675ff-operator-scripts\") pod \"80696dba-d7cc-41e6-ba67-4299cdd675ff\" (UID: \"80696dba-d7cc-41e6-ba67-4299cdd675ff\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.824577 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-operator-scripts\") pod \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\" (UID: \"d938b633-a2fc-4b3a-a2d1-6e080af4a3aa\") " Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.825627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d938b633-a2fc-4b3a-a2d1-6e080af4a3aa" (UID: "d938b633-a2fc-4b3a-a2d1-6e080af4a3aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.826476 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11a50dde-d805-4a5a-8340-1d5b5ede00eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11a50dde-d805-4a5a-8340-1d5b5ede00eb" (UID: "11a50dde-d805-4a5a-8340-1d5b5ede00eb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.826794 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33435064-417d-458a-ba36-b343066cfbed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33435064-417d-458a-ba36-b343066cfbed" (UID: "33435064-417d-458a-ba36-b343066cfbed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.826852 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80696dba-d7cc-41e6-ba67-4299cdd675ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80696dba-d7cc-41e6-ba67-4299cdd675ff" (UID: "80696dba-d7cc-41e6-ba67-4299cdd675ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.827297 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b" (UID: "ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.832131 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a50dde-d805-4a5a-8340-1d5b5ede00eb-kube-api-access-7b4zc" (OuterVolumeSpecName: "kube-api-access-7b4zc") pod "11a50dde-d805-4a5a-8340-1d5b5ede00eb" (UID: "11a50dde-d805-4a5a-8340-1d5b5ede00eb"). InnerVolumeSpecName "kube-api-access-7b4zc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.832355 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-kube-api-access-8x4vr" (OuterVolumeSpecName: "kube-api-access-8x4vr") pod "ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b" (UID: "ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b"). InnerVolumeSpecName "kube-api-access-8x4vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.832196 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-kube-api-access-276q8" (OuterVolumeSpecName: "kube-api-access-276q8") pod "d938b633-a2fc-4b3a-a2d1-6e080af4a3aa" (UID: "d938b633-a2fc-4b3a-a2d1-6e080af4a3aa"). InnerVolumeSpecName "kube-api-access-276q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.832587 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/679ad423-9540-41da-8fc6-02004a615642-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "679ad423-9540-41da-8fc6-02004a615642" (UID: "679ad423-9540-41da-8fc6-02004a615642"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.832666 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80696dba-d7cc-41e6-ba67-4299cdd675ff-kube-api-access-np5fj" (OuterVolumeSpecName: "kube-api-access-np5fj") pod "80696dba-d7cc-41e6-ba67-4299cdd675ff" (UID: "80696dba-d7cc-41e6-ba67-4299cdd675ff"). InnerVolumeSpecName "kube-api-access-np5fj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.835618 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33435064-417d-458a-ba36-b343066cfbed-kube-api-access-klkmh" (OuterVolumeSpecName: "kube-api-access-klkmh") pod "33435064-417d-458a-ba36-b343066cfbed" (UID: "33435064-417d-458a-ba36-b343066cfbed"). InnerVolumeSpecName "kube-api-access-klkmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.837036 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/679ad423-9540-41da-8fc6-02004a615642-kube-api-access-fbb28" (OuterVolumeSpecName: "kube-api-access-fbb28") pod "679ad423-9540-41da-8fc6-02004a615642" (UID: "679ad423-9540-41da-8fc6-02004a615642"). InnerVolumeSpecName "kube-api-access-fbb28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926540 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926585 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276q8\" (UniqueName: \"kubernetes.io/projected/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa-kube-api-access-276q8\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926603 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np5fj\" (UniqueName: \"kubernetes.io/projected/80696dba-d7cc-41e6-ba67-4299cdd675ff-kube-api-access-np5fj\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926615 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926629 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbb28\" (UniqueName: \"kubernetes.io/projected/679ad423-9540-41da-8fc6-02004a615642-kube-api-access-fbb28\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926639 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679ad423-9540-41da-8fc6-02004a615642-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926650 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x4vr\" (UniqueName: \"kubernetes.io/projected/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b-kube-api-access-8x4vr\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926662 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33435064-417d-458a-ba36-b343066cfbed-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926674 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b4zc\" (UniqueName: \"kubernetes.io/projected/11a50dde-d805-4a5a-8340-1d5b5ede00eb-kube-api-access-7b4zc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926685 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klkmh\" (UniqueName: \"kubernetes.io/projected/33435064-417d-458a-ba36-b343066cfbed-kube-api-access-klkmh\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926697 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/11a50dde-d805-4a5a-8340-1d5b5ede00eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:35 crc kubenswrapper[4771]: I0219 21:48:35.926707 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80696dba-d7cc-41e6-ba67-4299cdd675ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.174949 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.269001 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-blfh9"] Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.269337 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" podUID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" containerName="dnsmasq-dns" containerID="cri-o://eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c" gracePeriod=10 Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.678336 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.702604 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c4gbj" event={"ID":"e2cc086f-16e1-41b6-b21b-a18c032ea3f3","Type":"ContainerStarted","Data":"b2899341ffff7f0d7884e652ee1901e056f8cf1f617ef5cdc34eee16e4731b80"} Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.708646 4771 generic.go:334] "Generic (PLEG): container finished" podID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" containerID="eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c" exitCode=0 Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.708916 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.708961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" event={"ID":"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4","Type":"ContainerDied","Data":"eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c"} Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.708993 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-blfh9" event={"ID":"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4","Type":"ContainerDied","Data":"e75b986a9d31b7a35be02bb51214df630f334a7ea5ae70e8bff5436b97f5bc83"} Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.709031 4771 scope.go:117] "RemoveContainer" containerID="eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.709229 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dgk76" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.709365 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8302-account-create-update-gjj56" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.709395 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f977-account-create-update-n8dq9" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.709418 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-56s9g" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.753448 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-c4gbj" podStartSLOduration=2.6399581210000003 podStartE2EDuration="6.753430914s" podCreationTimestamp="2026-02-19 21:48:30 +0000 UTC" firstStartedPulling="2026-02-19 21:48:31.408496064 +0000 UTC m=+1211.679938534" lastFinishedPulling="2026-02-19 21:48:35.521968857 +0000 UTC m=+1215.793411327" observedRunningTime="2026-02-19 21:48:36.733196945 +0000 UTC m=+1217.004639415" watchObservedRunningTime="2026-02-19 21:48:36.753430914 +0000 UTC m=+1217.024873384" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.760726 4771 scope.go:117] "RemoveContainer" containerID="44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.783212 4771 scope.go:117] "RemoveContainer" containerID="eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c" Feb 19 21:48:36 crc kubenswrapper[4771]: E0219 21:48:36.783719 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c\": container with ID starting with eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c not found: ID does not exist" containerID="eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.783768 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c"} err="failed to get container status \"eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c\": rpc error: code = NotFound desc = could not find container \"eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c\": container with ID 
starting with eaa410b8c63cf8ceb3848fba4a5bbd3ac302ef97d85b24d15809d5a5a432e30c not found: ID does not exist" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.783796 4771 scope.go:117] "RemoveContainer" containerID="44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f" Feb 19 21:48:36 crc kubenswrapper[4771]: E0219 21:48:36.784153 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f\": container with ID starting with 44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f not found: ID does not exist" containerID="44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.784193 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f"} err="failed to get container status \"44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f\": rpc error: code = NotFound desc = could not find container \"44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f\": container with ID starting with 44c2d91e1b673c1593ceb0e934937f23538a141946859e76a2e28a4f0e90db4f not found: ID does not exist" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.847800 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-config\") pod \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.847926 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-sb\") pod \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\" (UID: 
\"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.847963 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zbdj\" (UniqueName: \"kubernetes.io/projected/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-kube-api-access-5zbdj\") pod \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.847988 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-dns-svc\") pod \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.848091 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-nb\") pod \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\" (UID: \"a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4\") " Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.856785 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-kube-api-access-5zbdj" (OuterVolumeSpecName: "kube-api-access-5zbdj") pod "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" (UID: "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4"). InnerVolumeSpecName "kube-api-access-5zbdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.885931 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" (UID: "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.887631 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-config" (OuterVolumeSpecName: "config") pod "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" (UID: "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.890763 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" (UID: "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.940068 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" (UID: "a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.952490 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.952540 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.952561 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zbdj\" (UniqueName: \"kubernetes.io/projected/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-kube-api-access-5zbdj\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.952574 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:36 crc kubenswrapper[4771]: I0219 21:48:36.952586 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.068341 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-blfh9"] Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.076783 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-blfh9"] Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.126614 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n8v7z" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.261958 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-db-sync-config-data\") pod \"cb551a27-bdd4-433e-9d91-48ce01e7592c\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.262396 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-combined-ca-bundle\") pod \"cb551a27-bdd4-433e-9d91-48ce01e7592c\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.262581 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-config-data\") pod \"cb551a27-bdd4-433e-9d91-48ce01e7592c\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.262654 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wmv\" (UniqueName: \"kubernetes.io/projected/cb551a27-bdd4-433e-9d91-48ce01e7592c-kube-api-access-t2wmv\") pod \"cb551a27-bdd4-433e-9d91-48ce01e7592c\" (UID: \"cb551a27-bdd4-433e-9d91-48ce01e7592c\") " Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.270529 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cb551a27-bdd4-433e-9d91-48ce01e7592c" (UID: "cb551a27-bdd4-433e-9d91-48ce01e7592c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.270910 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb551a27-bdd4-433e-9d91-48ce01e7592c-kube-api-access-t2wmv" (OuterVolumeSpecName: "kube-api-access-t2wmv") pod "cb551a27-bdd4-433e-9d91-48ce01e7592c" (UID: "cb551a27-bdd4-433e-9d91-48ce01e7592c"). InnerVolumeSpecName "kube-api-access-t2wmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.300093 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb551a27-bdd4-433e-9d91-48ce01e7592c" (UID: "cb551a27-bdd4-433e-9d91-48ce01e7592c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.344356 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-config-data" (OuterVolumeSpecName: "config-data") pod "cb551a27-bdd4-433e-9d91-48ce01e7592c" (UID: "cb551a27-bdd4-433e-9d91-48ce01e7592c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.365059 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wmv\" (UniqueName: \"kubernetes.io/projected/cb551a27-bdd4-433e-9d91-48ce01e7592c-kube-api-access-t2wmv\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.365193 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.365268 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.365282 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb551a27-bdd4-433e-9d91-48ce01e7592c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.724314 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n8v7z" event={"ID":"cb551a27-bdd4-433e-9d91-48ce01e7592c","Type":"ContainerDied","Data":"b1f0030ec85f235f4da9dd2fec8c651dc4d85b79573fe9a0aa9aa9de12f2341c"} Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.724358 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1f0030ec85f235f4da9dd2fec8c651dc4d85b79573fe9a0aa9aa9de12f2341c" Feb 19 21:48:37 crc kubenswrapper[4771]: I0219 21:48:37.724327 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n8v7z" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237270 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-s9xks"] Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237560 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a50dde-d805-4a5a-8340-1d5b5ede00eb" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237576 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a50dde-d805-4a5a-8340-1d5b5ede00eb" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237589 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80696dba-d7cc-41e6-ba67-4299cdd675ff" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237595 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="80696dba-d7cc-41e6-ba67-4299cdd675ff" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237604 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237612 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237620 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33435064-417d-458a-ba36-b343066cfbed" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237626 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="33435064-417d-458a-ba36-b343066cfbed" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237637 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" containerName="dnsmasq-dns" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237643 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" containerName="dnsmasq-dns" Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237653 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679ad423-9540-41da-8fc6-02004a615642" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237659 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="679ad423-9540-41da-8fc6-02004a615642" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237673 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" containerName="init" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237679 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" containerName="init" Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237690 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d938b633-a2fc-4b3a-a2d1-6e080af4a3aa" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237697 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d938b633-a2fc-4b3a-a2d1-6e080af4a3aa" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: E0219 21:48:38.237707 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb551a27-bdd4-433e-9d91-48ce01e7592c" containerName="glance-db-sync" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237713 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb551a27-bdd4-433e-9d91-48ce01e7592c" containerName="glance-db-sync" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237845 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cb551a27-bdd4-433e-9d91-48ce01e7592c" containerName="glance-db-sync" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237854 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" containerName="dnsmasq-dns" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237864 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237870 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a50dde-d805-4a5a-8340-1d5b5ede00eb" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237881 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="80696dba-d7cc-41e6-ba67-4299cdd675ff" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237889 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d938b633-a2fc-4b3a-a2d1-6e080af4a3aa" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237896 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="33435064-417d-458a-ba36-b343066cfbed" containerName="mariadb-account-create-update" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.237907 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="679ad423-9540-41da-8fc6-02004a615642" containerName="mariadb-database-create" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.238697 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.258359 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-s9xks"] Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.383543 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-config\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.384079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.384288 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.384380 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.384474 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-nb\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.384579 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qng5k\" (UniqueName: \"kubernetes.io/projected/00f00e62-5475-4280-9c01-534d495a3983-kube-api-access-qng5k\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.454359 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4" path="/var/lib/kubelet/pods/a75e68ff-0c0d-46f2-b2e3-46bb1ee296e4/volumes" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.485551 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.485699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.485732 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: 
\"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.485763 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-nb\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.485814 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qng5k\" (UniqueName: \"kubernetes.io/projected/00f00e62-5475-4280-9c01-534d495a3983-kube-api-access-qng5k\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.485846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-config\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.486783 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.486828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-config\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" 
Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.486838 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.487703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.487737 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-nb\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.504196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qng5k\" (UniqueName: \"kubernetes.io/projected/00f00e62-5475-4280-9c01-534d495a3983-kube-api-access-qng5k\") pod \"dnsmasq-dns-96fb4d4c9-s9xks\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") " pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:38 crc kubenswrapper[4771]: I0219 21:48:38.563363 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:39 crc kubenswrapper[4771]: I0219 21:48:39.050242 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-s9xks"] Feb 19 21:48:39 crc kubenswrapper[4771]: I0219 21:48:39.745291 4771 generic.go:334] "Generic (PLEG): container finished" podID="00f00e62-5475-4280-9c01-534d495a3983" containerID="90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e" exitCode=0 Feb 19 21:48:39 crc kubenswrapper[4771]: I0219 21:48:39.745363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" event={"ID":"00f00e62-5475-4280-9c01-534d495a3983","Type":"ContainerDied","Data":"90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e"} Feb 19 21:48:39 crc kubenswrapper[4771]: I0219 21:48:39.745679 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" event={"ID":"00f00e62-5475-4280-9c01-534d495a3983","Type":"ContainerStarted","Data":"232511501df850c0959365fb9b377467200302958c77f3b5f91813d9d2cdfbe4"} Feb 19 21:48:39 crc kubenswrapper[4771]: I0219 21:48:39.747360 4771 generic.go:334] "Generic (PLEG): container finished" podID="e2cc086f-16e1-41b6-b21b-a18c032ea3f3" containerID="b2899341ffff7f0d7884e652ee1901e056f8cf1f617ef5cdc34eee16e4731b80" exitCode=0 Feb 19 21:48:39 crc kubenswrapper[4771]: I0219 21:48:39.747411 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c4gbj" event={"ID":"e2cc086f-16e1-41b6-b21b-a18c032ea3f3","Type":"ContainerDied","Data":"b2899341ffff7f0d7884e652ee1901e056f8cf1f617ef5cdc34eee16e4731b80"} Feb 19 21:48:40 crc kubenswrapper[4771]: I0219 21:48:40.776749 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" event={"ID":"00f00e62-5475-4280-9c01-534d495a3983","Type":"ContainerStarted","Data":"7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69"} Feb 19 
21:48:40 crc kubenswrapper[4771]: I0219 21:48:40.813385 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" podStartSLOduration=2.813361412 podStartE2EDuration="2.813361412s" podCreationTimestamp="2026-02-19 21:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:40.805273137 +0000 UTC m=+1221.076715617" watchObservedRunningTime="2026-02-19 21:48:40.813361412 +0000 UTC m=+1221.084803892" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.120981 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.238734 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-combined-ca-bundle\") pod \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.238807 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-config-data\") pod \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.238862 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k94jz\" (UniqueName: \"kubernetes.io/projected/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-kube-api-access-k94jz\") pod \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\" (UID: \"e2cc086f-16e1-41b6-b21b-a18c032ea3f3\") " Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.246297 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-kube-api-access-k94jz" (OuterVolumeSpecName: "kube-api-access-k94jz") pod "e2cc086f-16e1-41b6-b21b-a18c032ea3f3" (UID: "e2cc086f-16e1-41b6-b21b-a18c032ea3f3"). InnerVolumeSpecName "kube-api-access-k94jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.273672 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2cc086f-16e1-41b6-b21b-a18c032ea3f3" (UID: "e2cc086f-16e1-41b6-b21b-a18c032ea3f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.293264 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-config-data" (OuterVolumeSpecName: "config-data") pod "e2cc086f-16e1-41b6-b21b-a18c032ea3f3" (UID: "e2cc086f-16e1-41b6-b21b-a18c032ea3f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.341415 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.341650 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.341751 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k94jz\" (UniqueName: \"kubernetes.io/projected/e2cc086f-16e1-41b6-b21b-a18c032ea3f3-kube-api-access-k94jz\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.797605 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c4gbj" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.798251 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c4gbj" event={"ID":"e2cc086f-16e1-41b6-b21b-a18c032ea3f3","Type":"ContainerDied","Data":"c28337b62ad5885b45ea8a0a1b8a72c43339c97d7b3616150d758c2064a9a050"} Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.798294 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c28337b62ad5885b45ea8a0a1b8a72c43339c97d7b3616150d758c2064a9a050" Feb 19 21:48:41 crc kubenswrapper[4771]: I0219 21:48:41.798314 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.042472 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-s9xks"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.059664 4771 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k2672"] Feb 19 21:48:42 crc kubenswrapper[4771]: E0219 21:48:42.060055 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cc086f-16e1-41b6-b21b-a18c032ea3f3" containerName="keystone-db-sync" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.060070 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cc086f-16e1-41b6-b21b-a18c032ea3f3" containerName="keystone-db-sync" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.060230 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cc086f-16e1-41b6-b21b-a18c032ea3f3" containerName="keystone-db-sync" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.060796 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.064711 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.064760 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-59frt" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.064996 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.065207 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.065261 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.069393 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-pjqmb"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.070656 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.084207 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k2672"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.097805 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-pjqmb"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153248 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-scripts\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153300 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153325 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-credential-keys\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-fernet-keys\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc 
kubenswrapper[4771]: I0219 21:48:42.153396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-config-data\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153414 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-config\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153432 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153482 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czs55\" (UniqueName: \"kubernetes.io/projected/cbc86282-634c-43a2-8b33-65e687ce3975-kube-api-access-czs55\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc 
kubenswrapper[4771]: I0219 21:48:42.153502 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxzbr\" (UniqueName: \"kubernetes.io/projected/737b7348-6bb1-4a77-b63e-c9cf183804fa-kube-api-access-mxzbr\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153533 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-combined-ca-bundle\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.153560 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.208991 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.210745 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.215279 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.215718 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.219112 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.251361 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bvxqv"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.254981 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255050 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czs55\" (UniqueName: \"kubernetes.io/projected/cbc86282-634c-43a2-8b33-65e687ce3975-kube-api-access-czs55\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255076 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxzbr\" (UniqueName: \"kubernetes.io/projected/737b7348-6bb1-4a77-b63e-c9cf183804fa-kube-api-access-mxzbr\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-combined-ca-bundle\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255126 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-scripts\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255207 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-credential-keys\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255252 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-fernet-keys\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255271 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-config-data\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255285 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-config\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.255305 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.256908 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.257463 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: 
\"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.262769 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-combined-ca-bundle\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.262920 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-credential-keys\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.263239 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.263329 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-config\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.264925 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc 
kubenswrapper[4771]: I0219 21:48:42.267429 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-scripts\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.268380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-fernet-keys\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.268641 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-config-data\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.270983 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.274049 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.274380 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-58l5w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.274836 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bvxqv"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.275877 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.302892 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czs55\" (UniqueName: \"kubernetes.io/projected/cbc86282-634c-43a2-8b33-65e687ce3975-kube-api-access-czs55\") pod \"keystone-bootstrap-k2672\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.318833 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxzbr\" (UniqueName: \"kubernetes.io/projected/737b7348-6bb1-4a77-b63e-c9cf183804fa-kube-api-access-mxzbr\") pod \"dnsmasq-dns-c4fdd6b7-pjqmb\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") " pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.355505 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-hs22q"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356301 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-config\") pod \"neutron-db-sync-bvxqv\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " 
pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356365 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-run-httpd\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356429 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8q8\" (UniqueName: \"kubernetes.io/projected/4f7e5bf4-6d90-4348-bba6-0d062714aace-kube-api-access-7j8q8\") pod \"neutron-db-sync-bvxqv\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356481 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-config-data\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356497 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 
21:48:42.356517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-combined-ca-bundle\") pod \"neutron-db-sync-bvxqv\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356535 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-log-httpd\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356554 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twm2n\" (UniqueName: \"kubernetes.io/projected/63e1bdf2-42cd-4470-af16-8ebafd6580cf-kube-api-access-twm2n\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.356584 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-scripts\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.357183 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.363257 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-h5558" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.363935 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.364100 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.371961 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hs22q"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.374828 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.399189 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.460852 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-scripts\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.460898 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-run-httpd\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.460927 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.460949 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6263f86-8184-4c2a-b2b0-80cfecba212d-etc-machine-id\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.460992 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8q8\" (UniqueName: \"kubernetes.io/projected/4f7e5bf4-6d90-4348-bba6-0d062714aace-kube-api-access-7j8q8\") pod \"neutron-db-sync-bvxqv\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461037 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-config-data\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461057 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461082 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-config-data\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461102 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-combined-ca-bundle\") pod \"neutron-db-sync-bvxqv\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461123 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-log-httpd\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-db-sync-config-data\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461162 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twm2n\" (UniqueName: \"kubernetes.io/projected/63e1bdf2-42cd-4470-af16-8ebafd6580cf-kube-api-access-twm2n\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461190 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-combined-ca-bundle\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-scripts\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461250 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rrmc\" (UniqueName: \"kubernetes.io/projected/b6263f86-8184-4c2a-b2b0-80cfecba212d-kube-api-access-7rrmc\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.461315 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-config\") pod \"neutron-db-sync-bvxqv\" (UID: 
\"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.462460 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-run-httpd\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.462969 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-log-httpd\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.474000 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.483255 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-combined-ca-bundle\") pod \"neutron-db-sync-bvxqv\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.483492 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-config\") pod \"neutron-db-sync-bvxqv\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.485771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-config-data\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.486620 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-scripts\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.506743 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8q8\" (UniqueName: \"kubernetes.io/projected/4f7e5bf4-6d90-4348-bba6-0d062714aace-kube-api-access-7j8q8\") pod \"neutron-db-sync-bvxqv\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.511878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twm2n\" (UniqueName: \"kubernetes.io/projected/63e1bdf2-42cd-4470-af16-8ebafd6580cf-kube-api-access-twm2n\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.524799 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") " pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.525164 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.532168 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-pjqmb"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.532206 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-7kf98"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.533466 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.562532 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rrmc\" (UniqueName: \"kubernetes.io/projected/b6263f86-8184-4c2a-b2b0-80cfecba212d-kube-api-access-7rrmc\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.562600 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.562636 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.562675 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.562706 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-config\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.562750 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-scripts\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.562945 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6263f86-8184-4c2a-b2b0-80cfecba212d-etc-machine-id\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.563043 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.563175 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6263f86-8184-4c2a-b2b0-80cfecba212d-etc-machine-id\") pod 
\"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.563238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-config-data\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.563264 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-db-sync-config-data\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.563284 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjg8x\" (UniqueName: \"kubernetes.io/projected/10b1d546-2532-46e6-8545-d099ebad7fa6-kube-api-access-qjg8x\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.563316 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-combined-ca-bundle\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.566207 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-combined-ca-bundle\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " 
pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.566436 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-scripts\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.568121 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-db-sync-config-data\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.570740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-config-data\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.570766 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-7kf98"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.578867 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rrmc\" (UniqueName: \"kubernetes.io/projected/b6263f86-8184-4c2a-b2b0-80cfecba212d-kube-api-access-7rrmc\") pod \"cinder-db-sync-hs22q\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.595733 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5s24g"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.597121 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.599754 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tdhnr" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.602714 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.605708 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5s24g"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.617808 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4q67w"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.618939 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.622628 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.622877 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.622928 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-h2sgl" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.627520 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4q67w"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.664996 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-combined-ca-bundle\") pod \"barbican-db-sync-5s24g\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc 
kubenswrapper[4771]: I0219 21:48:42.665179 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665248 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665284 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcqdr\" (UniqueName: \"kubernetes.io/projected/d2fcae0a-de14-4853-83b4-7862f9fd5d53-kube-api-access-wcqdr\") pod \"barbican-db-sync-5s24g\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665331 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bpbs\" (UniqueName: \"kubernetes.io/projected/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-kube-api-access-9bpbs\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc 
kubenswrapper[4771]: I0219 21:48:42.665424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-config-data\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-config\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665509 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-logs\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665553 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-combined-ca-bundle\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665582 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-db-sync-config-data\") pod \"barbican-db-sync-5s24g\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665608 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665637 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-scripts\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.665680 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjg8x\" (UniqueName: \"kubernetes.io/projected/10b1d546-2532-46e6-8545-d099ebad7fa6-kube-api-access-qjg8x\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.667108 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.667240 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-config\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.667300 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.667532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.668165 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.686333 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjg8x\" (UniqueName: \"kubernetes.io/projected/10b1d546-2532-46e6-8545-d099ebad7fa6-kube-api-access-qjg8x\") pod \"dnsmasq-dns-69c85d5ff7-7kf98\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.718151 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.733431 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hs22q" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.766929 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-logs\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.766980 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-combined-ca-bundle\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.767006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-db-sync-config-data\") pod \"barbican-db-sync-5s24g\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.767047 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-scripts\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.767091 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-combined-ca-bundle\") pod \"barbican-db-sync-5s24g\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.767138 
4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcqdr\" (UniqueName: \"kubernetes.io/projected/d2fcae0a-de14-4853-83b4-7862f9fd5d53-kube-api-access-wcqdr\") pod \"barbican-db-sync-5s24g\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.767157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bpbs\" (UniqueName: \"kubernetes.io/projected/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-kube-api-access-9bpbs\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.767185 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-config-data\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.767374 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-logs\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.771060 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-combined-ca-bundle\") pod \"barbican-db-sync-5s24g\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.771424 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-scripts\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.771458 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-config-data\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.772115 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-db-sync-config-data\") pod \"barbican-db-sync-5s24g\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.775340 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-combined-ca-bundle\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.784749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bpbs\" (UniqueName: \"kubernetes.io/projected/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-kube-api-access-9bpbs\") pod \"placement-db-sync-4q67w\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.785348 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcqdr\" (UniqueName: \"kubernetes.io/projected/d2fcae0a-de14-4853-83b4-7862f9fd5d53-kube-api-access-wcqdr\") pod \"barbican-db-sync-5s24g\" (UID: 
\"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.877584 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.930766 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5s24g" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.942494 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k2672"] Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.944810 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4q67w" Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.956965 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:48:42 crc kubenswrapper[4771]: I0219 21:48:42.957212 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:48:42 crc kubenswrapper[4771]: W0219 21:48:42.971045 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbc86282_634c_43a2_8b33_65e687ce3975.slice/crio-b11c8bd318f4d0e4bbd5581e32186d496b523722c52f629bab62c1e410e0e6d7 WatchSource:0}: Error finding container b11c8bd318f4d0e4bbd5581e32186d496b523722c52f629bab62c1e410e0e6d7: Status 404 returned error can't find the container 
with id b11c8bd318f4d0e4bbd5581e32186d496b523722c52f629bab62c1e410e0e6d7 Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.086099 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:43 crc kubenswrapper[4771]: W0219 21:48:43.107814 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63e1bdf2_42cd_4470_af16_8ebafd6580cf.slice/crio-7df17d9f2567324f58e60e5311d82037c8e802eae974a6b66262bfa0fbf1a116 WatchSource:0}: Error finding container 7df17d9f2567324f58e60e5311d82037c8e802eae974a6b66262bfa0fbf1a116: Status 404 returned error can't find the container with id 7df17d9f2567324f58e60e5311d82037c8e802eae974a6b66262bfa0fbf1a116 Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.159406 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.160748 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.166103 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.166272 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6wk49" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.166281 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.166465 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.167611 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-pjqmb"] Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.178272 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.257124 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bvxqv"] Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.275069 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.276669 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.277818 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.277924 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.277976 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.278068 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.278128 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.278162 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.278223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znd67\" (UniqueName: \"kubernetes.io/projected/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-kube-api-access-znd67\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.278240 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.281967 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.284668 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.284862 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.313144 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-sync-hs22q"] Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386765 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386813 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-scripts\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386874 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386897 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf85m\" (UniqueName: \"kubernetes.io/projected/97fecc38-771f-4228-826d-5b895a4a7a58-kube-api-access-bf85m\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " 
pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386920 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-logs\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386937 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-logs\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386960 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.386980 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387075 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znd67\" (UniqueName: \"kubernetes.io/projected/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-kube-api-access-znd67\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387111 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387130 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387148 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387167 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-config-data\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387850 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.387981 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-logs\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.388185 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.397840 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.399945 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.400784 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.408633 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znd67\" (UniqueName: \"kubernetes.io/projected/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-kube-api-access-znd67\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.408943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.415900 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.448162 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5s24g"]
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.488421 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.488464 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.488524 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.488546 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.488562 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-config-data\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.488617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-scripts\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.488648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf85m\" (UniqueName: \"kubernetes.io/projected/97fecc38-771f-4228-826d-5b895a4a7a58-kube-api-access-bf85m\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.488670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-logs\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.489175 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-logs\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.491813 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.500108 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.500359 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.500705 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.509142 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.539428 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-scripts\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.540097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-config-data\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.542704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf85m\" (UniqueName: \"kubernetes.io/projected/97fecc38-771f-4228-826d-5b895a4a7a58-kube-api-access-bf85m\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.572222 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.583131 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.755452 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-7kf98"]
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.771895 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4q67w"]
Feb 19 21:48:43 crc kubenswrapper[4771]: W0219 21:48:43.793078 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2edbc87a_44f8_47ab_a23f_c9acd74e7e5a.slice/crio-e8f0efc763d4e43ecdb7d7654d02c08cf246896e3413a1cc950179a8008e395f WatchSource:0}: Error finding container e8f0efc763d4e43ecdb7d7654d02c08cf246896e3413a1cc950179a8008e395f: Status 404 returned error can't find the container with id e8f0efc763d4e43ecdb7d7654d02c08cf246896e3413a1cc950179a8008e395f
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.838171 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" event={"ID":"737b7348-6bb1-4a77-b63e-c9cf183804fa","Type":"ContainerStarted","Data":"46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.838227 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" event={"ID":"737b7348-6bb1-4a77-b63e-c9cf183804fa","Type":"ContainerStarted","Data":"fb5de1323824b507a16786949afaec7b9db94c3c6377e8b847fb3e2532a73b08"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.838737 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" podUID="737b7348-6bb1-4a77-b63e-c9cf183804fa" containerName="init" containerID="cri-o://46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425" gracePeriod=10
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.841632 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5s24g" event={"ID":"d2fcae0a-de14-4853-83b4-7862f9fd5d53","Type":"ContainerStarted","Data":"7309656d2ab74f897d1a37e16127c0a66b2960d7590b1c6a6c322e39115c9c8c"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.844951 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" event={"ID":"10b1d546-2532-46e6-8545-d099ebad7fa6","Type":"ContainerStarted","Data":"fe1f5c6fee3cfe9cbc4d2a696718e210ad9487c605709a38943f490353734a4a"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.864737 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bvxqv" event={"ID":"4f7e5bf4-6d90-4348-bba6-0d062714aace","Type":"ContainerStarted","Data":"6be2db0d0a7a15ac9e0f112592f3037222ad978893674b6b671e66899fbaf58b"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.864772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bvxqv" event={"ID":"4f7e5bf4-6d90-4348-bba6-0d062714aace","Type":"ContainerStarted","Data":"43328ce249cf3a245b6f92aa5cfdf3d51872deea67038c89aa47958ed431334f"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.895938 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2672" event={"ID":"cbc86282-634c-43a2-8b33-65e687ce3975","Type":"ContainerStarted","Data":"0c95bf2af364f9fea1a07bb5b1eeda40dc6f61250043572e43473c8b0808ba86"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.896001 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2672" event={"ID":"cbc86282-634c-43a2-8b33-65e687ce3975","Type":"ContainerStarted","Data":"b11c8bd318f4d0e4bbd5581e32186d496b523722c52f629bab62c1e410e0e6d7"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.908269 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bvxqv" podStartSLOduration=1.908250446 podStartE2EDuration="1.908250446s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:43.890852312 +0000 UTC m=+1224.162294782" watchObservedRunningTime="2026-02-19 21:48:43.908250446 +0000 UTC m=+1224.179692916"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.923452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerStarted","Data":"7df17d9f2567324f58e60e5311d82037c8e802eae974a6b66262bfa0fbf1a116"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.928543 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k2672" podStartSLOduration=1.928523505 podStartE2EDuration="1.928523505s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:43.922159046 +0000 UTC m=+1224.193601516" watchObservedRunningTime="2026-02-19 21:48:43.928523505 +0000 UTC m=+1224.199965975"
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.932824 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hs22q" event={"ID":"b6263f86-8184-4c2a-b2b0-80cfecba212d","Type":"ContainerStarted","Data":"ecc1871762ce523271016f55b80bf8a9672e7ada145177ee9e75b96cdd91bf69"}
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.934630 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" podUID="00f00e62-5475-4280-9c01-534d495a3983" containerName="dnsmasq-dns" containerID="cri-o://7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69" gracePeriod=10
Feb 19 21:48:43 crc kubenswrapper[4771]: I0219 21:48:43.934833 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4q67w" event={"ID":"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a","Type":"ContainerStarted","Data":"e8f0efc763d4e43ecdb7d7654d02c08cf246896e3413a1cc950179a8008e395f"}
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.035159 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.493675 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.605403 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb"
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.702431 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks"
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.728585 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-svc\") pod \"737b7348-6bb1-4a77-b63e-c9cf183804fa\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.728638 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-nb\") pod \"737b7348-6bb1-4a77-b63e-c9cf183804fa\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.728703 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-config\") pod \"737b7348-6bb1-4a77-b63e-c9cf183804fa\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.728731 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-swift-storage-0\") pod \"737b7348-6bb1-4a77-b63e-c9cf183804fa\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.728796 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxzbr\" (UniqueName: \"kubernetes.io/projected/737b7348-6bb1-4a77-b63e-c9cf183804fa-kube-api-access-mxzbr\") pod \"737b7348-6bb1-4a77-b63e-c9cf183804fa\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.728884 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-sb\") pod \"737b7348-6bb1-4a77-b63e-c9cf183804fa\" (UID: \"737b7348-6bb1-4a77-b63e-c9cf183804fa\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.747349 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737b7348-6bb1-4a77-b63e-c9cf183804fa-kube-api-access-mxzbr" (OuterVolumeSpecName: "kube-api-access-mxzbr") pod "737b7348-6bb1-4a77-b63e-c9cf183804fa" (UID: "737b7348-6bb1-4a77-b63e-c9cf183804fa"). InnerVolumeSpecName "kube-api-access-mxzbr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.754602 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-config" (OuterVolumeSpecName: "config") pod "737b7348-6bb1-4a77-b63e-c9cf183804fa" (UID: "737b7348-6bb1-4a77-b63e-c9cf183804fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.765384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "737b7348-6bb1-4a77-b63e-c9cf183804fa" (UID: "737b7348-6bb1-4a77-b63e-c9cf183804fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.766035 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "737b7348-6bb1-4a77-b63e-c9cf183804fa" (UID: "737b7348-6bb1-4a77-b63e-c9cf183804fa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.767380 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "737b7348-6bb1-4a77-b63e-c9cf183804fa" (UID: "737b7348-6bb1-4a77-b63e-c9cf183804fa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.804857 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "737b7348-6bb1-4a77-b63e-c9cf183804fa" (UID: "737b7348-6bb1-4a77-b63e-c9cf183804fa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.830368 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-config\") pod \"00f00e62-5475-4280-9c01-534d495a3983\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.830531 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-swift-storage-0\") pod \"00f00e62-5475-4280-9c01-534d495a3983\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.830566 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qng5k\" (UniqueName: \"kubernetes.io/projected/00f00e62-5475-4280-9c01-534d495a3983-kube-api-access-qng5k\") pod \"00f00e62-5475-4280-9c01-534d495a3983\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.830603 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-sb\") pod \"00f00e62-5475-4280-9c01-534d495a3983\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.830649 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-svc\") pod \"00f00e62-5475-4280-9c01-534d495a3983\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.830685 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-nb\") pod \"00f00e62-5475-4280-9c01-534d495a3983\" (UID: \"00f00e62-5475-4280-9c01-534d495a3983\") "
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.831079 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.831094 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.831104 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.831113 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.831122 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxzbr\" (UniqueName: \"kubernetes.io/projected/737b7348-6bb1-4a77-b63e-c9cf183804fa-kube-api-access-mxzbr\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.831131 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/737b7348-6bb1-4a77-b63e-c9cf183804fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.837407 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f00e62-5475-4280-9c01-534d495a3983-kube-api-access-qng5k" (OuterVolumeSpecName: "kube-api-access-qng5k") pod "00f00e62-5475-4280-9c01-534d495a3983" (UID: "00f00e62-5475-4280-9c01-534d495a3983"). InnerVolumeSpecName "kube-api-access-qng5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.933038 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qng5k\" (UniqueName: \"kubernetes.io/projected/00f00e62-5475-4280-9c01-534d495a3983-kube-api-access-qng5k\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.958362 4771 generic.go:334] "Generic (PLEG): container finished" podID="737b7348-6bb1-4a77-b63e-c9cf183804fa" containerID="46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425" exitCode=0
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.958442 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" event={"ID":"737b7348-6bb1-4a77-b63e-c9cf183804fa","Type":"ContainerDied","Data":"46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425"}
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.958475 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb" event={"ID":"737b7348-6bb1-4a77-b63e-c9cf183804fa","Type":"ContainerDied","Data":"fb5de1323824b507a16786949afaec7b9db94c3c6377e8b847fb3e2532a73b08"}
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.958495 4771 scope.go:117] "RemoveContainer" containerID="46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425"
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.958623 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-pjqmb"
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.983634 4771 generic.go:334] "Generic (PLEG): container finished" podID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerID="727eed6e83d1636a8a28fff6e5ec2f76ca424fe09d9c0d3d41227035efe0bd78" exitCode=0
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.984034 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" event={"ID":"10b1d546-2532-46e6-8545-d099ebad7fa6","Type":"ContainerDied","Data":"727eed6e83d1636a8a28fff6e5ec2f76ca424fe09d9c0d3d41227035efe0bd78"}
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.991221 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 21:48:44 crc kubenswrapper[4771]: I0219 21:48:44.995889 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"97fecc38-771f-4228-826d-5b895a4a7a58","Type":"ContainerStarted","Data":"6a3148af07ded68849317f94f337c9e097b36ecdf04a2a0ec39286a7cd21e001"}
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.019782 4771 generic.go:334] "Generic (PLEG): container finished" podID="00f00e62-5475-4280-9c01-534d495a3983" containerID="7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69" exitCode=0
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.019841 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" event={"ID":"00f00e62-5475-4280-9c01-534d495a3983","Type":"ContainerDied","Data":"7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69"}
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.019865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks" event={"ID":"00f00e62-5475-4280-9c01-534d495a3983","Type":"ContainerDied","Data":"232511501df850c0959365fb9b377467200302958c77f3b5f91813d9d2cdfbe4"}
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.019959 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-s9xks"
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.044071 4771 scope.go:117] "RemoveContainer" containerID="46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425"
Feb 19 21:48:45 crc kubenswrapper[4771]: E0219 21:48:45.057378 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425\": container with ID starting with 46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425 not found: ID does not exist" containerID="46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425"
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.057435 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425"} err="failed to get container status \"46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425\": rpc error: code = NotFound desc = could not find container \"46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425\": container with ID starting with 46e8ac618be63adcff9b0dfe80b2f71757ac23aa1423451a7d8303f2af2c6425 not found: ID does not exist"
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.057458 4771 scope.go:117] "RemoveContainer" containerID="7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69"
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.057520 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-pjqmb"]
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.061784 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00f00e62-5475-4280-9c01-534d495a3983" (UID: "00f00e62-5475-4280-9c01-534d495a3983"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.104488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85cdf25-8b10-4666-87bb-7e49d0d3e90b","Type":"ContainerStarted","Data":"7285a645ceec42ab400722834c54df071b1d9fc2654441298de5666fa3d365bc"}
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.104715 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-pjqmb"]
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.117847 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00f00e62-5475-4280-9c01-534d495a3983" (UID: "00f00e62-5475-4280-9c01-534d495a3983"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.127120 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.135294 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.135889 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-config" (OuterVolumeSpecName: "config") pod "00f00e62-5475-4280-9c01-534d495a3983" (UID: "00f00e62-5475-4280-9c01-534d495a3983"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.141884 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.141913 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.141923 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.169281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00f00e62-5475-4280-9c01-534d495a3983" (UID: "00f00e62-5475-4280-9c01-534d495a3983"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.169307 4771 scope.go:117] "RemoveContainer" containerID="90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.172717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00f00e62-5475-4280-9c01-534d495a3983" (UID: "00f00e62-5475-4280-9c01-534d495a3983"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.244058 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.244088 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f00e62-5475-4280-9c01-534d495a3983-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.285598 4771 scope.go:117] "RemoveContainer" containerID="7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69" Feb 19 21:48:45 crc kubenswrapper[4771]: E0219 21:48:45.289155 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69\": container with ID starting with 7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69 not found: ID does not exist" containerID="7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.289200 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69"} err="failed to get container status \"7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69\": rpc error: code = NotFound desc = could not find container \"7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69\": container with ID starting with 7303cdb3f95740256878981fea5b1f7fb55a271c3fe36b26b1de39255ce4fc69 not found: ID does not exist" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.289235 4771 scope.go:117] "RemoveContainer" containerID="90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e" Feb 
19 21:48:45 crc kubenswrapper[4771]: E0219 21:48:45.291334 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e\": container with ID starting with 90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e not found: ID does not exist" containerID="90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.291373 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e"} err="failed to get container status \"90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e\": rpc error: code = NotFound desc = could not find container \"90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e\": container with ID starting with 90b5dfe9137954c51a8b51d174ff114e9c86173242cf77a5630483f452b84e4e not found: ID does not exist" Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.476225 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-s9xks"] Feb 19 21:48:45 crc kubenswrapper[4771]: I0219 21:48:45.488856 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-s9xks"] Feb 19 21:48:46 crc kubenswrapper[4771]: I0219 21:48:46.137235 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85cdf25-8b10-4666-87bb-7e49d0d3e90b","Type":"ContainerStarted","Data":"8fd3da04281a4320f662af87a02c59a8ab05f4fc356edd308313a4a7e6ca7e0a"} Feb 19 21:48:46 crc kubenswrapper[4771]: I0219 21:48:46.140941 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" 
event={"ID":"10b1d546-2532-46e6-8545-d099ebad7fa6","Type":"ContainerStarted","Data":"c9ff0e923cb0473882d50d2aafb9f2d9a970d7819afc1bd02a8b34c672edebd1"} Feb 19 21:48:46 crc kubenswrapper[4771]: I0219 21:48:46.141792 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:46 crc kubenswrapper[4771]: I0219 21:48:46.143599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"97fecc38-771f-4228-826d-5b895a4a7a58","Type":"ContainerStarted","Data":"d7b8ee575a6584bffb8e07afdcb6ffeabd5c53da10f3b8985d76e9b2d2e55d2c"} Feb 19 21:48:46 crc kubenswrapper[4771]: I0219 21:48:46.172311 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" podStartSLOduration=4.172294915 podStartE2EDuration="4.172294915s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:46.166679585 +0000 UTC m=+1226.438122065" watchObservedRunningTime="2026-02-19 21:48:46.172294915 +0000 UTC m=+1226.443737385" Feb 19 21:48:46 crc kubenswrapper[4771]: I0219 21:48:46.448800 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f00e62-5475-4280-9c01-534d495a3983" path="/var/lib/kubelet/pods/00f00e62-5475-4280-9c01-534d495a3983/volumes" Feb 19 21:48:46 crc kubenswrapper[4771]: I0219 21:48:46.449516 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737b7348-6bb1-4a77-b63e-c9cf183804fa" path="/var/lib/kubelet/pods/737b7348-6bb1-4a77-b63e-c9cf183804fa/volumes" Feb 19 21:48:47 crc kubenswrapper[4771]: I0219 21:48:47.153228 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"f85cdf25-8b10-4666-87bb-7e49d0d3e90b","Type":"ContainerStarted","Data":"228b76555a6770f3d7087e4373ae4580718573a7e76c948cbcc090ee33c44c0c"} Feb 19 21:48:47 crc kubenswrapper[4771]: I0219 21:48:47.153706 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerName="glance-log" containerID="cri-o://8fd3da04281a4320f662af87a02c59a8ab05f4fc356edd308313a4a7e6ca7e0a" gracePeriod=30 Feb 19 21:48:47 crc kubenswrapper[4771]: I0219 21:48:47.154188 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerName="glance-httpd" containerID="cri-o://228b76555a6770f3d7087e4373ae4580718573a7e76c948cbcc090ee33c44c0c" gracePeriod=30 Feb 19 21:48:47 crc kubenswrapper[4771]: I0219 21:48:47.160168 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" containerName="glance-log" containerID="cri-o://d7b8ee575a6584bffb8e07afdcb6ffeabd5c53da10f3b8985d76e9b2d2e55d2c" gracePeriod=30 Feb 19 21:48:47 crc kubenswrapper[4771]: I0219 21:48:47.160242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"97fecc38-771f-4228-826d-5b895a4a7a58","Type":"ContainerStarted","Data":"1cd274bcaac41baafcfb00a8dbfbea768cd82c32f90130692891c9c5608514f3"} Feb 19 21:48:47 crc kubenswrapper[4771]: I0219 21:48:47.160301 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" containerName="glance-httpd" containerID="cri-o://1cd274bcaac41baafcfb00a8dbfbea768cd82c32f90130692891c9c5608514f3" gracePeriod=30 Feb 19 21:48:47 crc kubenswrapper[4771]: I0219 21:48:47.171949 4771 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.171932851 podStartE2EDuration="5.171932851s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:47.169210058 +0000 UTC m=+1227.440652548" watchObservedRunningTime="2026-02-19 21:48:47.171932851 +0000 UTC m=+1227.443375321" Feb 19 21:48:47 crc kubenswrapper[4771]: I0219 21:48:47.197948 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.197898081 podStartE2EDuration="5.197898081s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:47.18990898 +0000 UTC m=+1227.461351460" watchObservedRunningTime="2026-02-19 21:48:47.197898081 +0000 UTC m=+1227.469340561" Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.169552 4771 generic.go:334] "Generic (PLEG): container finished" podID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerID="228b76555a6770f3d7087e4373ae4580718573a7e76c948cbcc090ee33c44c0c" exitCode=0 Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.169811 4771 generic.go:334] "Generic (PLEG): container finished" podID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerID="8fd3da04281a4320f662af87a02c59a8ab05f4fc356edd308313a4a7e6ca7e0a" exitCode=143 Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.169635 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85cdf25-8b10-4666-87bb-7e49d0d3e90b","Type":"ContainerDied","Data":"228b76555a6770f3d7087e4373ae4580718573a7e76c948cbcc090ee33c44c0c"} Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.169885 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85cdf25-8b10-4666-87bb-7e49d0d3e90b","Type":"ContainerDied","Data":"8fd3da04281a4320f662af87a02c59a8ab05f4fc356edd308313a4a7e6ca7e0a"} Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.172629 4771 generic.go:334] "Generic (PLEG): container finished" podID="cbc86282-634c-43a2-8b33-65e687ce3975" containerID="0c95bf2af364f9fea1a07bb5b1eeda40dc6f61250043572e43473c8b0808ba86" exitCode=0 Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.172684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2672" event={"ID":"cbc86282-634c-43a2-8b33-65e687ce3975","Type":"ContainerDied","Data":"0c95bf2af364f9fea1a07bb5b1eeda40dc6f61250043572e43473c8b0808ba86"} Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.175150 4771 generic.go:334] "Generic (PLEG): container finished" podID="97fecc38-771f-4228-826d-5b895a4a7a58" containerID="1cd274bcaac41baafcfb00a8dbfbea768cd82c32f90130692891c9c5608514f3" exitCode=0 Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.175179 4771 generic.go:334] "Generic (PLEG): container finished" podID="97fecc38-771f-4228-826d-5b895a4a7a58" containerID="d7b8ee575a6584bffb8e07afdcb6ffeabd5c53da10f3b8985d76e9b2d2e55d2c" exitCode=143 Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.175200 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"97fecc38-771f-4228-826d-5b895a4a7a58","Type":"ContainerDied","Data":"1cd274bcaac41baafcfb00a8dbfbea768cd82c32f90130692891c9c5608514f3"} Feb 19 21:48:48 crc kubenswrapper[4771]: I0219 21:48:48.175225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"97fecc38-771f-4228-826d-5b895a4a7a58","Type":"ContainerDied","Data":"d7b8ee575a6584bffb8e07afdcb6ffeabd5c53da10f3b8985d76e9b2d2e55d2c"} Feb 19 21:48:52 crc kubenswrapper[4771]: I0219 21:48:52.879213 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:48:52 crc kubenswrapper[4771]: I0219 21:48:52.950934 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778944759-v7skj"] Feb 19 21:48:52 crc kubenswrapper[4771]: I0219 21:48:52.951183 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-778944759-v7skj" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerName="dnsmasq-dns" containerID="cri-o://bdea3e5cad7287c3601f8d85ab79ea379fb9829b7f8b529b698f94c74c14672d" gracePeriod=10 Feb 19 21:48:53 crc kubenswrapper[4771]: I0219 21:48:53.235782 4771 generic.go:334] "Generic (PLEG): container finished" podID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerID="bdea3e5cad7287c3601f8d85ab79ea379fb9829b7f8b529b698f94c74c14672d" exitCode=0 Feb 19 21:48:53 crc kubenswrapper[4771]: I0219 21:48:53.235827 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-v7skj" event={"ID":"4aca3142-bb0a-424d-95e6-5819e352d96d","Type":"ContainerDied","Data":"bdea3e5cad7287c3601f8d85ab79ea379fb9829b7f8b529b698f94c74c14672d"} Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.670632 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2672" Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.679055 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.684454 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.743930 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czs55\" (UniqueName: \"kubernetes.io/projected/cbc86282-634c-43a2-8b33-65e687ce3975-kube-api-access-czs55\") pod \"cbc86282-634c-43a2-8b33-65e687ce3975\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.743982 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-config-data\") pod \"97fecc38-771f-4228-826d-5b895a4a7a58\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744009 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-internal-tls-certs\") pod \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744384 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-scripts\") pod \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744451 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-combined-ca-bundle\") pod \"97fecc38-771f-4228-826d-5b895a4a7a58\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744476 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-public-tls-certs\") pod \"97fecc38-771f-4228-826d-5b895a4a7a58\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744496 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-config-data\") pod \"cbc86282-634c-43a2-8b33-65e687ce3975\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744534 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-credential-keys\") pod \"cbc86282-634c-43a2-8b33-65e687ce3975\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744555 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-logs\") pod \"97fecc38-771f-4228-826d-5b895a4a7a58\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744591 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-scripts\") pod \"97fecc38-771f-4228-826d-5b895a4a7a58\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744634 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-logs\") pod \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744689 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-scripts\") pod \"cbc86282-634c-43a2-8b33-65e687ce3975\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744715 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744730 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-combined-ca-bundle\") pod \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744753 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-config-data\") pod \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744770 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-httpd-run\") pod \"97fecc38-771f-4228-826d-5b895a4a7a58\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744802 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-fernet-keys\") pod \"cbc86282-634c-43a2-8b33-65e687ce3975\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " Feb 19 21:48:54 crc 
kubenswrapper[4771]: I0219 21:48:54.744823 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-combined-ca-bundle\") pod \"cbc86282-634c-43a2-8b33-65e687ce3975\" (UID: \"cbc86282-634c-43a2-8b33-65e687ce3975\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744843 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znd67\" (UniqueName: \"kubernetes.io/projected/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-kube-api-access-znd67\") pod \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744892 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-httpd-run\") pod \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\" (UID: \"f85cdf25-8b10-4666-87bb-7e49d0d3e90b\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744914 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"97fecc38-771f-4228-826d-5b895a4a7a58\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.744930 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf85m\" (UniqueName: \"kubernetes.io/projected/97fecc38-771f-4228-826d-5b895a4a7a58-kube-api-access-bf85m\") pod \"97fecc38-771f-4228-826d-5b895a4a7a58\" (UID: \"97fecc38-771f-4228-826d-5b895a4a7a58\") " Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.745944 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-logs" (OuterVolumeSpecName: "logs") pod 
"f85cdf25-8b10-4666-87bb-7e49d0d3e90b" (UID: "f85cdf25-8b10-4666-87bb-7e49d0d3e90b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.746319 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-logs" (OuterVolumeSpecName: "logs") pod "97fecc38-771f-4228-826d-5b895a4a7a58" (UID: "97fecc38-771f-4228-826d-5b895a4a7a58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.751538 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc86282-634c-43a2-8b33-65e687ce3975-kube-api-access-czs55" (OuterVolumeSpecName: "kube-api-access-czs55") pod "cbc86282-634c-43a2-8b33-65e687ce3975" (UID: "cbc86282-634c-43a2-8b33-65e687ce3975"). InnerVolumeSpecName "kube-api-access-czs55". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.751683 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cbc86282-634c-43a2-8b33-65e687ce3975" (UID: "cbc86282-634c-43a2-8b33-65e687ce3975"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.751890 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-scripts" (OuterVolumeSpecName: "scripts") pod "97fecc38-771f-4228-826d-5b895a4a7a58" (UID: "97fecc38-771f-4228-826d-5b895a4a7a58"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.752303 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "97fecc38-771f-4228-826d-5b895a4a7a58" (UID: "97fecc38-771f-4228-826d-5b895a4a7a58"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.752844 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f85cdf25-8b10-4666-87bb-7e49d0d3e90b" (UID: "f85cdf25-8b10-4666-87bb-7e49d0d3e90b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.756666 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-scripts" (OuterVolumeSpecName: "scripts") pod "f85cdf25-8b10-4666-87bb-7e49d0d3e90b" (UID: "f85cdf25-8b10-4666-87bb-7e49d0d3e90b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.759154 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "f85cdf25-8b10-4666-87bb-7e49d0d3e90b" (UID: "f85cdf25-8b10-4666-87bb-7e49d0d3e90b"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.759633 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-kube-api-access-znd67" (OuterVolumeSpecName: "kube-api-access-znd67") pod "f85cdf25-8b10-4666-87bb-7e49d0d3e90b" (UID: "f85cdf25-8b10-4666-87bb-7e49d0d3e90b"). InnerVolumeSpecName "kube-api-access-znd67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.759869 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97fecc38-771f-4228-826d-5b895a4a7a58-kube-api-access-bf85m" (OuterVolumeSpecName: "kube-api-access-bf85m") pod "97fecc38-771f-4228-826d-5b895a4a7a58" (UID: "97fecc38-771f-4228-826d-5b895a4a7a58"). InnerVolumeSpecName "kube-api-access-bf85m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.766063 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cbc86282-634c-43a2-8b33-65e687ce3975" (UID: "cbc86282-634c-43a2-8b33-65e687ce3975"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.766186 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "97fecc38-771f-4228-826d-5b895a4a7a58" (UID: "97fecc38-771f-4228-826d-5b895a4a7a58"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.788925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-scripts" (OuterVolumeSpecName: "scripts") pod "cbc86282-634c-43a2-8b33-65e687ce3975" (UID: "cbc86282-634c-43a2-8b33-65e687ce3975"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.802089 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbc86282-634c-43a2-8b33-65e687ce3975" (UID: "cbc86282-634c-43a2-8b33-65e687ce3975"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.815247 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-config-data" (OuterVolumeSpecName: "config-data") pod "cbc86282-634c-43a2-8b33-65e687ce3975" (UID: "cbc86282-634c-43a2-8b33-65e687ce3975"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.831245 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97fecc38-771f-4228-826d-5b895a4a7a58" (UID: "97fecc38-771f-4228-826d-5b895a4a7a58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849193 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849236 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849249 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf85m\" (UniqueName: \"kubernetes.io/projected/97fecc38-771f-4228-826d-5b895a4a7a58-kube-api-access-bf85m\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849262 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czs55\" (UniqueName: \"kubernetes.io/projected/cbc86282-634c-43a2-8b33-65e687ce3975-kube-api-access-czs55\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849273 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849284 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849295 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849307 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849318 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849328 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849338 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849443 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849466 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849477 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97fecc38-771f-4228-826d-5b895a4a7a58-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849490 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849503 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc86282-634c-43a2-8b33-65e687ce3975-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.849515 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znd67\" (UniqueName: \"kubernetes.io/projected/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-kube-api-access-znd67\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.854316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f85cdf25-8b10-4666-87bb-7e49d0d3e90b" (UID: "f85cdf25-8b10-4666-87bb-7e49d0d3e90b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.875000 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-config-data" (OuterVolumeSpecName: "config-data") pod "97fecc38-771f-4228-826d-5b895a4a7a58" (UID: "97fecc38-771f-4228-826d-5b895a4a7a58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.878339 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "97fecc38-771f-4228-826d-5b895a4a7a58" (UID: "97fecc38-771f-4228-826d-5b895a4a7a58"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.889832 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.891301 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-config-data" (OuterVolumeSpecName: "config-data") pod "f85cdf25-8b10-4666-87bb-7e49d0d3e90b" (UID: "f85cdf25-8b10-4666-87bb-7e49d0d3e90b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.895168 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.902177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f85cdf25-8b10-4666-87bb-7e49d0d3e90b" (UID: "f85cdf25-8b10-4666-87bb-7e49d0d3e90b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.951213 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.951242 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.951253 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fecc38-771f-4228-826d-5b895a4a7a58-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.951263 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.951275 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.951283 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85cdf25-8b10-4666-87bb-7e49d0d3e90b-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4771]: I0219 21:48:54.951293 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.254772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2672" event={"ID":"cbc86282-634c-43a2-8b33-65e687ce3975","Type":"ContainerDied","Data":"b11c8bd318f4d0e4bbd5581e32186d496b523722c52f629bab62c1e410e0e6d7"}
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.254820 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11c8bd318f4d0e4bbd5581e32186d496b523722c52f629bab62c1e410e0e6d7"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.254901 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2672"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.277802 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"97fecc38-771f-4228-826d-5b895a4a7a58","Type":"ContainerDied","Data":"6a3148af07ded68849317f94f337c9e097b36ecdf04a2a0ec39286a7cd21e001"}
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.277860 4771 scope.go:117] "RemoveContainer" containerID="1cd274bcaac41baafcfb00a8dbfbea768cd82c32f90130692891c9c5608514f3"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.278154 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.296959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85cdf25-8b10-4666-87bb-7e49d0d3e90b","Type":"ContainerDied","Data":"7285a645ceec42ab400722834c54df071b1d9fc2654441298de5666fa3d365bc"}
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.297081 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.330728 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.348259 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.374468 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.419201 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.427765 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 21:48:55 crc kubenswrapper[4771]: E0219 21:48:55.428448 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerName="glance-httpd"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.428562 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerName="glance-httpd"
Feb 19 21:48:55 crc kubenswrapper[4771]: E0219 21:48:55.428651 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" containerName="glance-httpd"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.428718 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" containerName="glance-httpd"
Feb 19 21:48:55 crc kubenswrapper[4771]: E0219 21:48:55.428808 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc86282-634c-43a2-8b33-65e687ce3975" containerName="keystone-bootstrap"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.428877 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc86282-634c-43a2-8b33-65e687ce3975" containerName="keystone-bootstrap"
Feb 19 21:48:55 crc kubenswrapper[4771]: E0219 21:48:55.428957 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737b7348-6bb1-4a77-b63e-c9cf183804fa" containerName="init"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.429062 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="737b7348-6bb1-4a77-b63e-c9cf183804fa" containerName="init"
Feb 19 21:48:55 crc kubenswrapper[4771]: E0219 21:48:55.429138 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerName="glance-log"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.429204 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerName="glance-log"
Feb 19 21:48:55 crc kubenswrapper[4771]: E0219 21:48:55.429293 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f00e62-5475-4280-9c01-534d495a3983" containerName="init"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.429361 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f00e62-5475-4280-9c01-534d495a3983" containerName="init"
Feb 19 21:48:55 crc kubenswrapper[4771]: E0219 21:48:55.429438 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f00e62-5475-4280-9c01-534d495a3983" containerName="dnsmasq-dns"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.429508 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f00e62-5475-4280-9c01-534d495a3983" containerName="dnsmasq-dns"
Feb 19 21:48:55 crc kubenswrapper[4771]: E0219 21:48:55.429578 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" containerName="glance-log"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.429649 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" containerName="glance-log"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.429926 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerName="glance-log"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.430005 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" containerName="glance-log"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.430112 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" containerName="glance-httpd"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.430186 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc86282-634c-43a2-8b33-65e687ce3975" containerName="keystone-bootstrap"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.430257 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" containerName="glance-httpd"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.430329 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="737b7348-6bb1-4a77-b63e-c9cf183804fa" containerName="init"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.430419 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f00e62-5475-4280-9c01-534d495a3983" containerName="dnsmasq-dns"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.431675 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.435362 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.435653 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6wk49"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.436151 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.436413 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.436608 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.447264 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.452169 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.456269 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.458132 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.460530 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.503873 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.503945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.503970 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.504011 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.504070 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.504107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42cmd\" (UniqueName: \"kubernetes.io/projected/20b982fd-3b6b-487a-9ec1-956360ee92d9-kube-api-access-42cmd\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.504229 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.504265 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-logs\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605371 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605755 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605806 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605828 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2j47\" (UniqueName: \"kubernetes.io/projected/01893f8d-263f-4c95-a1f9-864e9e655ee8-kube-api-access-j2j47\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605868 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605907 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-logs\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605924 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42cmd\" (UniqueName: \"kubernetes.io/projected/20b982fd-3b6b-487a-9ec1-956360ee92d9-kube-api-access-42cmd\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605958 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605978 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.605996 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.606042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-logs\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.606074 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.606153 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.606467 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-logs\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.606513 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.610917 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.611266 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.616126 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.616393 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.624191 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42cmd\" (UniqueName: \"kubernetes.io/projected/20b982fd-3b6b-487a-9ec1-956360ee92d9-kube-api-access-42cmd\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.646259 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708059 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2j47\" (UniqueName: \"kubernetes.io/projected/01893f8d-263f-4c95-a1f9-864e9e655ee8-kube-api-access-j2j47\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708193 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708207 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708226 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-logs\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708242 4771 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708262 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708272 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.708974 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-logs\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.712079 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.712447 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.720080 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.726288 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.729818 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2j47\" (UniqueName: \"kubernetes.io/projected/01893f8d-263f-4c95-a1f9-864e9e655ee8-kube-api-access-j2j47\") pod \"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.739561 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.763896 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.766185 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k2672"] Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.770615 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.773224 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k2672"] Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.856854 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-97gqw"] Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.857846 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.859033 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.859745 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.859781 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.860111 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-59frt" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.860444 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.864936 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-97gqw"] Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.910643 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-config-data\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.910679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-credential-keys\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.910707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-scripts\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.910727 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-fernet-keys\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.910795 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqzr6\" (UniqueName: \"kubernetes.io/projected/0f997b82-0501-49eb-820a-26d053e81b02-kube-api-access-dqzr6\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:55 crc kubenswrapper[4771]: I0219 21:48:55.910890 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-combined-ca-bundle\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.012810 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqzr6\" (UniqueName: \"kubernetes.io/projected/0f997b82-0501-49eb-820a-26d053e81b02-kube-api-access-dqzr6\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.012848 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-combined-ca-bundle\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.012921 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-config-data\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.012940 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-credential-keys\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.012962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-scripts\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.012982 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-fernet-keys\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.016280 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-credential-keys\") pod \"keystone-bootstrap-97gqw\" (UID: 
\"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.016413 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-fernet-keys\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.017280 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-config-data\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.017723 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-combined-ca-bundle\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.018052 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-scripts\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.031727 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqzr6\" (UniqueName: \"kubernetes.io/projected/0f997b82-0501-49eb-820a-26d053e81b02-kube-api-access-dqzr6\") pod \"keystone-bootstrap-97gqw\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.179608 
4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.446876 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97fecc38-771f-4228-826d-5b895a4a7a58" path="/var/lib/kubelet/pods/97fecc38-771f-4228-826d-5b895a4a7a58/volumes" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.447742 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc86282-634c-43a2-8b33-65e687ce3975" path="/var/lib/kubelet/pods/cbc86282-634c-43a2-8b33-65e687ce3975/volumes" Feb 19 21:48:56 crc kubenswrapper[4771]: I0219 21:48:56.448389 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85cdf25-8b10-4666-87bb-7e49d0d3e90b" path="/var/lib/kubelet/pods/f85cdf25-8b10-4666-87bb-7e49d0d3e90b/volumes" Feb 19 21:49:01 crc kubenswrapper[4771]: I0219 21:49:01.173959 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-778944759-v7skj" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.585683 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.702628 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-svc\") pod \"4aca3142-bb0a-424d-95e6-5819e352d96d\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.702723 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-config\") pod \"4aca3142-bb0a-424d-95e6-5819e352d96d\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.702802 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9x72\" (UniqueName: \"kubernetes.io/projected/4aca3142-bb0a-424d-95e6-5819e352d96d-kube-api-access-x9x72\") pod \"4aca3142-bb0a-424d-95e6-5819e352d96d\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.702844 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-swift-storage-0\") pod \"4aca3142-bb0a-424d-95e6-5819e352d96d\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.702931 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-sb\") pod \"4aca3142-bb0a-424d-95e6-5819e352d96d\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.703088 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-nb\") pod \"4aca3142-bb0a-424d-95e6-5819e352d96d\" (UID: \"4aca3142-bb0a-424d-95e6-5819e352d96d\") " Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.710311 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aca3142-bb0a-424d-95e6-5819e352d96d-kube-api-access-x9x72" (OuterVolumeSpecName: "kube-api-access-x9x72") pod "4aca3142-bb0a-424d-95e6-5819e352d96d" (UID: "4aca3142-bb0a-424d-95e6-5819e352d96d"). InnerVolumeSpecName "kube-api-access-x9x72". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.756781 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4aca3142-bb0a-424d-95e6-5819e352d96d" (UID: "4aca3142-bb0a-424d-95e6-5819e352d96d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.757383 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-config" (OuterVolumeSpecName: "config") pod "4aca3142-bb0a-424d-95e6-5819e352d96d" (UID: "4aca3142-bb0a-424d-95e6-5819e352d96d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.760235 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4aca3142-bb0a-424d-95e6-5819e352d96d" (UID: "4aca3142-bb0a-424d-95e6-5819e352d96d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.762358 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4aca3142-bb0a-424d-95e6-5819e352d96d" (UID: "4aca3142-bb0a-424d-95e6-5819e352d96d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.775359 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4aca3142-bb0a-424d-95e6-5819e352d96d" (UID: "4aca3142-bb0a-424d-95e6-5819e352d96d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.805106 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.805143 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.805156 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9x72\" (UniqueName: \"kubernetes.io/projected/4aca3142-bb0a-424d-95e6-5819e352d96d-kube-api-access-x9x72\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.805172 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-dns-swift-storage-0\") on node 
\"crc\" DevicePath \"\"" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.805184 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:04 crc kubenswrapper[4771]: I0219 21:49:04.805194 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aca3142-bb0a-424d-95e6-5819e352d96d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:05 crc kubenswrapper[4771]: I0219 21:49:05.429261 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-v7skj" event={"ID":"4aca3142-bb0a-424d-95e6-5819e352d96d","Type":"ContainerDied","Data":"56c110f682a30d113bf3505366a1ab2fcb9e6191b42a614396a88a5ff097514e"} Feb 19 21:49:05 crc kubenswrapper[4771]: I0219 21:49:05.429340 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778944759-v7skj" Feb 19 21:49:05 crc kubenswrapper[4771]: I0219 21:49:05.463437 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778944759-v7skj"] Feb 19 21:49:05 crc kubenswrapper[4771]: I0219 21:49:05.470134 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-778944759-v7skj"] Feb 19 21:49:05 crc kubenswrapper[4771]: I0219 21:49:05.868404 4771 scope.go:117] "RemoveContainer" containerID="d7b8ee575a6584bffb8e07afdcb6ffeabd5c53da10f3b8985d76e9b2d2e55d2c" Feb 19 21:49:05 crc kubenswrapper[4771]: E0219 21:49:05.871569 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 19 21:49:05 crc kubenswrapper[4771]: E0219 21:49:05.871744 4771 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rrmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityConte
xt:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-hs22q_openstack(b6263f86-8184-4c2a-b2b0-80cfecba212d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:49:05 crc kubenswrapper[4771]: E0219 21:49:05.873027 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-hs22q" podUID="b6263f86-8184-4c2a-b2b0-80cfecba212d" Feb 19 21:49:05 crc kubenswrapper[4771]: I0219 21:49:05.926643 4771 scope.go:117] "RemoveContainer" containerID="228b76555a6770f3d7087e4373ae4580718573a7e76c948cbcc090ee33c44c0c" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.141066 4771 scope.go:117] "RemoveContainer" containerID="8fd3da04281a4320f662af87a02c59a8ab05f4fc356edd308313a4a7e6ca7e0a" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.172257 4771 scope.go:117] "RemoveContainer" containerID="bdea3e5cad7287c3601f8d85ab79ea379fb9829b7f8b529b698f94c74c14672d" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.176365 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-778944759-v7skj" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.212152 4771 scope.go:117] "RemoveContainer" 
containerID="9c9311be0e9f613b3ce79a463316a523997b3a2aad17a9f59ce2814154b53a0c" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.415332 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-97gqw"] Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.461253 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" path="/var/lib/kubelet/pods/4aca3142-bb0a-424d-95e6-5819e352d96d/volumes" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.461694 4771 generic.go:334] "Generic (PLEG): container finished" podID="4f7e5bf4-6d90-4348-bba6-0d062714aace" containerID="6be2db0d0a7a15ac9e0f112592f3037222ad978893674b6b671e66899fbaf58b" exitCode=0 Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.462253 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5s24g" event={"ID":"d2fcae0a-de14-4853-83b4-7862f9fd5d53","Type":"ContainerStarted","Data":"bee7c802e137a339145463341df9445a9986d274b0dbab724570fb327145cf73"} Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.462291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bvxqv" event={"ID":"4f7e5bf4-6d90-4348-bba6-0d062714aace","Type":"ContainerDied","Data":"6be2db0d0a7a15ac9e0f112592f3037222ad978893674b6b671e66899fbaf58b"} Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.462938 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerStarted","Data":"c3cba4d268336b5047be73f17219d1e2dcda048724027bf6af3d39710442bf20"} Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.464993 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-97gqw" event={"ID":"0f997b82-0501-49eb-820a-26d053e81b02","Type":"ContainerStarted","Data":"7c5788d9efa4617282b956703869b529e13ee4fb679ac272f58ac09c197cdfff"} Feb 19 21:49:06 crc 
kubenswrapper[4771]: I0219 21:49:06.466242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4q67w" event={"ID":"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a","Type":"ContainerStarted","Data":"8762cac2f6ef78687a4af6d497fb280fa7b766f18eed1d002187a46fd63354ec"} Feb 19 21:49:06 crc kubenswrapper[4771]: E0219 21:49:06.472544 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-hs22q" podUID="b6263f86-8184-4c2a-b2b0-80cfecba212d" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.478931 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5s24g" podStartSLOduration=2.048258772 podStartE2EDuration="24.478908649s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="2026-02-19 21:48:43.460284193 +0000 UTC m=+1223.731726663" lastFinishedPulling="2026-02-19 21:49:05.89093406 +0000 UTC m=+1246.162376540" observedRunningTime="2026-02-19 21:49:06.475850898 +0000 UTC m=+1246.747293378" watchObservedRunningTime="2026-02-19 21:49:06.478908649 +0000 UTC m=+1246.750351149" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.533763 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.552374 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4q67w" podStartSLOduration=3.89181388 podStartE2EDuration="24.552359875s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="2026-02-19 21:48:43.796572713 +0000 UTC m=+1224.068015183" lastFinishedPulling="2026-02-19 21:49:04.457118668 +0000 UTC m=+1244.728561178" 
observedRunningTime="2026-02-19 21:49:06.546339304 +0000 UTC m=+1246.817781794" watchObservedRunningTime="2026-02-19 21:49:06.552359875 +0000 UTC m=+1246.823802345" Feb 19 21:49:06 crc kubenswrapper[4771]: I0219 21:49:06.656991 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:49:06 crc kubenswrapper[4771]: W0219 21:49:06.668330 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01893f8d_263f_4c95_a1f9_864e9e655ee8.slice/crio-f83bd29ceaa6412f47ff847d6746725a4cdebb458b5005b52f86dee15d9ca078 WatchSource:0}: Error finding container f83bd29ceaa6412f47ff847d6746725a4cdebb458b5005b52f86dee15d9ca078: Status 404 returned error can't find the container with id f83bd29ceaa6412f47ff847d6746725a4cdebb458b5005b52f86dee15d9ca078 Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.482186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"20b982fd-3b6b-487a-9ec1-956360ee92d9","Type":"ContainerStarted","Data":"3bd7ebdb5d9120c17dd8d1df15d7fd4beb48ccd3ef7ec08acaf67e67d0447b11"} Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.482845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"20b982fd-3b6b-487a-9ec1-956360ee92d9","Type":"ContainerStarted","Data":"b5ccfc745f69f321ce59b9a775fda3e5f2d7f9f91ff2d5fe8e8d4a6d648a82cc"} Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.487755 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-97gqw" event={"ID":"0f997b82-0501-49eb-820a-26d053e81b02","Type":"ContainerStarted","Data":"1f6090de362c26def6021ffae1e120ef61588dde212f93dc1d10af3cca3c778c"} Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.489359 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"01893f8d-263f-4c95-a1f9-864e9e655ee8","Type":"ContainerStarted","Data":"f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756"} Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.489399 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01893f8d-263f-4c95-a1f9-864e9e655ee8","Type":"ContainerStarted","Data":"f83bd29ceaa6412f47ff847d6746725a4cdebb458b5005b52f86dee15d9ca078"} Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.508802 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-97gqw" podStartSLOduration=12.508722149 podStartE2EDuration="12.508722149s" podCreationTimestamp="2026-02-19 21:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:07.503440148 +0000 UTC m=+1247.774882638" watchObservedRunningTime="2026-02-19 21:49:07.508722149 +0000 UTC m=+1247.780164649" Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.915896 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.961984 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-config\") pod \"4f7e5bf4-6d90-4348-bba6-0d062714aace\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.962409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j8q8\" (UniqueName: \"kubernetes.io/projected/4f7e5bf4-6d90-4348-bba6-0d062714aace-kube-api-access-7j8q8\") pod \"4f7e5bf4-6d90-4348-bba6-0d062714aace\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.962574 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-combined-ca-bundle\") pod \"4f7e5bf4-6d90-4348-bba6-0d062714aace\" (UID: \"4f7e5bf4-6d90-4348-bba6-0d062714aace\") " Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.968551 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f7e5bf4-6d90-4348-bba6-0d062714aace-kube-api-access-7j8q8" (OuterVolumeSpecName: "kube-api-access-7j8q8") pod "4f7e5bf4-6d90-4348-bba6-0d062714aace" (UID: "4f7e5bf4-6d90-4348-bba6-0d062714aace"). InnerVolumeSpecName "kube-api-access-7j8q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.984902 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-config" (OuterVolumeSpecName: "config") pod "4f7e5bf4-6d90-4348-bba6-0d062714aace" (UID: "4f7e5bf4-6d90-4348-bba6-0d062714aace"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4771]: I0219 21:49:07.990047 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f7e5bf4-6d90-4348-bba6-0d062714aace" (UID: "4f7e5bf4-6d90-4348-bba6-0d062714aace"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.065559 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.065659 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f7e5bf4-6d90-4348-bba6-0d062714aace-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.065725 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j8q8\" (UniqueName: \"kubernetes.io/projected/4f7e5bf4-6d90-4348-bba6-0d062714aace-kube-api-access-7j8q8\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.499466 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bvxqv" event={"ID":"4f7e5bf4-6d90-4348-bba6-0d062714aace","Type":"ContainerDied","Data":"43328ce249cf3a245b6f92aa5cfdf3d51872deea67038c89aa47958ed431334f"} Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.499514 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43328ce249cf3a245b6f92aa5cfdf3d51872deea67038c89aa47958ed431334f" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.500659 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bvxqv" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.502579 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerStarted","Data":"0c7e7bd5583eb8f512f402ebf8523dff6385edebe51004be10badb960f1b9987"} Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.506180 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01893f8d-263f-4c95-a1f9-864e9e655ee8","Type":"ContainerStarted","Data":"b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb"} Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.514081 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"20b982fd-3b6b-487a-9ec1-956360ee92d9","Type":"ContainerStarted","Data":"cf44c0f0303ceeed0b2c620f4aa0d0b8e0d088bfb5d009114b9968c51752a1a7"} Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.586136 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.586116264 podStartE2EDuration="13.586116264s" podCreationTimestamp="2026-02-19 21:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:08.559440784 +0000 UTC m=+1248.830883284" watchObservedRunningTime="2026-02-19 21:49:08.586116264 +0000 UTC m=+1248.857558724" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.593932 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.593914142 podStartE2EDuration="13.593914142s" podCreationTimestamp="2026-02-19 21:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 21:49:08.585206659 +0000 UTC m=+1248.856649149" watchObservedRunningTime="2026-02-19 21:49:08.593914142 +0000 UTC m=+1248.865356612" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.808065 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-pjszr"] Feb 19 21:49:08 crc kubenswrapper[4771]: E0219 21:49:08.808651 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f7e5bf4-6d90-4348-bba6-0d062714aace" containerName="neutron-db-sync" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.808726 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7e5bf4-6d90-4348-bba6-0d062714aace" containerName="neutron-db-sync" Feb 19 21:49:08 crc kubenswrapper[4771]: E0219 21:49:08.808787 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerName="dnsmasq-dns" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.808844 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerName="dnsmasq-dns" Feb 19 21:49:08 crc kubenswrapper[4771]: E0219 21:49:08.808911 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerName="init" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.808966 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerName="init" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.809250 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aca3142-bb0a-424d-95e6-5819e352d96d" containerName="dnsmasq-dns" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.809333 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f7e5bf4-6d90-4348-bba6-0d062714aace" containerName="neutron-db-sync" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.810559 4771 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.817009 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-pjszr"] Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.912557 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.913512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.913612 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.913698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pnmb\" (UniqueName: \"kubernetes.io/projected/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-kube-api-access-8pnmb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.913811 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-svc\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.913891 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-config\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.990117 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bd457cb8d-5vbsm"] Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.991369 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.993084 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.993388 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.993585 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 21:49:08 crc kubenswrapper[4771]: I0219 21:49:08.994096 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-58l5w" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014715 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2rg\" (UniqueName: \"kubernetes.io/projected/fcb1436d-c34f-4211-b6f9-72a78630334d-kube-api-access-rf2rg\") pod 
\"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014757 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014788 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014819 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-httpd-config\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014837 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pnmb\" (UniqueName: \"kubernetes.io/projected/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-kube-api-access-8pnmb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014859 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-ovndb-tls-certs\") pod 
\"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-svc\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014935 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-combined-ca-bundle\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-config\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.014979 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.015006 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-config\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") 
" pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.015658 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.015885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-svc\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.016257 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.016480 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-config\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.016798 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.049913 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pnmb\" (UniqueName: \"kubernetes.io/projected/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-kube-api-access-8pnmb\") pod \"dnsmasq-dns-6f455b5fc7-pjszr\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.059875 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd457cb8d-5vbsm"] Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.116053 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-httpd-config\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.116113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-ovndb-tls-certs\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.116185 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-combined-ca-bundle\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.116225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-config\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 
21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.116250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2rg\" (UniqueName: \"kubernetes.io/projected/fcb1436d-c34f-4211-b6f9-72a78630334d-kube-api-access-rf2rg\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.120704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-httpd-config\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.120943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-ovndb-tls-certs\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.121679 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-config\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.134452 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-combined-ca-bundle\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.140409 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.148650 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2rg\" (UniqueName: \"kubernetes.io/projected/fcb1436d-c34f-4211-b6f9-72a78630334d-kube-api-access-rf2rg\") pod \"neutron-6bd457cb8d-5vbsm\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.310391 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.537853 4771 generic.go:334] "Generic (PLEG): container finished" podID="2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" containerID="8762cac2f6ef78687a4af6d497fb280fa7b766f18eed1d002187a46fd63354ec" exitCode=0 Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.541053 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4q67w" event={"ID":"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a","Type":"ContainerDied","Data":"8762cac2f6ef78687a4af6d497fb280fa7b766f18eed1d002187a46fd63354ec"} Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.714305 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-pjszr"] Feb 19 21:49:09 crc kubenswrapper[4771]: W0219 21:49:09.732906 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3190ceaf_e7fa_4ee7_a708_acd4a044e99d.slice/crio-94120578710a0fd86553197d0afc54adce3b0ed638519cbeb6dfb781e6504ef9 WatchSource:0}: Error finding container 94120578710a0fd86553197d0afc54adce3b0ed638519cbeb6dfb781e6504ef9: Status 404 returned error can't find the container with id 94120578710a0fd86553197d0afc54adce3b0ed638519cbeb6dfb781e6504ef9 Feb 19 21:49:09 crc kubenswrapper[4771]: I0219 21:49:09.948692 4771 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/neutron-6bd457cb8d-5vbsm"] Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.549908 4771 generic.go:334] "Generic (PLEG): container finished" podID="d2fcae0a-de14-4853-83b4-7862f9fd5d53" containerID="bee7c802e137a339145463341df9445a9986d274b0dbab724570fb327145cf73" exitCode=0 Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.550005 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5s24g" event={"ID":"d2fcae0a-de14-4853-83b4-7862f9fd5d53","Type":"ContainerDied","Data":"bee7c802e137a339145463341df9445a9986d274b0dbab724570fb327145cf73"} Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.577200 4771 generic.go:334] "Generic (PLEG): container finished" podID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" containerID="c9a6384ccb057692e84cc08a0129d090b11b7c03e40a0a057304f02bd4c5f961" exitCode=0 Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.577279 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" event={"ID":"3190ceaf-e7fa-4ee7-a708-acd4a044e99d","Type":"ContainerDied","Data":"c9a6384ccb057692e84cc08a0129d090b11b7c03e40a0a057304f02bd4c5f961"} Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.577302 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" event={"ID":"3190ceaf-e7fa-4ee7-a708-acd4a044e99d","Type":"ContainerStarted","Data":"94120578710a0fd86553197d0afc54adce3b0ed638519cbeb6dfb781e6504ef9"} Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.582993 4771 generic.go:334] "Generic (PLEG): container finished" podID="0f997b82-0501-49eb-820a-26d053e81b02" containerID="1f6090de362c26def6021ffae1e120ef61588dde212f93dc1d10af3cca3c778c" exitCode=0 Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.583102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-97gqw" 
event={"ID":"0f997b82-0501-49eb-820a-26d053e81b02","Type":"ContainerDied","Data":"1f6090de362c26def6021ffae1e120ef61588dde212f93dc1d10af3cca3c778c"} Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.596158 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd457cb8d-5vbsm" event={"ID":"fcb1436d-c34f-4211-b6f9-72a78630334d","Type":"ContainerStarted","Data":"56ec40b5a72abb5cddcb68f7c06683a0c85a8037fde988353754a66196601dda"} Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.596198 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.596208 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd457cb8d-5vbsm" event={"ID":"fcb1436d-c34f-4211-b6f9-72a78630334d","Type":"ContainerStarted","Data":"81928f25ef40f7c96ea048cc2dcbe4ec0c56577d24a6353cac6b73cbbf4e6ed5"} Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.596216 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd457cb8d-5vbsm" event={"ID":"fcb1436d-c34f-4211-b6f9-72a78630334d","Type":"ContainerStarted","Data":"abc55d25efe64fd10842ab56b937310df4758ae815d2789b9adf92e464d8444d"} Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.646853 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bd457cb8d-5vbsm" podStartSLOduration=2.646838852 podStartE2EDuration="2.646838852s" podCreationTimestamp="2026-02-19 21:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:10.644451598 +0000 UTC m=+1250.915894088" watchObservedRunningTime="2026-02-19 21:49:10.646838852 +0000 UTC m=+1250.918281322" Feb 19 21:49:10 crc kubenswrapper[4771]: I0219 21:49:10.907246 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4q67w" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.065301 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-combined-ca-bundle\") pod \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.065645 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bpbs\" (UniqueName: \"kubernetes.io/projected/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-kube-api-access-9bpbs\") pod \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.065697 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-logs\") pod \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.065738 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-scripts\") pod \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.065833 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-config-data\") pod \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\" (UID: \"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a\") " Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.068693 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-logs" (OuterVolumeSpecName: "logs") pod "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" (UID: "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.071834 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-scripts" (OuterVolumeSpecName: "scripts") pod "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" (UID: "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.074131 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-kube-api-access-9bpbs" (OuterVolumeSpecName: "kube-api-access-9bpbs") pod "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" (UID: "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a"). InnerVolumeSpecName "kube-api-access-9bpbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.095159 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-config-data" (OuterVolumeSpecName: "config-data") pod "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" (UID: "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.101510 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" (UID: "2edbc87a-44f8-47ab-a23f-c9acd74e7e5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.167295 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bpbs\" (UniqueName: \"kubernetes.io/projected/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-kube-api-access-9bpbs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.167327 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.167335 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.167345 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.167356 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.309249 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79b4d4dd8f-2sfll"] Feb 19 21:49:11 crc kubenswrapper[4771]: E0219 21:49:11.309575 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" containerName="placement-db-sync" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.309586 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" containerName="placement-db-sync" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.309759 4771 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" containerName="placement-db-sync" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.310536 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.313956 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.314108 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.322131 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79b4d4dd8f-2sfll"] Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.475541 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-public-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.475961 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qx52\" (UniqueName: \"kubernetes.io/projected/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-kube-api-access-9qx52\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.475999 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-internal-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " 
pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.476125 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-httpd-config\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.476147 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-ovndb-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.476168 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-config\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.476314 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-combined-ca-bundle\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.577230 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-httpd-config\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 
21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.577273 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-ovndb-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.577293 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-config\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.577351 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-combined-ca-bundle\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.577432 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-public-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.577448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qx52\" (UniqueName: \"kubernetes.io/projected/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-kube-api-access-9qx52\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.577473 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-internal-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.582759 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-httpd-config\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.584462 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-combined-ca-bundle\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.584754 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-internal-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.590146 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-ovndb-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.591047 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-public-tls-certs\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.592234 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-config\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.609717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qx52\" (UniqueName: \"kubernetes.io/projected/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-kube-api-access-9qx52\") pod \"neutron-79b4d4dd8f-2sfll\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.617104 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" event={"ID":"3190ceaf-e7fa-4ee7-a708-acd4a044e99d","Type":"ContainerStarted","Data":"c7993b3a4d69a6fb5e3f6a151fc2d6afffb90341e4f53dfdebdc17c4319bd170"} Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.618006 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.626752 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4q67w" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.627845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4q67w" event={"ID":"2edbc87a-44f8-47ab-a23f-c9acd74e7e5a","Type":"ContainerDied","Data":"e8f0efc763d4e43ecdb7d7654d02c08cf246896e3413a1cc950179a8008e395f"} Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.627904 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8f0efc763d4e43ecdb7d7654d02c08cf246896e3413a1cc950179a8008e395f" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.645097 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.658448 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" podStartSLOduration=3.658424585 podStartE2EDuration="3.658424585s" podCreationTimestamp="2026-02-19 21:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:11.640600861 +0000 UTC m=+1251.912043351" watchObservedRunningTime="2026-02-19 21:49:11.658424585 +0000 UTC m=+1251.929867055" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.682904 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64d6c64fc4-hcgn4"] Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.684989 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.691082 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64d6c64fc4-hcgn4"] Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.707525 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.707632 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.707713 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-h2sgl" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.707745 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.707829 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.889607 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-scripts\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.890037 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-config-data\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.890057 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-combined-ca-bundle\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.890173 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2z9\" (UniqueName: \"kubernetes.io/projected/eafa04ef-6985-4909-9228-1da911369751-kube-api-access-7h2z9\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.890209 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa04ef-6985-4909-9228-1da911369751-logs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.890246 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-public-tls-certs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:11 crc kubenswrapper[4771]: I0219 21:49:11.890360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-internal-tls-certs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:11.999065 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2z9\" (UniqueName: \"kubernetes.io/projected/eafa04ef-6985-4909-9228-1da911369751-kube-api-access-7h2z9\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:11.999110 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa04ef-6985-4909-9228-1da911369751-logs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:11.999150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-public-tls-certs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:11.999226 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-internal-tls-certs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:11.999260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-scripts\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:11.999277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-config-data\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:11.999296 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-combined-ca-bundle\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.000400 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa04ef-6985-4909-9228-1da911369751-logs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.012336 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-scripts\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.032064 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-public-tls-certs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.032428 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-internal-tls-certs\") pod \"placement-64d6c64fc4-hcgn4\" (UID: 
\"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.042566 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-combined-ca-bundle\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.043181 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2z9\" (UniqueName: \"kubernetes.io/projected/eafa04ef-6985-4909-9228-1da911369751-kube-api-access-7h2z9\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.048749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-config-data\") pod \"placement-64d6c64fc4-hcgn4\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.240741 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5s24g" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.241792 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.301951 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcqdr\" (UniqueName: \"kubernetes.io/projected/d2fcae0a-de14-4853-83b4-7862f9fd5d53-kube-api-access-wcqdr\") pod \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.301994 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-credential-keys\") pod \"0f997b82-0501-49eb-820a-26d053e81b02\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.302037 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-fernet-keys\") pod \"0f997b82-0501-49eb-820a-26d053e81b02\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.302081 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-combined-ca-bundle\") pod \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.302120 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-db-sync-config-data\") pod \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\" (UID: \"d2fcae0a-de14-4853-83b4-7862f9fd5d53\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.302159 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-config-data\") pod \"0f997b82-0501-49eb-820a-26d053e81b02\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.302186 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqzr6\" (UniqueName: \"kubernetes.io/projected/0f997b82-0501-49eb-820a-26d053e81b02-kube-api-access-dqzr6\") pod \"0f997b82-0501-49eb-820a-26d053e81b02\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.302209 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-scripts\") pod \"0f997b82-0501-49eb-820a-26d053e81b02\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.302225 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-combined-ca-bundle\") pod \"0f997b82-0501-49eb-820a-26d053e81b02\" (UID: \"0f997b82-0501-49eb-820a-26d053e81b02\") " Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.305893 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0f997b82-0501-49eb-820a-26d053e81b02" (UID: "0f997b82-0501-49eb-820a-26d053e81b02"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.308529 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fcae0a-de14-4853-83b4-7862f9fd5d53-kube-api-access-wcqdr" (OuterVolumeSpecName: "kube-api-access-wcqdr") pod "d2fcae0a-de14-4853-83b4-7862f9fd5d53" (UID: "d2fcae0a-de14-4853-83b4-7862f9fd5d53"). InnerVolumeSpecName "kube-api-access-wcqdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.309454 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f997b82-0501-49eb-820a-26d053e81b02-kube-api-access-dqzr6" (OuterVolumeSpecName: "kube-api-access-dqzr6") pod "0f997b82-0501-49eb-820a-26d053e81b02" (UID: "0f997b82-0501-49eb-820a-26d053e81b02"). InnerVolumeSpecName "kube-api-access-dqzr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.321258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-scripts" (OuterVolumeSpecName: "scripts") pod "0f997b82-0501-49eb-820a-26d053e81b02" (UID: "0f997b82-0501-49eb-820a-26d053e81b02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.324736 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d2fcae0a-de14-4853-83b4-7862f9fd5d53" (UID: "d2fcae0a-de14-4853-83b4-7862f9fd5d53"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.326048 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0f997b82-0501-49eb-820a-26d053e81b02" (UID: "0f997b82-0501-49eb-820a-26d053e81b02"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.331139 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f997b82-0501-49eb-820a-26d053e81b02" (UID: "0f997b82-0501-49eb-820a-26d053e81b02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.343528 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.358233 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-config-data" (OuterVolumeSpecName: "config-data") pod "0f997b82-0501-49eb-820a-26d053e81b02" (UID: "0f997b82-0501-49eb-820a-26d053e81b02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.371134 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2fcae0a-de14-4853-83b4-7862f9fd5d53" (UID: "d2fcae0a-de14-4853-83b4-7862f9fd5d53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.403649 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.403884 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.403893 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqzr6\" (UniqueName: \"kubernetes.io/projected/0f997b82-0501-49eb-820a-26d053e81b02-kube-api-access-dqzr6\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.403903 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.403912 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.403920 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcqdr\" (UniqueName: \"kubernetes.io/projected/d2fcae0a-de14-4853-83b4-7862f9fd5d53-kube-api-access-wcqdr\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.403944 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 
21:49:12.403952 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f997b82-0501-49eb-820a-26d053e81b02-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.403960 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2fcae0a-de14-4853-83b4-7862f9fd5d53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.522529 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79b4d4dd8f-2sfll"] Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.638467 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79b4d4dd8f-2sfll" event={"ID":"c8508bb1-418f-4cc7-8021-97c4a8e7e77f","Type":"ContainerStarted","Data":"47883bffcaaeae6d37824869dfc1061203ba56f17529ff026522fb0bfdf4a8ba"} Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.640620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5s24g" event={"ID":"d2fcae0a-de14-4853-83b4-7862f9fd5d53","Type":"ContainerDied","Data":"7309656d2ab74f897d1a37e16127c0a66b2960d7590b1c6a6c322e39115c9c8c"} Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.640667 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7309656d2ab74f897d1a37e16127c0a66b2960d7590b1c6a6c322e39115c9c8c" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.640757 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5s24g" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.647035 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-97gqw" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.647133 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-97gqw" event={"ID":"0f997b82-0501-49eb-820a-26d053e81b02","Type":"ContainerDied","Data":"7c5788d9efa4617282b956703869b529e13ee4fb679ac272f58ac09c197cdfff"} Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.647191 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c5788d9efa4617282b956703869b529e13ee4fb679ac272f58ac09c197cdfff" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.753284 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6cf6b5c758-nxz7l"] Feb 19 21:49:12 crc kubenswrapper[4771]: E0219 21:49:12.753911 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f997b82-0501-49eb-820a-26d053e81b02" containerName="keystone-bootstrap" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.753932 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f997b82-0501-49eb-820a-26d053e81b02" containerName="keystone-bootstrap" Feb 19 21:49:12 crc kubenswrapper[4771]: E0219 21:49:12.753967 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fcae0a-de14-4853-83b4-7862f9fd5d53" containerName="barbican-db-sync" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.753977 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fcae0a-de14-4853-83b4-7862f9fd5d53" containerName="barbican-db-sync" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.754151 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f997b82-0501-49eb-820a-26d053e81b02" containerName="keystone-bootstrap" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.754168 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2fcae0a-de14-4853-83b4-7862f9fd5d53" containerName="barbican-db-sync" Feb 19 21:49:12 crc 
kubenswrapper[4771]: I0219 21:49:12.755088 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.760013 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.760198 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.761870 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tdhnr" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.784055 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6ff976dc9f-sc4rs"] Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.789113 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.794636 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.821536 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6cf6b5c758-nxz7l"] Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.840105 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ff976dc9f-sc4rs"] Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.880460 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b9474445c-zjs86"] Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.881538 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.885830 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.886113 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-59frt" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.886267 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.886265 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.886300 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.886377 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924154 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data-custom\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924191 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-867pg\" (UniqueName: \"kubernetes.io/projected/b13d8ae8-258b-430f-96fb-789f503797e9-kube-api-access-867pg\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924222 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-combined-ca-bundle\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924255 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpp7n\" (UniqueName: \"kubernetes.io/projected/be1114ef-2c67-4e51-b05e-a5608c8d5b00-kube-api-access-tpp7n\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924274 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924296 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data-custom\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data\") pod 
\"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924340 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1114ef-2c67-4e51-b05e-a5608c8d5b00-logs\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924401 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b13d8ae8-258b-430f-96fb-789f503797e9-logs\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.924439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-combined-ca-bundle\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.931247 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64d6c64fc4-hcgn4"] Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.967894 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.968209 4771 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.968588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b9474445c-zjs86"] Feb 19 21:49:12 crc kubenswrapper[4771]: I0219 21:49:12.990197 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-pjszr"] Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.002124 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-tsz8t"] Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.003512 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.017074 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-tsz8t"] Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025365 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b13d8ae8-258b-430f-96fb-789f503797e9-logs\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025413 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-scripts\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025451 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-combined-ca-bundle\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data-custom\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025490 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-867pg\" (UniqueName: \"kubernetes.io/projected/b13d8ae8-258b-430f-96fb-789f503797e9-kube-api-access-867pg\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-combined-ca-bundle\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025539 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-public-tls-certs\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 
21:49:13.025557 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpp7n\" (UniqueName: \"kubernetes.io/projected/be1114ef-2c67-4e51-b05e-a5608c8d5b00-kube-api-access-tpp7n\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-combined-ca-bundle\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025596 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025612 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-config-data\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025626 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-internal-tls-certs\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " 
pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data-custom\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025659 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmbcg\" (UniqueName: \"kubernetes.io/projected/4b605459-c654-463d-b66c-ec804185ea7d-kube-api-access-kmbcg\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025678 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-credential-keys\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-fernet-keys\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025716 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: 
\"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.025735 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1114ef-2c67-4e51-b05e-a5608c8d5b00-logs\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.026271 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1114ef-2c67-4e51-b05e-a5608c8d5b00-logs\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.026407 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b13d8ae8-258b-430f-96fb-789f503797e9-logs\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.030350 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-combined-ca-bundle\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.032680 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data-custom\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: 
\"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.036728 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-combined-ca-bundle\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.040989 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.041314 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.044235 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data-custom\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.046819 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-fcbbf6884-kmsdv"] Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.048197 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-fcbbf6884-kmsdv" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.053380 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.065683 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-fcbbf6884-kmsdv"] Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.067605 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpp7n\" (UniqueName: \"kubernetes.io/projected/be1114ef-2c67-4e51-b05e-a5608c8d5b00-kube-api-access-tpp7n\") pod \"barbican-keystone-listener-6cf6b5c758-nxz7l\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.068183 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-867pg\" (UniqueName: \"kubernetes.io/projected/b13d8ae8-258b-430f-96fb-789f503797e9-kube-api-access-867pg\") pod \"barbican-worker-6ff976dc9f-sc4rs\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.079610 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.127188 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128220 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-config\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128248 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128297 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-public-tls-certs\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128319 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-combined-ca-bundle\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-config-data\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " 
pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-internal-tls-certs\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmbcg\" (UniqueName: \"kubernetes.io/projected/4b605459-c654-463d-b66c-ec804185ea7d-kube-api-access-kmbcg\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128391 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-credential-keys\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128409 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-fernet-keys\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " 
pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128499 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128515 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhw9h\" (UniqueName: \"kubernetes.io/projected/c5790e42-f883-43b1-912b-4226fa3b8db1-kube-api-access-lhw9h\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128541 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-scripts\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.128558 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.140853 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-combined-ca-bundle\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.141727 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-fernet-keys\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.142127 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-public-tls-certs\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.144069 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-internal-tls-certs\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.144304 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-scripts\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.145079 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8c658556-bs9wt"]
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.146484 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.156644 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-config-data\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.167489 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-credential-keys\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.169438 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7965bd8f87-ttph6"]
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.171390 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.178952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmbcg\" (UniqueName: \"kubernetes.io/projected/4b605459-c654-463d-b66c-ec804185ea7d-kube-api-access-kmbcg\") pod \"keystone-7b9474445c-zjs86\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.187087 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8c658556-bs9wt"]
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.197075 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7965bd8f87-ttph6"]
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.228345 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhw9h\" (UniqueName: \"kubernetes.io/projected/c5790e42-f883-43b1-912b-4226fa3b8db1-kube-api-access-lhw9h\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229727 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229752 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229777 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-config\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229797 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229828 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldd9w\" (UniqueName: \"kubernetes.io/projected/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-kube-api-access-ldd9w\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229899 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-logs\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229933 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-combined-ca-bundle\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.229951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data-custom\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.231103 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.231599 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.232114 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-config\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.232680 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.232817 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.248899 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b4c966d9d-766ld"]
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.256929 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.266753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhw9h\" (UniqueName: \"kubernetes.io/projected/c5790e42-f883-43b1-912b-4226fa3b8db1-kube-api-access-lhw9h\") pod \"dnsmasq-dns-6b55f48d49-tsz8t\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.274856 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b4c966d9d-766ld"]
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldd9w\" (UniqueName: \"kubernetes.io/projected/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-kube-api-access-ldd9w\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331554 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-combined-ca-bundle\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331573 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331591 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331617 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a201b832-5d71-4947-98ba-02adf91bccd5-logs\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331642 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8dx\" (UniqueName: \"kubernetes.io/projected/a201b832-5d71-4947-98ba-02adf91bccd5-kube-api-access-dc8dx\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331679 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-logs\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331710 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-combined-ca-bundle\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331725 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-logs\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331745 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-combined-ca-bundle\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331766 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data-custom\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331795 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data-custom\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331814 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxqds\" (UniqueName: \"kubernetes.io/projected/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-kube-api-access-fxqds\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data-custom\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.331884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.332828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-logs\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.337494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data-custom\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.346752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-combined-ca-bundle\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.350243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.368954 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldd9w\" (UniqueName: \"kubernetes.io/projected/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-kube-api-access-ldd9w\") pod \"barbican-api-fcbbf6884-kmsdv\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.433982 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-combined-ca-bundle\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-logs\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434066 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434082 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434097 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-combined-ca-bundle\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434115 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434144 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a201b832-5d71-4947-98ba-02adf91bccd5-logs\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434171 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s26t\" (UniqueName: \"kubernetes.io/projected/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-kube-api-access-8s26t\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8dx\" (UniqueName: \"kubernetes.io/projected/a201b832-5d71-4947-98ba-02adf91bccd5-kube-api-access-dc8dx\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434219 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data-custom\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434241 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-combined-ca-bundle\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-logs\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data-custom\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434295 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxqds\" (UniqueName: \"kubernetes.io/projected/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-kube-api-access-fxqds\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.434319 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data-custom\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.435862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a201b832-5d71-4947-98ba-02adf91bccd5-logs\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.438458 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-logs\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.438740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-combined-ca-bundle\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.444063 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-combined-ca-bundle\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.444874 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.450650 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.454998 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data-custom\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.455096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data-custom\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.459043 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8dx\" (UniqueName: \"kubernetes.io/projected/a201b832-5d71-4947-98ba-02adf91bccd5-kube-api-access-dc8dx\") pod \"barbican-keystone-listener-8c658556-bs9wt\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.459532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxqds\" (UniqueName: \"kubernetes.io/projected/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-kube-api-access-fxqds\") pod \"barbican-worker-7965bd8f87-ttph6\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.536495 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data-custom\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.536676 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-logs\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.536704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.536723 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-combined-ca-bundle\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.536779 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s26t\" (UniqueName: \"kubernetes.io/projected/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-kube-api-access-8s26t\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.538063 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-logs\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.541328 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data-custom\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.548290 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.548508 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-combined-ca-bundle\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.548677 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.549136 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-fcbbf6884-kmsdv"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.562434 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s26t\" (UniqueName: \"kubernetes.io/projected/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-kube-api-access-8s26t\") pod \"barbican-api-6b4c966d9d-766ld\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") " pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.605708 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.621744 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.629056 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.665344 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79b4d4dd8f-2sfll" event={"ID":"c8508bb1-418f-4cc7-8021-97c4a8e7e77f","Type":"ContainerStarted","Data":"7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd"}
Feb 19 21:49:13 crc kubenswrapper[4771]: I0219 21:49:13.666879 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d6c64fc4-hcgn4" event={"ID":"eafa04ef-6985-4909-9228-1da911369751","Type":"ContainerStarted","Data":"cc88aab17c914b0b17ceabccae14c5de86203d6e91ddee7dc43a97254a95a585"}
Feb 19 21:49:14 crc kubenswrapper[4771]: I0219 21:49:14.675596 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" podUID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" containerName="dnsmasq-dns" containerID="cri-o://c7993b3a4d69a6fb5e3f6a151fc2d6afffb90341e4f53dfdebdc17c4319bd170" gracePeriod=10
Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.684593 4771 generic.go:334] "Generic (PLEG): container finished" podID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" containerID="c7993b3a4d69a6fb5e3f6a151fc2d6afffb90341e4f53dfdebdc17c4319bd170" exitCode=0
Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.684862 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" event={"ID":"3190ceaf-e7fa-4ee7-a708-acd4a044e99d","Type":"ContainerDied","Data":"c7993b3a4d69a6fb5e3f6a151fc2d6afffb90341e4f53dfdebdc17c4319bd170"}
Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.764654 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.764707 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19
21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.771886 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.773614 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.835666 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.836232 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.838890 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.854034 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.870300 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-fcbbf6884-kmsdv"] Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.903963 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-dbcb5799b-x6nw7"] Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.905364 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.908807 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.909036 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.923316 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dbcb5799b-x6nw7"] Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.982791 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data-custom\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.982888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-combined-ca-bundle\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.982916 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc2b6\" (UniqueName: \"kubernetes.io/projected/1e5f8191-842e-4a39-ac28-b0042c51f813-kube-api-access-jc2b6\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.982947 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/1e5f8191-842e-4a39-ac28-b0042c51f813-logs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.982994 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-internal-tls-certs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.983220 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:15 crc kubenswrapper[4771]: I0219 21:49:15.983392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-public-tls-certs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.085271 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.085359 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-public-tls-certs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.085393 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data-custom\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.085420 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-combined-ca-bundle\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.085438 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc2b6\" (UniqueName: \"kubernetes.io/projected/1e5f8191-842e-4a39-ac28-b0042c51f813-kube-api-access-jc2b6\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.085458 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8191-842e-4a39-ac28-b0042c51f813-logs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.085474 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-internal-tls-certs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.086068 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8191-842e-4a39-ac28-b0042c51f813-logs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.091181 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-internal-tls-certs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.091819 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-public-tls-certs\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.092209 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-combined-ca-bundle\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.092935 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data-custom\") pod 
\"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.095327 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.104556 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc2b6\" (UniqueName: \"kubernetes.io/projected/1e5f8191-842e-4a39-ac28-b0042c51f813-kube-api-access-jc2b6\") pod \"barbican-api-dbcb5799b-x6nw7\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.230444 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.738341 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.747667 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" event={"ID":"3190ceaf-e7fa-4ee7-a708-acd4a044e99d","Type":"ContainerDied","Data":"94120578710a0fd86553197d0afc54adce3b0ed638519cbeb6dfb781e6504ef9"} Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.747715 4771 scope.go:117] "RemoveContainer" containerID="c7993b3a4d69a6fb5e3f6a151fc2d6afffb90341e4f53dfdebdc17c4319bd170" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.751813 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79b4d4dd8f-2sfll" event={"ID":"c8508bb1-418f-4cc7-8021-97c4a8e7e77f","Type":"ContainerStarted","Data":"75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b"} Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.752650 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.759796 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d6c64fc4-hcgn4" event={"ID":"eafa04ef-6985-4909-9228-1da911369751","Type":"ContainerStarted","Data":"5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06"} Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.759835 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.760612 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.760626 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.760634 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.806201 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79b4d4dd8f-2sfll" podStartSLOduration=5.806182916 podStartE2EDuration="5.806182916s" podCreationTimestamp="2026-02-19 21:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:16.80219827 +0000 UTC m=+1257.073640750" watchObservedRunningTime="2026-02-19 21:49:16.806182916 +0000 UTC m=+1257.077625386" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.832502 4771 scope.go:117] "RemoveContainer" containerID="c9a6384ccb057692e84cc08a0129d090b11b7c03e40a0a057304f02bd4c5f961" Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.912791 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-sb\") pod \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.912868 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-svc\") pod \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.912936 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-swift-storage-0\") pod \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.912966 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-nb\") pod \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.912990 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-config\") pod \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.913090 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pnmb\" (UniqueName: \"kubernetes.io/projected/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-kube-api-access-8pnmb\") pod \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\" (UID: \"3190ceaf-e7fa-4ee7-a708-acd4a044e99d\") " Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.931668 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-fcbbf6884-kmsdv"] Feb 19 21:49:16 crc kubenswrapper[4771]: I0219 21:49:16.945077 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-kube-api-access-8pnmb" (OuterVolumeSpecName: "kube-api-access-8pnmb") pod "3190ceaf-e7fa-4ee7-a708-acd4a044e99d" (UID: "3190ceaf-e7fa-4ee7-a708-acd4a044e99d"). InnerVolumeSpecName "kube-api-access-8pnmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.016321 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pnmb\" (UniqueName: \"kubernetes.io/projected/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-kube-api-access-8pnmb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.177360 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8c658556-bs9wt"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.271331 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3190ceaf-e7fa-4ee7-a708-acd4a044e99d" (UID: "3190ceaf-e7fa-4ee7-a708-acd4a044e99d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.327846 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.337862 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7965bd8f87-ttph6"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.397875 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ff976dc9f-sc4rs"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.428629 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-tsz8t"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.453888 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod 
"3190ceaf-e7fa-4ee7-a708-acd4a044e99d" (UID: "3190ceaf-e7fa-4ee7-a708-acd4a044e99d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.469548 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-config" (OuterVolumeSpecName: "config") pod "3190ceaf-e7fa-4ee7-a708-acd4a044e99d" (UID: "3190ceaf-e7fa-4ee7-a708-acd4a044e99d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.540642 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.542049 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.548282 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3190ceaf-e7fa-4ee7-a708-acd4a044e99d" (UID: "3190ceaf-e7fa-4ee7-a708-acd4a044e99d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.573662 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3190ceaf-e7fa-4ee7-a708-acd4a044e99d" (UID: "3190ceaf-e7fa-4ee7-a708-acd4a044e99d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.588872 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6cf6b5c758-nxz7l"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.644719 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.644749 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3190ceaf-e7fa-4ee7-a708-acd4a044e99d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.775705 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b9474445c-zjs86"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.776770 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" event={"ID":"b13d8ae8-258b-430f-96fb-789f503797e9","Type":"ContainerStarted","Data":"49715e1f16af4256e53abbe74654fded84e115b1699165a48c8e9b5e3438f41b"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.782186 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-pjszr" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.790597 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b4c966d9d-766ld"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.791004 4771 generic.go:334] "Generic (PLEG): container finished" podID="c5790e42-f883-43b1-912b-4226fa3b8db1" containerID="fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4" exitCode=0 Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.791197 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" event={"ID":"c5790e42-f883-43b1-912b-4226fa3b8db1","Type":"ContainerDied","Data":"fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.791230 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" event={"ID":"c5790e42-f883-43b1-912b-4226fa3b8db1","Type":"ContainerStarted","Data":"3bdcd8e144d5ec9a1c44b77ba00113d4f2ed92e8df3974978ac48b2b8f33d970"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.793245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" event={"ID":"be1114ef-2c67-4e51-b05e-a5608c8d5b00","Type":"ContainerStarted","Data":"e4c185a80db66669c8c6efcc29890ca412911d7b61f8e187677321cefb18f2d4"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.802650 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d6c64fc4-hcgn4" event={"ID":"eafa04ef-6985-4909-9228-1da911369751","Type":"ContainerStarted","Data":"00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.803656 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.803778 
4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.807534 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dbcb5799b-x6nw7"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.809217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7965bd8f87-ttph6" event={"ID":"1b36b1b4-e091-4032-90aa-7f983b5c4b4f","Type":"ContainerStarted","Data":"f0619468cc8a83de8d4aa87f9c90269413a4afaaf23cdba825ac780da85e3679"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.838579 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerStarted","Data":"ff6e52627f8b1152d746602a27350ea7da0f00eb438702f45f295da242bf04ac"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.847103 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" event={"ID":"a201b832-5d71-4947-98ba-02adf91bccd5","Type":"ContainerStarted","Data":"4d378e093609eebcb7349df78092b6f9bdd649912cfa737bf9f2b8c97daea77e"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.849289 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcbbf6884-kmsdv" event={"ID":"e7d48d8e-a3e6-45ad-bb02-372d6491aa74","Type":"ContainerStarted","Data":"3dc44121dd868607eab05a22cca997eff8ae75169e20ae6a08b963711f3c704e"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.849334 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcbbf6884-kmsdv" event={"ID":"e7d48d8e-a3e6-45ad-bb02-372d6491aa74","Type":"ContainerStarted","Data":"38c9b8bddaa5540df35fc3feeac43bbf07b4cd241cea491762aadd25eb7f0c90"} Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.865293 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-64d6c64fc4-hcgn4" podStartSLOduration=6.865272404 podStartE2EDuration="6.865272404s" podCreationTimestamp="2026-02-19 21:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:17.834403633 +0000 UTC m=+1258.105846113" watchObservedRunningTime="2026-02-19 21:49:17.865272404 +0000 UTC m=+1258.136714864" Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.883262 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-pjszr"] Feb 19 21:49:17 crc kubenswrapper[4771]: I0219 21:49:17.888631 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-pjszr"] Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.458325 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" path="/var/lib/kubelet/pods/3190ceaf-e7fa-4ee7-a708-acd4a044e99d/volumes" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.871461 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b9474445c-zjs86" event={"ID":"4b605459-c654-463d-b66c-ec804185ea7d","Type":"ContainerStarted","Data":"2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.871507 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b9474445c-zjs86" event={"ID":"4b605459-c654-463d-b66c-ec804185ea7d","Type":"ContainerStarted","Data":"5469734f697a546fed81543e7f4451be5f08040e6f65ecae3a88729a12770722"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.871549 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.874547 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c966d9d-766ld" 
event={"ID":"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102","Type":"ContainerStarted","Data":"9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.874588 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c966d9d-766ld" event={"ID":"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102","Type":"ContainerStarted","Data":"23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.874600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c966d9d-766ld" event={"ID":"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102","Type":"ContainerStarted","Data":"cbe0887fc80fe89e7d7d9efb2723071f14394653aecca00c6fbd231da0d286a5"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.875508 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b4c966d9d-766ld" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.875559 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b4c966d9d-766ld" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.881626 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dbcb5799b-x6nw7" event={"ID":"1e5f8191-842e-4a39-ac28-b0042c51f813","Type":"ContainerStarted","Data":"8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.881666 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dbcb5799b-x6nw7" event={"ID":"1e5f8191-842e-4a39-ac28-b0042c51f813","Type":"ContainerStarted","Data":"4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.881684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dbcb5799b-x6nw7" 
event={"ID":"1e5f8191-842e-4a39-ac28-b0042c51f813","Type":"ContainerStarted","Data":"5a5a4b5b442b3e603403d45fee1756c39fb4aab801c8ec3dcf50203a3b761331"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.882466 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.882498 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.895427 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b9474445c-zjs86" podStartSLOduration=6.895412443 podStartE2EDuration="6.895412443s" podCreationTimestamp="2026-02-19 21:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:18.890197874 +0000 UTC m=+1259.161640354" watchObservedRunningTime="2026-02-19 21:49:18.895412443 +0000 UTC m=+1259.166854913" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.895843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcbbf6884-kmsdv" event={"ID":"e7d48d8e-a3e6-45ad-bb02-372d6491aa74","Type":"ContainerStarted","Data":"3f2a07f24513e74bf71cfc8c216867701a6a1d33a438f9d3855b23be48a3f2e8"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.895986 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-fcbbf6884-kmsdv" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerName="barbican-api-log" containerID="cri-o://3dc44121dd868607eab05a22cca997eff8ae75169e20ae6a08b963711f3c704e" gracePeriod=30 Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.896290 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-fcbbf6884-kmsdv" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.896314 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-fcbbf6884-kmsdv" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.896346 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-fcbbf6884-kmsdv" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerName="barbican-api" containerID="cri-o://3f2a07f24513e74bf71cfc8c216867701a6a1d33a438f9d3855b23be48a3f2e8" gracePeriod=30 Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.910138 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.910351 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.912079 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" event={"ID":"c5790e42-f883-43b1-912b-4226fa3b8db1","Type":"ContainerStarted","Data":"314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e"} Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.912911 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.912977 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.912984 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.932720 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b4c966d9d-766ld" podStartSLOduration=5.932702296 podStartE2EDuration="5.932702296s" podCreationTimestamp="2026-02-19 21:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:18.92500997 +0000 UTC 
m=+1259.196452450" watchObservedRunningTime="2026-02-19 21:49:18.932702296 +0000 UTC m=+1259.204144766" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.969919 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-dbcb5799b-x6nw7" podStartSLOduration=3.969898555 podStartE2EDuration="3.969898555s" podCreationTimestamp="2026-02-19 21:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:18.962823357 +0000 UTC m=+1259.234265847" watchObservedRunningTime="2026-02-19 21:49:18.969898555 +0000 UTC m=+1259.241341025" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.980758 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" podStartSLOduration=6.9806035 podStartE2EDuration="6.9806035s" podCreationTimestamp="2026-02-19 21:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:18.978726921 +0000 UTC m=+1259.250169401" watchObservedRunningTime="2026-02-19 21:49:18.9806035 +0000 UTC m=+1259.252045970" Feb 19 21:49:18 crc kubenswrapper[4771]: I0219 21:49:18.999809 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-fcbbf6884-kmsdv" podStartSLOduration=6.999793831 podStartE2EDuration="6.999793831s" podCreationTimestamp="2026-02-19 21:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:18.997171051 +0000 UTC m=+1259.268613531" watchObservedRunningTime="2026-02-19 21:49:18.999793831 +0000 UTC m=+1259.271236301" Feb 19 21:49:19 crc kubenswrapper[4771]: I0219 21:49:19.635337 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 
21:49:19 crc kubenswrapper[4771]: I0219 21:49:19.698721 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:49:19 crc kubenswrapper[4771]: I0219 21:49:19.700299 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:49:19 crc kubenswrapper[4771]: I0219 21:49:19.774362 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:49:19 crc kubenswrapper[4771]: I0219 21:49:19.951374 4771 generic.go:334] "Generic (PLEG): container finished" podID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerID="3f2a07f24513e74bf71cfc8c216867701a6a1d33a438f9d3855b23be48a3f2e8" exitCode=0 Feb 19 21:49:19 crc kubenswrapper[4771]: I0219 21:49:19.951396 4771 generic.go:334] "Generic (PLEG): container finished" podID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerID="3dc44121dd868607eab05a22cca997eff8ae75169e20ae6a08b963711f3c704e" exitCode=143 Feb 19 21:49:19 crc kubenswrapper[4771]: I0219 21:49:19.952483 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcbbf6884-kmsdv" event={"ID":"e7d48d8e-a3e6-45ad-bb02-372d6491aa74","Type":"ContainerDied","Data":"3f2a07f24513e74bf71cfc8c216867701a6a1d33a438f9d3855b23be48a3f2e8"} Feb 19 21:49:19 crc kubenswrapper[4771]: I0219 21:49:19.952507 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcbbf6884-kmsdv" event={"ID":"e7d48d8e-a3e6-45ad-bb02-372d6491aa74","Type":"ContainerDied","Data":"3dc44121dd868607eab05a22cca997eff8ae75169e20ae6a08b963711f3c704e"} Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.536273 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-fcbbf6884-kmsdv" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.661186 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldd9w\" (UniqueName: \"kubernetes.io/projected/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-kube-api-access-ldd9w\") pod \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.661263 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-logs\") pod \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.661281 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-combined-ca-bundle\") pod \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.661484 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data-custom\") pod \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.661522 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data\") pod \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\" (UID: \"e7d48d8e-a3e6-45ad-bb02-372d6491aa74\") " Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.663828 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-logs" (OuterVolumeSpecName: "logs") pod "e7d48d8e-a3e6-45ad-bb02-372d6491aa74" (UID: "e7d48d8e-a3e6-45ad-bb02-372d6491aa74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.668149 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-kube-api-access-ldd9w" (OuterVolumeSpecName: "kube-api-access-ldd9w") pod "e7d48d8e-a3e6-45ad-bb02-372d6491aa74" (UID: "e7d48d8e-a3e6-45ad-bb02-372d6491aa74"). InnerVolumeSpecName "kube-api-access-ldd9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.668874 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e7d48d8e-a3e6-45ad-bb02-372d6491aa74" (UID: "e7d48d8e-a3e6-45ad-bb02-372d6491aa74"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.703126 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7d48d8e-a3e6-45ad-bb02-372d6491aa74" (UID: "e7d48d8e-a3e6-45ad-bb02-372d6491aa74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.763443 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldd9w\" (UniqueName: \"kubernetes.io/projected/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-kube-api-access-ldd9w\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.763473 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.763483 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.763491 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.781674 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data" (OuterVolumeSpecName: "config-data") pod "e7d48d8e-a3e6-45ad-bb02-372d6491aa74" (UID: "e7d48d8e-a3e6-45ad-bb02-372d6491aa74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.865378 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d48d8e-a3e6-45ad-bb02-372d6491aa74-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.969714 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-fcbbf6884-kmsdv" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.969756 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcbbf6884-kmsdv" event={"ID":"e7d48d8e-a3e6-45ad-bb02-372d6491aa74","Type":"ContainerDied","Data":"38c9b8bddaa5540df35fc3feeac43bbf07b4cd241cea491762aadd25eb7f0c90"} Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.969847 4771 scope.go:117] "RemoveContainer" containerID="3f2a07f24513e74bf71cfc8c216867701a6a1d33a438f9d3855b23be48a3f2e8" Feb 19 21:49:20 crc kubenswrapper[4771]: I0219 21:49:20.973560 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" event={"ID":"b13d8ae8-258b-430f-96fb-789f503797e9","Type":"ContainerStarted","Data":"c15c2f6b67b7a0766f50154f47f4eadaf352c45f8933d1d886d19522e5848490"} Feb 19 21:49:21 crc kubenswrapper[4771]: I0219 21:49:21.103734 4771 scope.go:117] "RemoveContainer" containerID="3dc44121dd868607eab05a22cca997eff8ae75169e20ae6a08b963711f3c704e" Feb 19 21:49:21 crc kubenswrapper[4771]: I0219 21:49:21.139559 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-fcbbf6884-kmsdv"] Feb 19 21:49:21 crc kubenswrapper[4771]: I0219 21:49:21.146607 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-fcbbf6884-kmsdv"] Feb 19 21:49:21 crc kubenswrapper[4771]: E0219 21:49:21.252382 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7d48d8e_a3e6_45ad_bb02_372d6491aa74.slice\": RecentStats: unable to find data in memory cache]" Feb 19 21:49:21 crc kubenswrapper[4771]: I0219 21:49:21.996988 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" 
event={"ID":"a201b832-5d71-4947-98ba-02adf91bccd5","Type":"ContainerStarted","Data":"28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03"} Feb 19 21:49:21 crc kubenswrapper[4771]: I0219 21:49:21.997791 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" event={"ID":"a201b832-5d71-4947-98ba-02adf91bccd5","Type":"ContainerStarted","Data":"8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a"} Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.004310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" event={"ID":"be1114ef-2c67-4e51-b05e-a5608c8d5b00","Type":"ContainerStarted","Data":"d1930253fb64e7437cb83f851ea3c642d444ccbebaf0184579209df0de9f536f"} Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.004385 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" event={"ID":"be1114ef-2c67-4e51-b05e-a5608c8d5b00","Type":"ContainerStarted","Data":"c9f92b78b08bc05ea5e68f6382b6fd2262354fab289ce8c82fede1906a5f05ec"} Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.008430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hs22q" event={"ID":"b6263f86-8184-4c2a-b2b0-80cfecba212d","Type":"ContainerStarted","Data":"0a967eeaba3d4bc56afe66a0da1bc669b7ba8837bde6edc23ce8b5bdde09949c"} Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.020440 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" podStartSLOduration=5.905367399 podStartE2EDuration="9.020420817s" podCreationTimestamp="2026-02-19 21:49:13 +0000 UTC" firstStartedPulling="2026-02-19 21:49:17.375461139 +0000 UTC m=+1257.646903609" lastFinishedPulling="2026-02-19 21:49:20.490514557 +0000 UTC m=+1260.761957027" observedRunningTime="2026-02-19 21:49:22.020323104 +0000 UTC 
m=+1262.291765594" watchObservedRunningTime="2026-02-19 21:49:22.020420817 +0000 UTC m=+1262.291863287" Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.026153 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7965bd8f87-ttph6" event={"ID":"1b36b1b4-e091-4032-90aa-7f983b5c4b4f","Type":"ContainerStarted","Data":"0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8"} Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.026201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7965bd8f87-ttph6" event={"ID":"1b36b1b4-e091-4032-90aa-7f983b5c4b4f","Type":"ContainerStarted","Data":"4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a"} Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.035340 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" event={"ID":"b13d8ae8-258b-430f-96fb-789f503797e9","Type":"ContainerStarted","Data":"819be71cb4312142dccaa9e1f4f79160bd71ae306cb4fd427f60234ecdeb0584"} Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.039572 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-hs22q" podStartSLOduration=3.033073273 podStartE2EDuration="40.039560917s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="2026-02-19 21:48:43.320593314 +0000 UTC m=+1223.592035784" lastFinishedPulling="2026-02-19 21:49:20.327080938 +0000 UTC m=+1260.598523428" observedRunningTime="2026-02-19 21:49:22.037478101 +0000 UTC m=+1262.308920581" watchObservedRunningTime="2026-02-19 21:49:22.039560917 +0000 UTC m=+1262.311003387" Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.097796 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6cf6b5c758-nxz7l"] Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.106314 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" podStartSLOduration=7.242000437 podStartE2EDuration="10.106293162s" podCreationTimestamp="2026-02-19 21:49:12 +0000 UTC" firstStartedPulling="2026-02-19 21:49:17.630751763 +0000 UTC m=+1257.902194233" lastFinishedPulling="2026-02-19 21:49:20.495044488 +0000 UTC m=+1260.766486958" observedRunningTime="2026-02-19 21:49:22.085592122 +0000 UTC m=+1262.357034602" watchObservedRunningTime="2026-02-19 21:49:22.106293162 +0000 UTC m=+1262.377735642" Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.127631 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" podStartSLOduration=7.083099898 podStartE2EDuration="10.12760999s" podCreationTimestamp="2026-02-19 21:49:12 +0000 UTC" firstStartedPulling="2026-02-19 21:49:17.452227901 +0000 UTC m=+1257.723670371" lastFinishedPulling="2026-02-19 21:49:20.496737993 +0000 UTC m=+1260.768180463" observedRunningTime="2026-02-19 21:49:22.112504128 +0000 UTC m=+1262.383946598" watchObservedRunningTime="2026-02-19 21:49:22.12760999 +0000 UTC m=+1262.399052460" Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.143980 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7965bd8f87-ttph6" podStartSLOduration=6.090401442 podStartE2EDuration="9.143956675s" podCreationTimestamp="2026-02-19 21:49:13 +0000 UTC" firstStartedPulling="2026-02-19 21:49:17.445199904 +0000 UTC m=+1257.716642374" lastFinishedPulling="2026-02-19 21:49:20.498755137 +0000 UTC m=+1260.770197607" observedRunningTime="2026-02-19 21:49:22.131575185 +0000 UTC m=+1262.403017655" watchObservedRunningTime="2026-02-19 21:49:22.143956675 +0000 UTC m=+1262.415399145" Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.173994 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6ff976dc9f-sc4rs"] Feb 19 21:49:22 crc kubenswrapper[4771]: I0219 21:49:22.448137 
4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" path="/var/lib/kubelet/pods/e7d48d8e-a3e6-45ad-bb02-372d6491aa74/volumes" Feb 19 21:49:23 crc kubenswrapper[4771]: I0219 21:49:23.551618 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" Feb 19 21:49:23 crc kubenswrapper[4771]: I0219 21:49:23.639048 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-7kf98"] Feb 19 21:49:23 crc kubenswrapper[4771]: I0219 21:49:23.639382 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" podUID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerName="dnsmasq-dns" containerID="cri-o://c9ff0e923cb0473882d50d2aafb9f2d9a970d7819afc1bd02a8b34c672edebd1" gracePeriod=10 Feb 19 21:49:24 crc kubenswrapper[4771]: I0219 21:49:24.054494 4771 generic.go:334] "Generic (PLEG): container finished" podID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerID="c9ff0e923cb0473882d50d2aafb9f2d9a970d7819afc1bd02a8b34c672edebd1" exitCode=0 Feb 19 21:49:24 crc kubenswrapper[4771]: I0219 21:49:24.054570 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" event={"ID":"10b1d546-2532-46e6-8545-d099ebad7fa6","Type":"ContainerDied","Data":"c9ff0e923cb0473882d50d2aafb9f2d9a970d7819afc1bd02a8b34c672edebd1"} Feb 19 21:49:24 crc kubenswrapper[4771]: I0219 21:49:24.055001 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" containerName="barbican-worker" containerID="cri-o://819be71cb4312142dccaa9e1f4f79160bd71ae306cb4fd427f60234ecdeb0584" gracePeriod=30 Feb 19 21:49:24 crc kubenswrapper[4771]: I0219 21:49:24.055039 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerName="barbican-keystone-listener-log" containerID="cri-o://c9f92b78b08bc05ea5e68f6382b6fd2262354fab289ce8c82fede1906a5f05ec" gracePeriod=30 Feb 19 21:49:24 crc kubenswrapper[4771]: I0219 21:49:24.055112 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerName="barbican-keystone-listener" containerID="cri-o://d1930253fb64e7437cb83f851ea3c642d444ccbebaf0184579209df0de9f536f" gracePeriod=30 Feb 19 21:49:24 crc kubenswrapper[4771]: I0219 21:49:24.055189 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" containerName="barbican-worker-log" containerID="cri-o://c15c2f6b67b7a0766f50154f47f4eadaf352c45f8933d1d886d19522e5848490" gracePeriod=30 Feb 19 21:49:25 crc kubenswrapper[4771]: I0219 21:49:25.065451 4771 generic.go:334] "Generic (PLEG): container finished" podID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerID="c9f92b78b08bc05ea5e68f6382b6fd2262354fab289ce8c82fede1906a5f05ec" exitCode=143 Feb 19 21:49:25 crc kubenswrapper[4771]: I0219 21:49:25.065511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" event={"ID":"be1114ef-2c67-4e51-b05e-a5608c8d5b00","Type":"ContainerDied","Data":"c9f92b78b08bc05ea5e68f6382b6fd2262354fab289ce8c82fede1906a5f05ec"} Feb 19 21:49:25 crc kubenswrapper[4771]: I0219 21:49:25.067598 4771 generic.go:334] "Generic (PLEG): container finished" podID="b13d8ae8-258b-430f-96fb-789f503797e9" containerID="c15c2f6b67b7a0766f50154f47f4eadaf352c45f8933d1d886d19522e5848490" exitCode=143 Feb 19 21:49:25 crc kubenswrapper[4771]: I0219 21:49:25.067652 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-6ff976dc9f-sc4rs" event={"ID":"b13d8ae8-258b-430f-96fb-789f503797e9","Type":"ContainerDied","Data":"c15c2f6b67b7a0766f50154f47f4eadaf352c45f8933d1d886d19522e5848490"} Feb 19 21:49:25 crc kubenswrapper[4771]: I0219 21:49:25.161994 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b4c966d9d-766ld" Feb 19 21:49:25 crc kubenswrapper[4771]: I0219 21:49:25.247591 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b4c966d9d-766ld" Feb 19 21:49:26 crc kubenswrapper[4771]: I0219 21:49:26.083355 4771 generic.go:334] "Generic (PLEG): container finished" podID="b13d8ae8-258b-430f-96fb-789f503797e9" containerID="819be71cb4312142dccaa9e1f4f79160bd71ae306cb4fd427f60234ecdeb0584" exitCode=0 Feb 19 21:49:26 crc kubenswrapper[4771]: I0219 21:49:26.083434 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" event={"ID":"b13d8ae8-258b-430f-96fb-789f503797e9","Type":"ContainerDied","Data":"819be71cb4312142dccaa9e1f4f79160bd71ae306cb4fd427f60234ecdeb0584"} Feb 19 21:49:26 crc kubenswrapper[4771]: I0219 21:49:26.091073 4771 generic.go:334] "Generic (PLEG): container finished" podID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerID="d1930253fb64e7437cb83f851ea3c642d444ccbebaf0184579209df0de9f536f" exitCode=0 Feb 19 21:49:26 crc kubenswrapper[4771]: I0219 21:49:26.091159 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" event={"ID":"be1114ef-2c67-4e51-b05e-a5608c8d5b00","Type":"ContainerDied","Data":"d1930253fb64e7437cb83f851ea3c642d444ccbebaf0184579209df0de9f536f"} Feb 19 21:49:27 crc kubenswrapper[4771]: I0219 21:49:27.099745 4771 generic.go:334] "Generic (PLEG): container finished" podID="b6263f86-8184-4c2a-b2b0-80cfecba212d" containerID="0a967eeaba3d4bc56afe66a0da1bc669b7ba8837bde6edc23ce8b5bdde09949c" exitCode=0 Feb 19 21:49:27 crc 
kubenswrapper[4771]: I0219 21:49:27.099965 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hs22q" event={"ID":"b6263f86-8184-4c2a-b2b0-80cfecba212d","Type":"ContainerDied","Data":"0a967eeaba3d4bc56afe66a0da1bc669b7ba8837bde6edc23ce8b5bdde09949c"} Feb 19 21:49:27 crc kubenswrapper[4771]: I0219 21:49:27.844472 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:27 crc kubenswrapper[4771]: I0219 21:49:27.879483 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" podUID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: connect: connection refused" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.090943 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.225854 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b4c966d9d-766ld"] Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.232931 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b4c966d9d-766ld" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api-log" containerID="cri-o://23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46" gracePeriod=30 Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.233174 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b4c966d9d-766ld" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api" containerID="cri-o://9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90" gracePeriod=30 Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.317363 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.353718 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.408460 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418155 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjg8x\" (UniqueName: \"kubernetes.io/projected/10b1d546-2532-46e6-8545-d099ebad7fa6-kube-api-access-qjg8x\") pod \"10b1d546-2532-46e6-8545-d099ebad7fa6\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418247 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-config\") pod \"10b1d546-2532-46e6-8545-d099ebad7fa6\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418293 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-nb\") pod \"10b1d546-2532-46e6-8545-d099ebad7fa6\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418312 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-867pg\" (UniqueName: \"kubernetes.io/projected/b13d8ae8-258b-430f-96fb-789f503797e9-kube-api-access-867pg\") pod \"b13d8ae8-258b-430f-96fb-789f503797e9\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418442 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b13d8ae8-258b-430f-96fb-789f503797e9-logs\") pod \"b13d8ae8-258b-430f-96fb-789f503797e9\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418467 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-combined-ca-bundle\") pod \"b13d8ae8-258b-430f-96fb-789f503797e9\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418523 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-swift-storage-0\") pod \"10b1d546-2532-46e6-8545-d099ebad7fa6\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418611 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data\") pod \"b13d8ae8-258b-430f-96fb-789f503797e9\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data-custom\") pod \"b13d8ae8-258b-430f-96fb-789f503797e9\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418682 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-sb\") pod \"10b1d546-2532-46e6-8545-d099ebad7fa6\" (UID: 
\"10b1d546-2532-46e6-8545-d099ebad7fa6\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.418700 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-svc\") pod \"10b1d546-2532-46e6-8545-d099ebad7fa6\" (UID: \"10b1d546-2532-46e6-8545-d099ebad7fa6\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.425380 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b13d8ae8-258b-430f-96fb-789f503797e9-logs" (OuterVolumeSpecName: "logs") pod "b13d8ae8-258b-430f-96fb-789f503797e9" (UID: "b13d8ae8-258b-430f-96fb-789f503797e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.432144 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b13d8ae8-258b-430f-96fb-789f503797e9" (UID: "b13d8ae8-258b-430f-96fb-789f503797e9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.433365 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13d8ae8-258b-430f-96fb-789f503797e9-kube-api-access-867pg" (OuterVolumeSpecName: "kube-api-access-867pg") pod "b13d8ae8-258b-430f-96fb-789f503797e9" (UID: "b13d8ae8-258b-430f-96fb-789f503797e9"). InnerVolumeSpecName "kube-api-access-867pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.435640 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b1d546-2532-46e6-8545-d099ebad7fa6-kube-api-access-qjg8x" (OuterVolumeSpecName: "kube-api-access-qjg8x") pod "10b1d546-2532-46e6-8545-d099ebad7fa6" (UID: "10b1d546-2532-46e6-8545-d099ebad7fa6"). InnerVolumeSpecName "kube-api-access-qjg8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.480633 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b13d8ae8-258b-430f-96fb-789f503797e9" (UID: "b13d8ae8-258b-430f-96fb-789f503797e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.501777 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10b1d546-2532-46e6-8545-d099ebad7fa6" (UID: "10b1d546-2532-46e6-8545-d099ebad7fa6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.501814 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-config" (OuterVolumeSpecName: "config") pod "10b1d546-2532-46e6-8545-d099ebad7fa6" (UID: "10b1d546-2532-46e6-8545-d099ebad7fa6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.519214 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data" (OuterVolumeSpecName: "config-data") pod "b13d8ae8-258b-430f-96fb-789f503797e9" (UID: "b13d8ae8-258b-430f-96fb-789f503797e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.519982 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-combined-ca-bundle\") pod \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.520160 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1114ef-2c67-4e51-b05e-a5608c8d5b00-logs\") pod \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.520187 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpp7n\" (UniqueName: \"kubernetes.io/projected/be1114ef-2c67-4e51-b05e-a5608c8d5b00-kube-api-access-tpp7n\") pod \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.520291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data\") pod \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.520358 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data-custom\") pod \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\" (UID: \"be1114ef-2c67-4e51-b05e-a5608c8d5b00\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.520410 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data\") pod \"b13d8ae8-258b-430f-96fb-789f503797e9\" (UID: \"b13d8ae8-258b-430f-96fb-789f503797e9\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.520615 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be1114ef-2c67-4e51-b05e-a5608c8d5b00-logs" (OuterVolumeSpecName: "logs") pod "be1114ef-2c67-4e51-b05e-a5608c8d5b00" (UID: "be1114ef-2c67-4e51-b05e-a5608c8d5b00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: W0219 21:49:28.520691 4771 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b13d8ae8-258b-430f-96fb-789f503797e9/volumes/kubernetes.io~secret/config-data Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.520704 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data" (OuterVolumeSpecName: "config-data") pod "b13d8ae8-258b-430f-96fb-789f503797e9" (UID: "b13d8ae8-258b-430f-96fb-789f503797e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.522339 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.523635 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.523764 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjg8x\" (UniqueName: \"kubernetes.io/projected/10b1d546-2532-46e6-8545-d099ebad7fa6-kube-api-access-qjg8x\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.523878 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be1114ef-2c67-4e51-b05e-a5608c8d5b00-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.524066 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.524177 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.524364 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-867pg\" (UniqueName: \"kubernetes.io/projected/b13d8ae8-258b-430f-96fb-789f503797e9-kube-api-access-867pg\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.524464 4771 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b13d8ae8-258b-430f-96fb-789f503797e9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.524553 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13d8ae8-258b-430f-96fb-789f503797e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.528185 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1114ef-2c67-4e51-b05e-a5608c8d5b00-kube-api-access-tpp7n" (OuterVolumeSpecName: "kube-api-access-tpp7n") pod "be1114ef-2c67-4e51-b05e-a5608c8d5b00" (UID: "be1114ef-2c67-4e51-b05e-a5608c8d5b00"). InnerVolumeSpecName "kube-api-access-tpp7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.532823 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be1114ef-2c67-4e51-b05e-a5608c8d5b00" (UID: "be1114ef-2c67-4e51-b05e-a5608c8d5b00"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.548562 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "10b1d546-2532-46e6-8545-d099ebad7fa6" (UID: "10b1d546-2532-46e6-8545-d099ebad7fa6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.549167 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10b1d546-2532-46e6-8545-d099ebad7fa6" (UID: "10b1d546-2532-46e6-8545-d099ebad7fa6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.580499 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10b1d546-2532-46e6-8545-d099ebad7fa6" (UID: "10b1d546-2532-46e6-8545-d099ebad7fa6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.583123 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be1114ef-2c67-4e51-b05e-a5608c8d5b00" (UID: "be1114ef-2c67-4e51-b05e-a5608c8d5b00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.584328 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data" (OuterVolumeSpecName: "config-data") pod "be1114ef-2c67-4e51-b05e-a5608c8d5b00" (UID: "be1114ef-2c67-4e51-b05e-a5608c8d5b00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.593859 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hs22q" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.628765 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-scripts\") pod \"b6263f86-8184-4c2a-b2b0-80cfecba212d\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.630989 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6263f86-8184-4c2a-b2b0-80cfecba212d-etc-machine-id\") pod \"b6263f86-8184-4c2a-b2b0-80cfecba212d\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.631131 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rrmc\" (UniqueName: \"kubernetes.io/projected/b6263f86-8184-4c2a-b2b0-80cfecba212d-kube-api-access-7rrmc\") pod \"b6263f86-8184-4c2a-b2b0-80cfecba212d\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.631170 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-db-sync-config-data\") pod \"b6263f86-8184-4c2a-b2b0-80cfecba212d\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.631202 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-combined-ca-bundle\") pod \"b6263f86-8184-4c2a-b2b0-80cfecba212d\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.631254 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-config-data\") pod \"b6263f86-8184-4c2a-b2b0-80cfecba212d\" (UID: \"b6263f86-8184-4c2a-b2b0-80cfecba212d\") " Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.631579 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6263f86-8184-4c2a-b2b0-80cfecba212d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b6263f86-8184-4c2a-b2b0-80cfecba212d" (UID: "b6263f86-8184-4c2a-b2b0-80cfecba212d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.632113 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpp7n\" (UniqueName: \"kubernetes.io/projected/be1114ef-2c67-4e51-b05e-a5608c8d5b00-kube-api-access-tpp7n\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.632133 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.632148 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6263f86-8184-4c2a-b2b0-80cfecba212d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.632158 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.632167 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-config-data-custom\") on node \"crc\" DevicePath 
\"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.632180 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.632207 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b1d546-2532-46e6-8545-d099ebad7fa6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.632219 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1114ef-2c67-4e51-b05e-a5608c8d5b00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.647543 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b6263f86-8184-4c2a-b2b0-80cfecba212d" (UID: "b6263f86-8184-4c2a-b2b0-80cfecba212d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.647591 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6263f86-8184-4c2a-b2b0-80cfecba212d-kube-api-access-7rrmc" (OuterVolumeSpecName: "kube-api-access-7rrmc") pod "b6263f86-8184-4c2a-b2b0-80cfecba212d" (UID: "b6263f86-8184-4c2a-b2b0-80cfecba212d"). InnerVolumeSpecName "kube-api-access-7rrmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.647597 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-scripts" (OuterVolumeSpecName: "scripts") pod "b6263f86-8184-4c2a-b2b0-80cfecba212d" (UID: "b6263f86-8184-4c2a-b2b0-80cfecba212d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.658334 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6263f86-8184-4c2a-b2b0-80cfecba212d" (UID: "b6263f86-8184-4c2a-b2b0-80cfecba212d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.696151 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-config-data" (OuterVolumeSpecName: "config-data") pod "b6263f86-8184-4c2a-b2b0-80cfecba212d" (UID: "b6263f86-8184-4c2a-b2b0-80cfecba212d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.733632 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rrmc\" (UniqueName: \"kubernetes.io/projected/b6263f86-8184-4c2a-b2b0-80cfecba212d-kube-api-access-7rrmc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.733844 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.733929 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.734023 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:28 crc kubenswrapper[4771]: I0219 21:49:28.734079 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6263f86-8184-4c2a-b2b0-80cfecba212d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.122794 4771 generic.go:334] "Generic (PLEG): container finished" podID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerID="23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46" exitCode=143 Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.122901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c966d9d-766ld" event={"ID":"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102","Type":"ContainerDied","Data":"23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46"} Feb 19 21:49:29 crc 
kubenswrapper[4771]: I0219 21:49:29.124817 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" event={"ID":"10b1d546-2532-46e6-8545-d099ebad7fa6","Type":"ContainerDied","Data":"fe1f5c6fee3cfe9cbc4d2a696718e210ad9487c605709a38943f490353734a4a"} Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.124873 4771 scope.go:117] "RemoveContainer" containerID="c9ff0e923cb0473882d50d2aafb9f2d9a970d7819afc1bd02a8b34c672edebd1" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.125188 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-7kf98" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.130299 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" event={"ID":"b13d8ae8-258b-430f-96fb-789f503797e9","Type":"ContainerDied","Data":"49715e1f16af4256e53abbe74654fded84e115b1699165a48c8e9b5e3438f41b"} Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.130410 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6ff976dc9f-sc4rs" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.150415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerStarted","Data":"5b81496ba5c0165a7a9d05175e4743d29b2dcbd7e342c83dca657a5fe8ecbb23"} Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.150602 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="ceilometer-central-agent" containerID="cri-o://c3cba4d268336b5047be73f17219d1e2dcda048724027bf6af3d39710442bf20" gracePeriod=30 Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.150853 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.150904 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="proxy-httpd" containerID="cri-o://5b81496ba5c0165a7a9d05175e4743d29b2dcbd7e342c83dca657a5fe8ecbb23" gracePeriod=30 Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.150943 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="sg-core" containerID="cri-o://ff6e52627f8b1152d746602a27350ea7da0f00eb438702f45f295da242bf04ac" gracePeriod=30 Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.150976 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="ceilometer-notification-agent" containerID="cri-o://0c7e7bd5583eb8f512f402ebf8523dff6385edebe51004be10badb960f1b9987" gracePeriod=30 Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.161252 
4771 scope.go:117] "RemoveContainer" containerID="727eed6e83d1636a8a28fff6e5ec2f76ca424fe09d9c0d3d41227035efe0bd78" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.165552 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.165602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cf6b5c758-nxz7l" event={"ID":"be1114ef-2c67-4e51-b05e-a5608c8d5b00","Type":"ContainerDied","Data":"e4c185a80db66669c8c6efcc29890ca412911d7b61f8e187677321cefb18f2d4"} Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.178600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hs22q" event={"ID":"b6263f86-8184-4c2a-b2b0-80cfecba212d","Type":"ContainerDied","Data":"ecc1871762ce523271016f55b80bf8a9672e7ada145177ee9e75b96cdd91bf69"} Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.178644 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc1871762ce523271016f55b80bf8a9672e7ada145177ee9e75b96cdd91bf69" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.178728 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hs22q" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.201153 4771 scope.go:117] "RemoveContainer" containerID="819be71cb4312142dccaa9e1f4f79160bd71ae306cb4fd427f60234ecdeb0584" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.213115 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6ff976dc9f-sc4rs"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.220492 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6ff976dc9f-sc4rs"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.230761 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.407612797 podStartE2EDuration="47.230744775s" podCreationTimestamp="2026-02-19 21:48:42 +0000 UTC" firstStartedPulling="2026-02-19 21:48:43.111054538 +0000 UTC m=+1223.382496998" lastFinishedPulling="2026-02-19 21:49:27.934186516 +0000 UTC m=+1268.205628976" observedRunningTime="2026-02-19 21:49:29.223886462 +0000 UTC m=+1269.495328932" watchObservedRunningTime="2026-02-19 21:49:29.230744775 +0000 UTC m=+1269.502187245" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.240782 4771 scope.go:117] "RemoveContainer" containerID="c15c2f6b67b7a0766f50154f47f4eadaf352c45f8933d1d886d19522e5848490" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.255539 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-7kf98"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.264529 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-7kf98"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.268144 4771 scope.go:117] "RemoveContainer" containerID="d1930253fb64e7437cb83f851ea3c642d444ccbebaf0184579209df0de9f536f" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.279320 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-keystone-listener-6cf6b5c758-nxz7l"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.300765 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6cf6b5c758-nxz7l"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.356887 4771 scope.go:117] "RemoveContainer" containerID="c9f92b78b08bc05ea5e68f6382b6fd2262354fab289ce8c82fede1906a5f05ec" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.362983 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363343 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerName="barbican-api-log" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363361 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerName="barbican-api-log" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363373 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6263f86-8184-4c2a-b2b0-80cfecba212d" containerName="cinder-db-sync" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363379 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6263f86-8184-4c2a-b2b0-80cfecba212d" containerName="cinder-db-sync" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363390 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" containerName="barbican-worker-log" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363396 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" containerName="barbican-worker-log" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363410 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerName="barbican-keystone-listener" Feb 19 
21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363416 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerName="barbican-keystone-listener" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363428 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerName="dnsmasq-dns" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363434 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerName="dnsmasq-dns" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363445 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" containerName="dnsmasq-dns" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363451 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" containerName="dnsmasq-dns" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363465 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerName="barbican-keystone-listener-log" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363471 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerName="barbican-keystone-listener-log" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363482 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerName="init" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363488 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerName="init" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363497 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" containerName="barbican-worker" Feb 19 21:49:29 
crc kubenswrapper[4771]: I0219 21:49:29.363503 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" containerName="barbican-worker" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363515 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" containerName="init" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363521 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" containerName="init" Feb 19 21:49:29 crc kubenswrapper[4771]: E0219 21:49:29.363536 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerName="barbican-api" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363543 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerName="barbican-api" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363690 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerName="barbican-keystone-listener-log" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363703 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6263f86-8184-4c2a-b2b0-80cfecba212d" containerName="cinder-db-sync" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363717 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b1d546-2532-46e6-8545-d099ebad7fa6" containerName="dnsmasq-dns" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363728 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerName="barbican-api" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363737 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3190ceaf-e7fa-4ee7-a708-acd4a044e99d" containerName="dnsmasq-dns" Feb 19 21:49:29 crc 
kubenswrapper[4771]: I0219 21:49:29.363745 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d48d8e-a3e6-45ad-bb02-372d6491aa74" containerName="barbican-api-log" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363755 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" containerName="barbican-worker-log" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363767 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" containerName="barbican-worker" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.363777 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" containerName="barbican-keystone-listener" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.364754 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.372050 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.373286 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-h5558" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.373499 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.373654 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.384138 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.451573 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.451660 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db9323d6-6ed4-4e8f-bc34-c856cb767eee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.451682 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxhpk\" (UniqueName: \"kubernetes.io/projected/db9323d6-6ed4-4e8f-bc34-c856cb767eee-kube-api-access-wxhpk\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.451718 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.451740 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.451757 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-scripts\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.474069 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-khtzg"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.475393 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.495798 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-khtzg"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553636 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxhpk\" (UniqueName: \"kubernetes.io/projected/db9323d6-6ed4-4e8f-bc34-c856cb767eee-kube-api-access-wxhpk\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553706 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553729 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553751 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-scripts\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553778 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553807 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-config\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553858 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553904 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l68mb\" (UniqueName: 
\"kubernetes.io/projected/c2974e7d-3e17-4fb5-bc64-6e33b4237107-kube-api-access-l68mb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553938 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db9323d6-6ed4-4e8f-bc34-c856cb767eee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.553972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-svc\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.555030 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db9323d6-6ed4-4e8f-bc34-c856cb767eee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.558400 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.559262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.559588 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-scripts\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.560962 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.573510 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxhpk\" (UniqueName: \"kubernetes.io/projected/db9323d6-6ed4-4e8f-bc34-c856cb767eee-kube-api-access-wxhpk\") pod \"cinder-scheduler-0\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.623200 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.626092 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.628382 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.640836 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655361 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655429 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-scripts\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655474 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655493 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655530 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9657414b-a103-4415-8b0c-873f532271db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655559 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl8pm\" (UniqueName: \"kubernetes.io/projected/9657414b-a103-4415-8b0c-873f532271db-kube-api-access-zl8pm\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-config\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data-custom\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655654 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l68mb\" (UniqueName: 
\"kubernetes.io/projected/c2974e7d-3e17-4fb5-bc64-6e33b4237107-kube-api-access-l68mb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655690 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655709 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-svc\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.655744 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9657414b-a103-4415-8b0c-873f532271db-logs\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.656599 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.656717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.658591 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.659102 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-svc\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.660774 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-config\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.677639 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l68mb\" (UniqueName: \"kubernetes.io/projected/c2974e7d-3e17-4fb5-bc64-6e33b4237107-kube-api-access-l68mb\") pod \"dnsmasq-dns-6dc67df487-khtzg\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.741679 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.757635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.757684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-scripts\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.757702 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.757718 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9657414b-a103-4415-8b0c-873f532271db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.757741 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl8pm\" (UniqueName: \"kubernetes.io/projected/9657414b-a103-4415-8b0c-873f532271db-kube-api-access-zl8pm\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.757788 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data-custom\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.757832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9657414b-a103-4415-8b0c-873f532271db-logs\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.758213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9657414b-a103-4415-8b0c-873f532271db-logs\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.758256 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9657414b-a103-4415-8b0c-873f532271db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.762935 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.762941 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data-custom\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 
21:49:29.764320 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-scripts\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.767703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.774532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl8pm\" (UniqueName: \"kubernetes.io/projected/9657414b-a103-4415-8b0c-873f532271db-kube-api-access-zl8pm\") pod \"cinder-api-0\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " pod="openstack/cinder-api-0" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.793725 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:29 crc kubenswrapper[4771]: I0219 21:49:29.952612 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.237154 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.333700 4771 generic.go:334] "Generic (PLEG): container finished" podID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerID="5b81496ba5c0165a7a9d05175e4743d29b2dcbd7e342c83dca657a5fe8ecbb23" exitCode=0 Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.333725 4771 generic.go:334] "Generic (PLEG): container finished" podID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerID="ff6e52627f8b1152d746602a27350ea7da0f00eb438702f45f295da242bf04ac" exitCode=2 Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.333732 4771 generic.go:334] "Generic (PLEG): container finished" podID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerID="c3cba4d268336b5047be73f17219d1e2dcda048724027bf6af3d39710442bf20" exitCode=0 Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.333786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerDied","Data":"5b81496ba5c0165a7a9d05175e4743d29b2dcbd7e342c83dca657a5fe8ecbb23"} Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.333809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerDied","Data":"ff6e52627f8b1152d746602a27350ea7da0f00eb438702f45f295da242bf04ac"} Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.333819 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerDied","Data":"c3cba4d268336b5047be73f17219d1e2dcda048724027bf6af3d39710442bf20"} Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.381561 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-6dc67df487-khtzg"] Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.465423 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b1d546-2532-46e6-8545-d099ebad7fa6" path="/var/lib/kubelet/pods/10b1d546-2532-46e6-8545-d099ebad7fa6/volumes" Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.467418 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13d8ae8-258b-430f-96fb-789f503797e9" path="/var/lib/kubelet/pods/b13d8ae8-258b-430f-96fb-789f503797e9/volumes" Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.468147 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be1114ef-2c67-4e51-b05e-a5608c8d5b00" path="/var/lib/kubelet/pods/be1114ef-2c67-4e51-b05e-a5608c8d5b00/volumes" Feb 19 21:49:30 crc kubenswrapper[4771]: I0219 21:49:30.532942 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.411666 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b4c966d9d-766ld" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:40900->10.217.0.164:9311: read: connection reset by peer" Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.411689 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b4c966d9d-766ld" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:40908->10.217.0.164:9311: read: connection reset by peer" Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.415398 4771 generic.go:334] "Generic (PLEG): container finished" podID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerID="0c7e7bd5583eb8f512f402ebf8523dff6385edebe51004be10badb960f1b9987" exitCode=0 Feb 
19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.415452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerDied","Data":"0c7e7bd5583eb8f512f402ebf8523dff6385edebe51004be10badb960f1b9987"}
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.422399 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9657414b-a103-4415-8b0c-873f532271db","Type":"ContainerStarted","Data":"34d955f6ce785ca5102ed3d08369096426c73428d7115ae6d11a102e42fd4390"}
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.422597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9657414b-a103-4415-8b0c-873f532271db","Type":"ContainerStarted","Data":"30cf1e4eb6c0f93604dba20565b9d4be692e99cdf576881367d29669343b9920"}
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.423896 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"db9323d6-6ed4-4e8f-bc34-c856cb767eee","Type":"ContainerStarted","Data":"8b537d499028a117bebba8df5dd19307177a89b31c348a4eb02391e62739f0f0"}
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.425703 4771 generic.go:334] "Generic (PLEG): container finished" podID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" containerID="05a5cf0de26dd34b5edca24d533555fe952a16f66da1168202f36726ddafe278" exitCode=0
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.425725 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" event={"ID":"c2974e7d-3e17-4fb5-bc64-6e33b4237107","Type":"ContainerDied","Data":"05a5cf0de26dd34b5edca24d533555fe952a16f66da1168202f36726ddafe278"}
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.425740 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" event={"ID":"c2974e7d-3e17-4fb5-bc64-6e33b4237107","Type":"ContainerStarted","Data":"73e080550eb3beb8fdcafd5edec32d8d5ee51d6540a73651d9b2444a29f216cb"}
Feb 19 21:49:31 crc kubenswrapper[4771]: E0219 21:49:31.568717 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod732a349d_e1eb_4ca6_b1b7_9dfe6fbd0102.slice/crio-9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod732a349d_e1eb_4ca6_b1b7_9dfe6fbd0102.slice/crio-conmon-9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.579471 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.713284 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-run-httpd\") pod \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") "
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.713560 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-scripts\") pod \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") "
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.713581 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-config-data\") pod \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") "
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.713671 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-log-httpd\") pod \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") "
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.714212 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "63e1bdf2-42cd-4470-af16-8ebafd6580cf" (UID: "63e1bdf2-42cd-4470-af16-8ebafd6580cf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.714377 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twm2n\" (UniqueName: \"kubernetes.io/projected/63e1bdf2-42cd-4470-af16-8ebafd6580cf-kube-api-access-twm2n\") pod \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") "
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.714456 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "63e1bdf2-42cd-4470-af16-8ebafd6580cf" (UID: "63e1bdf2-42cd-4470-af16-8ebafd6580cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.714506 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-combined-ca-bundle\") pod \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") "
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.714560 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-sg-core-conf-yaml\") pod \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\" (UID: \"63e1bdf2-42cd-4470-af16-8ebafd6580cf\") "
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.714922 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.714938 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63e1bdf2-42cd-4470-af16-8ebafd6580cf-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.717673 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e1bdf2-42cd-4470-af16-8ebafd6580cf-kube-api-access-twm2n" (OuterVolumeSpecName: "kube-api-access-twm2n") pod "63e1bdf2-42cd-4470-af16-8ebafd6580cf" (UID: "63e1bdf2-42cd-4470-af16-8ebafd6580cf"). InnerVolumeSpecName "kube-api-access-twm2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.717757 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-scripts" (OuterVolumeSpecName: "scripts") pod "63e1bdf2-42cd-4470-af16-8ebafd6580cf" (UID: "63e1bdf2-42cd-4470-af16-8ebafd6580cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.813052 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "63e1bdf2-42cd-4470-af16-8ebafd6580cf" (UID: "63e1bdf2-42cd-4470-af16-8ebafd6580cf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.816645 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twm2n\" (UniqueName: \"kubernetes.io/projected/63e1bdf2-42cd-4470-af16-8ebafd6580cf-kube-api-access-twm2n\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.816676 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.816688 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.821233 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.864915 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63e1bdf2-42cd-4470-af16-8ebafd6580cf" (UID: "63e1bdf2-42cd-4470-af16-8ebafd6580cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.888748 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-config-data" (OuterVolumeSpecName: "config-data") pod "63e1bdf2-42cd-4470-af16-8ebafd6580cf" (UID: "63e1bdf2-42cd-4470-af16-8ebafd6580cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.897905 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.917871 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:31 crc kubenswrapper[4771]: I0219 21:49:31.917894 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e1bdf2-42cd-4470-af16-8ebafd6580cf-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.019330 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-logs\") pod \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") "
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.019539 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-combined-ca-bundle\") pod \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") "
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.019691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data\") pod \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") "
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.019801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s26t\" (UniqueName: \"kubernetes.io/projected/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-kube-api-access-8s26t\") pod \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") "
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.020263 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-logs" (OuterVolumeSpecName: "logs") pod "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" (UID: "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.020491 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data-custom\") pod \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\" (UID: \"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102\") "
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.020999 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.024643 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-kube-api-access-8s26t" (OuterVolumeSpecName: "kube-api-access-8s26t") pod "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" (UID: "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102"). InnerVolumeSpecName "kube-api-access-8s26t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.033446 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" (UID: "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.066860 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" (UID: "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.083633 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data" (OuterVolumeSpecName: "config-data") pod "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" (UID: "732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.122938 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.123189 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s26t\" (UniqueName: \"kubernetes.io/projected/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-kube-api-access-8s26t\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.123254 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.123372 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.439999 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.464378 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63e1bdf2-42cd-4470-af16-8ebafd6580cf","Type":"ContainerDied","Data":"7df17d9f2567324f58e60e5311d82037c8e802eae974a6b66262bfa0fbf1a116"}
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.464447 4771 scope.go:117] "RemoveContainer" containerID="5b81496ba5c0165a7a9d05175e4743d29b2dcbd7e342c83dca657a5fe8ecbb23"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.498737 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9657414b-a103-4415-8b0c-873f532271db","Type":"ContainerStarted","Data":"050c7646bddc86dead29a1986df826ed7afb73f99f9161f47f854552f6cce761"}
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.498944 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9657414b-a103-4415-8b0c-873f532271db" containerName="cinder-api-log" containerID="cri-o://34d955f6ce785ca5102ed3d08369096426c73428d7115ae6d11a102e42fd4390" gracePeriod=30
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.499070 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9657414b-a103-4415-8b0c-873f532271db" containerName="cinder-api" containerID="cri-o://050c7646bddc86dead29a1986df826ed7afb73f99f9161f47f854552f6cce761" gracePeriod=30
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.499074 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.513961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"db9323d6-6ed4-4e8f-bc34-c856cb767eee","Type":"ContainerStarted","Data":"95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a"}
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.545884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" event={"ID":"c2974e7d-3e17-4fb5-bc64-6e33b4237107","Type":"ContainerStarted","Data":"66ddd0890bf0446c337501370919c8c56f71b19bf3b7a656b3b554ade625e9bd"}
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.546452 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dc67df487-khtzg"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.549117 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.572463 4771 generic.go:334] "Generic (PLEG): container finished" podID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerID="9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90" exitCode=0
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.572524 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c966d9d-766ld" event={"ID":"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102","Type":"ContainerDied","Data":"9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90"}
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.572768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b4c966d9d-766ld" event={"ID":"732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102","Type":"ContainerDied","Data":"cbe0887fc80fe89e7d7d9efb2723071f14394653aecca00c6fbd231da0d286a5"}
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.572859 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b4c966d9d-766ld"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.576887 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.593894 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:49:32 crc kubenswrapper[4771]: E0219 21:49:32.594294 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="proxy-httpd"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594312 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="proxy-httpd"
Feb 19 21:49:32 crc kubenswrapper[4771]: E0219 21:49:32.594331 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594338 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api"
Feb 19 21:49:32 crc kubenswrapper[4771]: E0219 21:49:32.594353 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="sg-core"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594361 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="sg-core"
Feb 19 21:49:32 crc kubenswrapper[4771]: E0219 21:49:32.594374 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="ceilometer-central-agent"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594380 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="ceilometer-central-agent"
Feb 19 21:49:32 crc kubenswrapper[4771]: E0219 21:49:32.594388 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api-log"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594394 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api-log"
Feb 19 21:49:32 crc kubenswrapper[4771]: E0219 21:49:32.594407 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="ceilometer-notification-agent"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594413 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="ceilometer-notification-agent"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594563 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594573 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="proxy-httpd"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594580 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="ceilometer-central-agent"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594593 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="sg-core"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594601 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" containerName="barbican-api-log"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.594614 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" containerName="ceilometer-notification-agent"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.596079 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.598294 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.599365 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.606254 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.606237566 podStartE2EDuration="3.606237566s" podCreationTimestamp="2026-02-19 21:49:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:32.533536401 +0000 UTC m=+1272.804978871" watchObservedRunningTime="2026-02-19 21:49:32.606237566 +0000 UTC m=+1272.877680036"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.606450 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.606923 4771 scope.go:117] "RemoveContainer" containerID="ff6e52627f8b1152d746602a27350ea7da0f00eb438702f45f295da242bf04ac"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.613674 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" podStartSLOduration=3.613656753 podStartE2EDuration="3.613656753s" podCreationTimestamp="2026-02-19 21:49:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:32.568659676 +0000 UTC m=+1272.840102156" watchObservedRunningTime="2026-02-19 21:49:32.613656753 +0000 UTC m=+1272.885099223"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.631011 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b4c966d9d-766ld"]
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.635451 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-log-httpd\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.635489 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-run-httpd\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.635534 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-config-data\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.635560 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.636146 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.636173 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-scripts\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.636395 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2gj9\" (UniqueName: \"kubernetes.io/projected/e456659d-0497-4509-b998-ee4b5de724d9-kube-api-access-s2gj9\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.638953 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6b4c966d9d-766ld"]
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.644231 4771 scope.go:117] "RemoveContainer" containerID="0c7e7bd5583eb8f512f402ebf8523dff6385edebe51004be10badb960f1b9987"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.664130 4771 scope.go:117] "RemoveContainer" containerID="c3cba4d268336b5047be73f17219d1e2dcda048724027bf6af3d39710442bf20"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.683980 4771 scope.go:117] "RemoveContainer" containerID="9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.705295 4771 scope.go:117] "RemoveContainer" containerID="23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.722038 4771 scope.go:117] "RemoveContainer" containerID="9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90"
Feb 19 21:49:32 crc kubenswrapper[4771]: E0219 21:49:32.722387 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90\": container with ID starting with 9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90 not found: ID does not exist" containerID="9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.722417 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90"} err="failed to get container status \"9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90\": rpc error: code = NotFound desc = could not find container \"9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90\": container with ID starting with 9c5f59c63411fcb9193c0aeef3d48143a6a5ba4cd031f9e4ac83d66c37420f90 not found: ID does not exist"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.722440 4771 scope.go:117] "RemoveContainer" containerID="23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46"
Feb 19 21:49:32 crc kubenswrapper[4771]: E0219 21:49:32.722797 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46\": container with ID starting with 23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46 not found: ID does not exist" containerID="23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.722835 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46"} err="failed to get container status \"23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46\": rpc error: code = NotFound desc = could not find container \"23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46\": container with ID starting with 23cefe3a6f5aec3611d18fede5bafbf52e1386c0790d06a793ff67a99c08fc46 not found: ID does not exist"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.738422 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-log-httpd\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.738448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-run-httpd\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.738498 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-config-data\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.738521 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.738542 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.738562 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-scripts\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.738611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2gj9\" (UniqueName: \"kubernetes.io/projected/e456659d-0497-4509-b998-ee4b5de724d9-kube-api-access-s2gj9\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.738918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-log-httpd\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.739103 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-run-httpd\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.742633 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-config-data\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.742789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.744313 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.744595 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-scripts\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.752505 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2gj9\" (UniqueName: \"kubernetes.io/projected/e456659d-0497-4509-b998-ee4b5de724d9-kube-api-access-s2gj9\") pod \"ceilometer-0\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " pod="openstack/ceilometer-0"
Feb 19 21:49:32 crc kubenswrapper[4771]: I0219 21:49:32.915363 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.558921 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.581934 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"db9323d6-6ed4-4e8f-bc34-c856cb767eee","Type":"ContainerStarted","Data":"2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2"}
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.584632 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerStarted","Data":"846a465caafaf697f57b70498e9d8c31aea9ef7f01267453cf36fce35d029aab"}
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.588284 4771 generic.go:334] "Generic (PLEG): container finished" podID="9657414b-a103-4415-8b0c-873f532271db" containerID="050c7646bddc86dead29a1986df826ed7afb73f99f9161f47f854552f6cce761" exitCode=0
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.588316 4771 generic.go:334] "Generic (PLEG): container finished" podID="9657414b-a103-4415-8b0c-873f532271db" containerID="34d955f6ce785ca5102ed3d08369096426c73428d7115ae6d11a102e42fd4390" exitCode=143
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.588358 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9657414b-a103-4415-8b0c-873f532271db","Type":"ContainerDied","Data":"050c7646bddc86dead29a1986df826ed7afb73f99f9161f47f854552f6cce761"}
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.588391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9657414b-a103-4415-8b0c-873f532271db","Type":"ContainerDied","Data":"34d955f6ce785ca5102ed3d08369096426c73428d7115ae6d11a102e42fd4390"}
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.588403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9657414b-a103-4415-8b0c-873f532271db","Type":"ContainerDied","Data":"30cf1e4eb6c0f93604dba20565b9d4be692e99cdf576881367d29669343b9920"}
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.588413 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30cf1e4eb6c0f93604dba20565b9d4be692e99cdf576881367d29669343b9920"
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.605755 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.757349788 podStartE2EDuration="4.605735668s" podCreationTimestamp="2026-02-19 21:49:29 +0000 UTC" firstStartedPulling="2026-02-19 21:49:30.329482978 +0000 UTC m=+1270.600925448" lastFinishedPulling="2026-02-19 21:49:31.177868818 +0000 UTC m=+1271.449311328" observedRunningTime="2026-02-19 21:49:33.59752881 +0000 UTC m=+1273.868971280" watchObservedRunningTime="2026-02-19 21:49:33.605735668 +0000 UTC m=+1273.877178138"
Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.653914 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.767793 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl8pm\" (UniqueName: \"kubernetes.io/projected/9657414b-a103-4415-8b0c-873f532271db-kube-api-access-zl8pm\") pod \"9657414b-a103-4415-8b0c-873f532271db\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.767856 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9657414b-a103-4415-8b0c-873f532271db-logs\") pod \"9657414b-a103-4415-8b0c-873f532271db\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.767892 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data-custom\") pod \"9657414b-a103-4415-8b0c-873f532271db\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.767931 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-combined-ca-bundle\") pod \"9657414b-a103-4415-8b0c-873f532271db\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.767948 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-scripts\") pod \"9657414b-a103-4415-8b0c-873f532271db\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.767980 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data\") pod \"9657414b-a103-4415-8b0c-873f532271db\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.768056 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9657414b-a103-4415-8b0c-873f532271db-etc-machine-id\") pod \"9657414b-a103-4415-8b0c-873f532271db\" (UID: \"9657414b-a103-4415-8b0c-873f532271db\") " Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.768440 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9657414b-a103-4415-8b0c-873f532271db-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9657414b-a103-4415-8b0c-873f532271db" (UID: "9657414b-a103-4415-8b0c-873f532271db"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.769216 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9657414b-a103-4415-8b0c-873f532271db-logs" (OuterVolumeSpecName: "logs") pod "9657414b-a103-4415-8b0c-873f532271db" (UID: "9657414b-a103-4415-8b0c-873f532271db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.774459 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9657414b-a103-4415-8b0c-873f532271db-kube-api-access-zl8pm" (OuterVolumeSpecName: "kube-api-access-zl8pm") pod "9657414b-a103-4415-8b0c-873f532271db" (UID: "9657414b-a103-4415-8b0c-873f532271db"). InnerVolumeSpecName "kube-api-access-zl8pm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.776440 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9657414b-a103-4415-8b0c-873f532271db" (UID: "9657414b-a103-4415-8b0c-873f532271db"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.789562 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-scripts" (OuterVolumeSpecName: "scripts") pod "9657414b-a103-4415-8b0c-873f532271db" (UID: "9657414b-a103-4415-8b0c-873f532271db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.819442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9657414b-a103-4415-8b0c-873f532271db" (UID: "9657414b-a103-4415-8b0c-873f532271db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.821352 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data" (OuterVolumeSpecName: "config-data") pod "9657414b-a103-4415-8b0c-873f532271db" (UID: "9657414b-a103-4415-8b0c-873f532271db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.870134 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9657414b-a103-4415-8b0c-873f532271db-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.870180 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl8pm\" (UniqueName: \"kubernetes.io/projected/9657414b-a103-4415-8b0c-873f532271db-kube-api-access-zl8pm\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.870202 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9657414b-a103-4415-8b0c-873f532271db-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.870220 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.870239 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.870255 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:33 crc kubenswrapper[4771]: I0219 21:49:33.870270 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9657414b-a103-4415-8b0c-873f532271db-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.454908 4771 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="63e1bdf2-42cd-4470-af16-8ebafd6580cf" path="/var/lib/kubelet/pods/63e1bdf2-42cd-4470-af16-8ebafd6580cf/volumes" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.456151 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102" path="/var/lib/kubelet/pods/732a349d-e1eb-4ca6-b1b7-9dfe6fbd0102/volumes" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.602868 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerStarted","Data":"797328426e0aa2206ab417aa2383a42589dfff89b0e57599eb711c18a60a46ab"} Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.602944 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.628788 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.642533 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.663100 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:34 crc kubenswrapper[4771]: E0219 21:49:34.663623 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9657414b-a103-4415-8b0c-873f532271db" containerName="cinder-api" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.663665 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9657414b-a103-4415-8b0c-873f532271db" containerName="cinder-api" Feb 19 21:49:34 crc kubenswrapper[4771]: E0219 21:49:34.663731 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9657414b-a103-4415-8b0c-873f532271db" containerName="cinder-api-log" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.663744 4771 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9657414b-a103-4415-8b0c-873f532271db" containerName="cinder-api-log" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.664074 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9657414b-a103-4415-8b0c-873f532271db" containerName="cinder-api-log" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.664106 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9657414b-a103-4415-8b0c-873f532271db" containerName="cinder-api" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.665683 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.677455 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.677831 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.677900 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.685055 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.743339 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.785748 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.785838 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.785864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-scripts\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.785912 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.785941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.786048 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8bp4\" (UniqueName: \"kubernetes.io/projected/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-kube-api-access-q8bp4\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.786094 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-logs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.786140 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.786228 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.905470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.905821 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.905855 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8bp4\" (UniqueName: \"kubernetes.io/projected/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-kube-api-access-q8bp4\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " 
pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.905621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.905899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-logs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.905953 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.906006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.906434 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-logs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.906525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.906588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.906618 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-scripts\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.909436 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.910502 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.910910 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 
21:49:34.911948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.918002 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-scripts\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.926500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8bp4\" (UniqueName: \"kubernetes.io/projected/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-kube-api-access-q8bp4\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:34 crc kubenswrapper[4771]: I0219 21:49:34.927005 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data\") pod \"cinder-api-0\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " pod="openstack/cinder-api-0" Feb 19 21:49:35 crc kubenswrapper[4771]: I0219 21:49:35.001529 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:49:35 crc kubenswrapper[4771]: I0219 21:49:35.469475 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:35 crc kubenswrapper[4771]: I0219 21:49:35.611639 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerStarted","Data":"489543f4c2581d9007a1cecc6d4a40185759151af2fd2cbb5dee52c8d835eb1b"} Feb 19 21:49:35 crc kubenswrapper[4771]: I0219 21:49:35.615041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e","Type":"ContainerStarted","Data":"a5c466f91dea18e630a557c52810ffe412a14d61ac7179f7205d3432342a3d4c"} Feb 19 21:49:36 crc kubenswrapper[4771]: I0219 21:49:36.457221 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9657414b-a103-4415-8b0c-873f532271db" path="/var/lib/kubelet/pods/9657414b-a103-4415-8b0c-873f532271db/volumes" Feb 19 21:49:36 crc kubenswrapper[4771]: I0219 21:49:36.632671 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e","Type":"ContainerStarted","Data":"4db106a42f266426c6257e14ae4da294c4869692c1f876160b68b9126d307faf"} Feb 19 21:49:36 crc kubenswrapper[4771]: I0219 21:49:36.644948 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerStarted","Data":"b97d6b46293a12fd9d5002169110b0ce2c18eb6532ec0575517600274e33c5ae"} Feb 19 21:49:37 crc kubenswrapper[4771]: I0219 21:49:37.652436 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e","Type":"ContainerStarted","Data":"b9023b235acac843c6d06970035afa4b4d903cfed719a74a48514a2a10f8f915"} Feb 19 21:49:37 crc kubenswrapper[4771]: I0219 
21:49:37.652831 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 21:49:37 crc kubenswrapper[4771]: I0219 21:49:37.654868 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerStarted","Data":"f004750bc4d1bf39d0aaab57fe1c4483dc09933c243421394310b3293f8fd926"} Feb 19 21:49:37 crc kubenswrapper[4771]: I0219 21:49:37.655034 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:49:37 crc kubenswrapper[4771]: I0219 21:49:37.678989 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.67897079 podStartE2EDuration="3.67897079s" podCreationTimestamp="2026-02-19 21:49:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:37.676441733 +0000 UTC m=+1277.947884203" watchObservedRunningTime="2026-02-19 21:49:37.67897079 +0000 UTC m=+1277.950413260" Feb 19 21:49:37 crc kubenswrapper[4771]: I0219 21:49:37.732166 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.046467209 podStartE2EDuration="5.732148886s" podCreationTimestamp="2026-02-19 21:49:32 +0000 UTC" firstStartedPulling="2026-02-19 21:49:33.566777221 +0000 UTC m=+1273.838219691" lastFinishedPulling="2026-02-19 21:49:37.252458888 +0000 UTC m=+1277.523901368" observedRunningTime="2026-02-19 21:49:37.727229704 +0000 UTC m=+1277.998672184" watchObservedRunningTime="2026-02-19 21:49:37.732148886 +0000 UTC m=+1278.003591356" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.327108 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.584102 4771 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/neutron-79b4d4dd8f-2sfll"] Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.584488 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79b4d4dd8f-2sfll" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-api" containerID="cri-o://7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd" gracePeriod=30 Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.584570 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79b4d4dd8f-2sfll" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-httpd" containerID="cri-o://75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b" gracePeriod=30 Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.594528 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-79b4d4dd8f-2sfll" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": EOF" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.629704 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c4d6b9785-889jt"] Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.631386 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.650072 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c4d6b9785-889jt"] Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.711836 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-httpd-config\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.712185 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-public-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.712211 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-config\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.712233 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-ovndb-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.712249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-combined-ca-bundle\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.712269 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-internal-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.712397 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks497\" (UniqueName: \"kubernetes.io/projected/b5526752-6549-40aa-8443-9ad3572799d2-kube-api-access-ks497\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.796133 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.814046 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-httpd-config\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.814119 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-public-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc 
kubenswrapper[4771]: I0219 21:49:39.814145 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-config\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.814170 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-ovndb-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.814186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-combined-ca-bundle\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.814203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-internal-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.814227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks497\" (UniqueName: \"kubernetes.io/projected/b5526752-6549-40aa-8443-9ad3572799d2-kube-api-access-ks497\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.822235 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-httpd-config\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.822731 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-public-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.847905 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-combined-ca-bundle\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.849169 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-config\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.851878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-internal-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.852556 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-ovndb-tls-certs\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.855929 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks497\" (UniqueName: \"kubernetes.io/projected/b5526752-6549-40aa-8443-9ad3572799d2-kube-api-access-ks497\") pod \"neutron-6c4d6b9785-889jt\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") " pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.876108 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-tsz8t"] Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.876332 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" podUID="c5790e42-f883-43b1-912b-4226fa3b8db1" containerName="dnsmasq-dns" containerID="cri-o://314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e" gracePeriod=10 Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.964297 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:39 crc kubenswrapper[4771]: I0219 21:49:39.982142 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.050790 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.360406 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.541100 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-nb\") pod \"c5790e42-f883-43b1-912b-4226fa3b8db1\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.541225 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-swift-storage-0\") pod \"c5790e42-f883-43b1-912b-4226fa3b8db1\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.541278 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhw9h\" (UniqueName: \"kubernetes.io/projected/c5790e42-f883-43b1-912b-4226fa3b8db1-kube-api-access-lhw9h\") pod \"c5790e42-f883-43b1-912b-4226fa3b8db1\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.541363 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-sb\") pod \"c5790e42-f883-43b1-912b-4226fa3b8db1\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.541402 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-config\") pod \"c5790e42-f883-43b1-912b-4226fa3b8db1\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.541473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-svc\") pod \"c5790e42-f883-43b1-912b-4226fa3b8db1\" (UID: \"c5790e42-f883-43b1-912b-4226fa3b8db1\") " Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.553296 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5790e42-f883-43b1-912b-4226fa3b8db1-kube-api-access-lhw9h" (OuterVolumeSpecName: "kube-api-access-lhw9h") pod "c5790e42-f883-43b1-912b-4226fa3b8db1" (UID: "c5790e42-f883-43b1-912b-4226fa3b8db1"). InnerVolumeSpecName "kube-api-access-lhw9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.597875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5790e42-f883-43b1-912b-4226fa3b8db1" (UID: "c5790e42-f883-43b1-912b-4226fa3b8db1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.600589 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5790e42-f883-43b1-912b-4226fa3b8db1" (UID: "c5790e42-f883-43b1-912b-4226fa3b8db1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.606705 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5790e42-f883-43b1-912b-4226fa3b8db1" (UID: "c5790e42-f883-43b1-912b-4226fa3b8db1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.608152 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c5790e42-f883-43b1-912b-4226fa3b8db1" (UID: "c5790e42-f883-43b1-912b-4226fa3b8db1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.615469 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-config" (OuterVolumeSpecName: "config") pod "c5790e42-f883-43b1-912b-4226fa3b8db1" (UID: "c5790e42-f883-43b1-912b-4226fa3b8db1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.644194 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.644436 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.644515 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.644585 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:40 crc 
kubenswrapper[4771]: I0219 21:49:40.644653 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5790e42-f883-43b1-912b-4226fa3b8db1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.644722 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhw9h\" (UniqueName: \"kubernetes.io/projected/c5790e42-f883-43b1-912b-4226fa3b8db1-kube-api-access-lhw9h\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.660566 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c4d6b9785-889jt"] Feb 19 21:49:40 crc kubenswrapper[4771]: W0219 21:49:40.665475 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5526752_6549_40aa_8443_9ad3572799d2.slice/crio-39393119ad43783eb2cb65c6296110c5c8ca46ff51c0cb0efa1a2788ca5fdd75 WatchSource:0}: Error finding container 39393119ad43783eb2cb65c6296110c5c8ca46ff51c0cb0efa1a2788ca5fdd75: Status 404 returned error can't find the container with id 39393119ad43783eb2cb65c6296110c5c8ca46ff51c0cb0efa1a2788ca5fdd75 Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.719554 4771 generic.go:334] "Generic (PLEG): container finished" podID="c5790e42-f883-43b1-912b-4226fa3b8db1" containerID="314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e" exitCode=0 Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.719617 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.719624 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" event={"ID":"c5790e42-f883-43b1-912b-4226fa3b8db1","Type":"ContainerDied","Data":"314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e"} Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.719739 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-tsz8t" event={"ID":"c5790e42-f883-43b1-912b-4226fa3b8db1","Type":"ContainerDied","Data":"3bdcd8e144d5ec9a1c44b77ba00113d4f2ed92e8df3974978ac48b2b8f33d970"} Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.719758 4771 scope.go:117] "RemoveContainer" containerID="314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.720698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4d6b9785-889jt" event={"ID":"b5526752-6549-40aa-8443-9ad3572799d2","Type":"ContainerStarted","Data":"39393119ad43783eb2cb65c6296110c5c8ca46ff51c0cb0efa1a2788ca5fdd75"} Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.722321 4771 generic.go:334] "Generic (PLEG): container finished" podID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerID="75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b" exitCode=0 Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.722479 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerName="cinder-scheduler" containerID="cri-o://95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a" gracePeriod=30 Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.723054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79b4d4dd8f-2sfll" 
event={"ID":"c8508bb1-418f-4cc7-8021-97c4a8e7e77f","Type":"ContainerDied","Data":"75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b"} Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.723096 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerName="probe" containerID="cri-o://2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2" gracePeriod=30 Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.752769 4771 scope.go:117] "RemoveContainer" containerID="fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.757515 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-tsz8t"] Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.765764 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-tsz8t"] Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.785086 4771 scope.go:117] "RemoveContainer" containerID="314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e" Feb 19 21:49:40 crc kubenswrapper[4771]: E0219 21:49:40.785465 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e\": container with ID starting with 314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e not found: ID does not exist" containerID="314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.785504 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e"} err="failed to get container status \"314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e\": rpc error: code = 
NotFound desc = could not find container \"314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e\": container with ID starting with 314a128442e3c742ed4495ea4fbb3447d3f737bf1370da58cfa49c00767f055e not found: ID does not exist" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.785525 4771 scope.go:117] "RemoveContainer" containerID="fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4" Feb 19 21:49:40 crc kubenswrapper[4771]: E0219 21:49:40.785839 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4\": container with ID starting with fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4 not found: ID does not exist" containerID="fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4" Feb 19 21:49:40 crc kubenswrapper[4771]: I0219 21:49:40.785860 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4"} err="failed to get container status \"fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4\": rpc error: code = NotFound desc = could not find container \"fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4\": container with ID starting with fd2fc04fe5546bc3f0315ab5d3ecd3794eb4f2c1f643d98466b3c1ad88f877d4 not found: ID does not exist" Feb 19 21:49:41 crc kubenswrapper[4771]: I0219 21:49:41.646614 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-79b4d4dd8f-2sfll" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": dial tcp 10.217.0.155:9696: connect: connection refused" Feb 19 21:49:41 crc kubenswrapper[4771]: I0219 21:49:41.732140 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4d6b9785-889jt" 
event={"ID":"b5526752-6549-40aa-8443-9ad3572799d2","Type":"ContainerStarted","Data":"89ed7fd8b6567c5530e1c7e743cf565b8674ecca14b4cf2d6f7229ab2c36478e"} Feb 19 21:49:41 crc kubenswrapper[4771]: I0219 21:49:41.732682 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4d6b9785-889jt" event={"ID":"b5526752-6549-40aa-8443-9ad3572799d2","Type":"ContainerStarted","Data":"28873d69596e9435ccc225be2172f05b9359e76cc1602b3c9c36e2902b461021"} Feb 19 21:49:41 crc kubenswrapper[4771]: I0219 21:49:41.732776 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:49:41 crc kubenswrapper[4771]: I0219 21:49:41.734747 4771 generic.go:334] "Generic (PLEG): container finished" podID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerID="2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2" exitCode=0 Feb 19 21:49:41 crc kubenswrapper[4771]: I0219 21:49:41.734794 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"db9323d6-6ed4-4e8f-bc34-c856cb767eee","Type":"ContainerDied","Data":"2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2"} Feb 19 21:49:41 crc kubenswrapper[4771]: I0219 21:49:41.778630 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c4d6b9785-889jt" podStartSLOduration=2.778611105 podStartE2EDuration="2.778611105s" podCreationTimestamp="2026-02-19 21:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:41.756513367 +0000 UTC m=+1282.027955877" watchObservedRunningTime="2026-02-19 21:49:41.778611105 +0000 UTC m=+1282.050053585" Feb 19 21:49:42 crc kubenswrapper[4771]: I0219 21:49:42.452677 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5790e42-f883-43b1-912b-4226fa3b8db1" 
path="/var/lib/kubelet/pods/c5790e42-f883-43b1-912b-4226fa3b8db1/volumes" Feb 19 21:49:42 crc kubenswrapper[4771]: I0219 21:49:42.956689 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:49:42 crc kubenswrapper[4771]: I0219 21:49:42.957196 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:49:42 crc kubenswrapper[4771]: I0219 21:49:42.957246 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:49:42 crc kubenswrapper[4771]: I0219 21:49:42.958122 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b980742e22e0e4d74d8de52154d96047cf8f0b8f6b46b5fe1e26926aa9dcd46b"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:49:42 crc kubenswrapper[4771]: I0219 21:49:42.958192 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://b980742e22e0e4d74d8de52154d96047cf8f0b8f6b46b5fe1e26926aa9dcd46b" gracePeriod=600 Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.194379 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.316328 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data\") pod \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.316407 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data-custom\") pod \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.316452 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxhpk\" (UniqueName: \"kubernetes.io/projected/db9323d6-6ed4-4e8f-bc34-c856cb767eee-kube-api-access-wxhpk\") pod \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.316530 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db9323d6-6ed4-4e8f-bc34-c856cb767eee-etc-machine-id\") pod \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.316565 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-combined-ca-bundle\") pod \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.316673 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-scripts\") pod \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\" (UID: \"db9323d6-6ed4-4e8f-bc34-c856cb767eee\") " Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.316914 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db9323d6-6ed4-4e8f-bc34-c856cb767eee-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "db9323d6-6ed4-4e8f-bc34-c856cb767eee" (UID: "db9323d6-6ed4-4e8f-bc34-c856cb767eee"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.317281 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db9323d6-6ed4-4e8f-bc34-c856cb767eee-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.322584 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-scripts" (OuterVolumeSpecName: "scripts") pod "db9323d6-6ed4-4e8f-bc34-c856cb767eee" (UID: "db9323d6-6ed4-4e8f-bc34-c856cb767eee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.325166 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "db9323d6-6ed4-4e8f-bc34-c856cb767eee" (UID: "db9323d6-6ed4-4e8f-bc34-c856cb767eee"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.329190 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db9323d6-6ed4-4e8f-bc34-c856cb767eee-kube-api-access-wxhpk" (OuterVolumeSpecName: "kube-api-access-wxhpk") pod "db9323d6-6ed4-4e8f-bc34-c856cb767eee" (UID: "db9323d6-6ed4-4e8f-bc34-c856cb767eee"). InnerVolumeSpecName "kube-api-access-wxhpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.384762 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.396676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db9323d6-6ed4-4e8f-bc34-c856cb767eee" (UID: "db9323d6-6ed4-4e8f-bc34-c856cb767eee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.418814 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.418851 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.418877 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxhpk\" (UniqueName: \"kubernetes.io/projected/db9323d6-6ed4-4e8f-bc34-c856cb767eee-kube-api-access-wxhpk\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.418894 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.426596 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data" (OuterVolumeSpecName: "config-data") pod "db9323d6-6ed4-4e8f-bc34-c856cb767eee" (UID: "db9323d6-6ed4-4e8f-bc34-c856cb767eee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.520343 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9323d6-6ed4-4e8f-bc34-c856cb767eee-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.669679 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.765632 4771 generic.go:334] "Generic (PLEG): container finished" podID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerID="95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a" exitCode=0 Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.765743 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"db9323d6-6ed4-4e8f-bc34-c856cb767eee","Type":"ContainerDied","Data":"95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a"} Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.765779 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"db9323d6-6ed4-4e8f-bc34-c856cb767eee","Type":"ContainerDied","Data":"8b537d499028a117bebba8df5dd19307177a89b31c348a4eb02391e62739f0f0"} Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.765809 4771 scope.go:117] "RemoveContainer" containerID="2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.766001 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.781998 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="b980742e22e0e4d74d8de52154d96047cf8f0b8f6b46b5fe1e26926aa9dcd46b" exitCode=0 Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.790665 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"b980742e22e0e4d74d8de52154d96047cf8f0b8f6b46b5fe1e26926aa9dcd46b"} Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.790716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"3de117c6f677c9430a82ad131a4b88d218c50126d4fd008ad4d29c8f59650395"} Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.818568 4771 scope.go:117] "RemoveContainer" containerID="95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.848357 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.863411 4771 scope.go:117] "RemoveContainer" containerID="2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2" Feb 19 21:49:43 crc kubenswrapper[4771]: E0219 21:49:43.863826 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2\": container with ID starting with 2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2 not found: ID does not exist" containerID="2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2" Feb 19 21:49:43 crc 
kubenswrapper[4771]: I0219 21:49:43.863858 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2"} err="failed to get container status \"2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2\": rpc error: code = NotFound desc = could not find container \"2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2\": container with ID starting with 2ed125ca2f10f6d7b9a95245f1404702b296a746ac2a8faba83651ca55fb8ea2 not found: ID does not exist" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.863877 4771 scope.go:117] "RemoveContainer" containerID="95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a" Feb 19 21:49:43 crc kubenswrapper[4771]: E0219 21:49:43.864437 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a\": container with ID starting with 95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a not found: ID does not exist" containerID="95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.864468 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a"} err="failed to get container status \"95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a\": rpc error: code = NotFound desc = could not find container \"95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a\": container with ID starting with 95f78b64c251a17e88cc54d8725ab2c4402d20e56d66db85e59d1c284461e37a not found: ID does not exist" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.864482 4771 scope.go:117] "RemoveContainer" containerID="de6552d35b845cd34ff7f7f99fd7b414fe0ca3adffdf00a9aac825e2b4fe5659" Feb 19 
21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.872228 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.892427 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:43 crc kubenswrapper[4771]: E0219 21:49:43.892866 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5790e42-f883-43b1-912b-4226fa3b8db1" containerName="dnsmasq-dns" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.892878 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5790e42-f883-43b1-912b-4226fa3b8db1" containerName="dnsmasq-dns" Feb 19 21:49:43 crc kubenswrapper[4771]: E0219 21:49:43.892887 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerName="cinder-scheduler" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.892893 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerName="cinder-scheduler" Feb 19 21:49:43 crc kubenswrapper[4771]: E0219 21:49:43.892911 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerName="probe" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.892917 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerName="probe" Feb 19 21:49:43 crc kubenswrapper[4771]: E0219 21:49:43.892941 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5790e42-f883-43b1-912b-4226fa3b8db1" containerName="init" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.892948 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5790e42-f883-43b1-912b-4226fa3b8db1" containerName="init" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.893112 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerName="cinder-scheduler" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.893129 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5790e42-f883-43b1-912b-4226fa3b8db1" containerName="dnsmasq-dns" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.893146 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" containerName="probe" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.896139 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.900183 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.906894 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.937804 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e1653c4-41e5-403a-ae0d-9c1a5762f287-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.937889 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-scripts\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.937988 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.938040 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.938060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.938091 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlvtj\" (UniqueName: \"kubernetes.io/projected/4e1653c4-41e5-403a-ae0d-9c1a5762f287-kube-api-access-wlvtj\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.967318 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b5d9cf798-b8q9j"] Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.973650 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:43 crc kubenswrapper[4771]: I0219 21:49:43.980683 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b5d9cf798-b8q9j"] Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041460 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041504 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkh2p\" (UniqueName: \"kubernetes.io/projected/680c2961-e33c-44b1-aadd-37556bf4839c-kube-api-access-rkh2p\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041533 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/680c2961-e33c-44b1-aadd-37556bf4839c-logs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041548 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041568 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlvtj\" (UniqueName: 
\"kubernetes.io/projected/4e1653c4-41e5-403a-ae0d-9c1a5762f287-kube-api-access-wlvtj\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041743 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e1653c4-41e5-403a-ae0d-9c1a5762f287-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041854 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-internal-tls-certs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041859 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e1653c4-41e5-403a-ae0d-9c1a5762f287-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041899 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041932 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-scripts\") pod \"cinder-scheduler-0\" 
(UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.041978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-public-tls-certs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.042092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.042116 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-combined-ca-bundle\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.042150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.045796 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-scripts\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc 
kubenswrapper[4771]: I0219 21:49:44.047439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.049824 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.058006 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.059839 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlvtj\" (UniqueName: \"kubernetes.io/projected/4e1653c4-41e5-403a-ae0d-9c1a5762f287-kube-api-access-wlvtj\") pod \"cinder-scheduler-0\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.143357 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-internal-tls-certs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.143410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.143447 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-public-tls-certs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.143497 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-combined-ca-bundle\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.143532 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkh2p\" (UniqueName: \"kubernetes.io/projected/680c2961-e33c-44b1-aadd-37556bf4839c-kube-api-access-rkh2p\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.143556 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/680c2961-e33c-44b1-aadd-37556bf4839c-logs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.143571 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts\") pod 
\"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.144912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/680c2961-e33c-44b1-aadd-37556bf4839c-logs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.146902 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-internal-tls-certs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.146950 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.147535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-combined-ca-bundle\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.158545 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc 
kubenswrapper[4771]: I0219 21:49:44.158725 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-public-tls-certs\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.160952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkh2p\" (UniqueName: \"kubernetes.io/projected/680c2961-e33c-44b1-aadd-37556bf4839c-kube-api-access-rkh2p\") pod \"placement-b5d9cf798-b8q9j\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.213062 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.332590 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.449040 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db9323d6-6ed4-4e8f-bc34-c856cb767eee" path="/var/lib/kubelet/pods/db9323d6-6ed4-4e8f-bc34-c856cb767eee/volumes" Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.679126 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.796135 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b5d9cf798-b8q9j"] Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.802521 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e1653c4-41e5-403a-ae0d-9c1a5762f287","Type":"ContainerStarted","Data":"76c714da43fed73cdc5e041d1d5609867a3027b67b042963530d08b46a3c974f"} Feb 19 21:49:44 crc kubenswrapper[4771]: W0219 21:49:44.810203 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod680c2961_e33c_44b1_aadd_37556bf4839c.slice/crio-9fa692d6c39236751821891e0aac5e950e9beaa94fe51a185f962ab38554c09b WatchSource:0}: Error finding container 9fa692d6c39236751821891e0aac5e950e9beaa94fe51a185f962ab38554c09b: Status 404 returned error can't find the container with id 9fa692d6c39236751821891e0aac5e950e9beaa94fe51a185f962ab38554c09b Feb 19 21:49:44 crc kubenswrapper[4771]: I0219 21:49:44.907666 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.634572 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.685229 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-public-tls-certs\") pod \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.685488 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-httpd-config\") pod \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.685687 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-config\") pod \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.685772 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qx52\" (UniqueName: \"kubernetes.io/projected/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-kube-api-access-9qx52\") pod \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.685859 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-combined-ca-bundle\") pod \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.685957 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-internal-tls-certs\") pod \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.686054 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-ovndb-tls-certs\") pod \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\" (UID: \"c8508bb1-418f-4cc7-8021-97c4a8e7e77f\") " Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.709113 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-kube-api-access-9qx52" (OuterVolumeSpecName: "kube-api-access-9qx52") pod "c8508bb1-418f-4cc7-8021-97c4a8e7e77f" (UID: "c8508bb1-418f-4cc7-8021-97c4a8e7e77f"). InnerVolumeSpecName "kube-api-access-9qx52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.709188 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c8508bb1-418f-4cc7-8021-97c4a8e7e77f" (UID: "c8508bb1-418f-4cc7-8021-97c4a8e7e77f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.769322 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c8508bb1-418f-4cc7-8021-97c4a8e7e77f" (UID: "c8508bb1-418f-4cc7-8021-97c4a8e7e77f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.783760 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-config" (OuterVolumeSpecName: "config") pod "c8508bb1-418f-4cc7-8021-97c4a8e7e77f" (UID: "c8508bb1-418f-4cc7-8021-97c4a8e7e77f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.798266 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.798291 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qx52\" (UniqueName: \"kubernetes.io/projected/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-kube-api-access-9qx52\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.798301 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.798310 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.807192 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8508bb1-418f-4cc7-8021-97c4a8e7e77f" (UID: "c8508bb1-418f-4cc7-8021-97c4a8e7e77f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.815705 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c8508bb1-418f-4cc7-8021-97c4a8e7e77f" (UID: "c8508bb1-418f-4cc7-8021-97c4a8e7e77f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.828918 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c8508bb1-418f-4cc7-8021-97c4a8e7e77f" (UID: "c8508bb1-418f-4cc7-8021-97c4a8e7e77f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.848429 4771 generic.go:334] "Generic (PLEG): container finished" podID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerID="7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd" exitCode=0 Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.848515 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79b4d4dd8f-2sfll" event={"ID":"c8508bb1-418f-4cc7-8021-97c4a8e7e77f","Type":"ContainerDied","Data":"7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd"} Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.848572 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79b4d4dd8f-2sfll" event={"ID":"c8508bb1-418f-4cc7-8021-97c4a8e7e77f","Type":"ContainerDied","Data":"47883bffcaaeae6d37824869dfc1061203ba56f17529ff026522fb0bfdf4a8ba"} Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.848589 4771 scope.go:117] "RemoveContainer" containerID="75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b" Feb 19 
21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.848741 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79b4d4dd8f-2sfll" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.873976 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e1653c4-41e5-403a-ae0d-9c1a5762f287","Type":"ContainerStarted","Data":"f5328ae68d5231323e69942e8f00f3eca7b2e290877a4c8daac72b1fb1c774f5"} Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.897599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b5d9cf798-b8q9j" event={"ID":"680c2961-e33c-44b1-aadd-37556bf4839c","Type":"ContainerStarted","Data":"0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771"} Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.897654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b5d9cf798-b8q9j" event={"ID":"680c2961-e33c-44b1-aadd-37556bf4839c","Type":"ContainerStarted","Data":"eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2"} Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.897669 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b5d9cf798-b8q9j" event={"ID":"680c2961-e33c-44b1-aadd-37556bf4839c","Type":"ContainerStarted","Data":"9fa692d6c39236751821891e0aac5e950e9beaa94fe51a185f962ab38554c09b"} Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.899080 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.899119 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.903244 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.903268 4771 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.903291 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8508bb1-418f-4cc7-8021-97c4a8e7e77f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.936162 4771 scope.go:117] "RemoveContainer" containerID="7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.965473 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b5d9cf798-b8q9j" podStartSLOduration=2.965453321 podStartE2EDuration="2.965453321s" podCreationTimestamp="2026-02-19 21:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:45.917387841 +0000 UTC m=+1286.188830311" watchObservedRunningTime="2026-02-19 21:49:45.965453321 +0000 UTC m=+1286.236895791" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.974277 4771 scope.go:117] "RemoveContainer" containerID="75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b" Feb 19 21:49:45 crc kubenswrapper[4771]: E0219 21:49:45.975069 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b\": container with ID starting with 75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b not found: ID does not exist" 
containerID="75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.975327 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b"} err="failed to get container status \"75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b\": rpc error: code = NotFound desc = could not find container \"75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b\": container with ID starting with 75ba9e96fd1d5c9264b5b978486a273c05bc36520ade55e9ad2efe039221af1b not found: ID does not exist" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.975366 4771 scope.go:117] "RemoveContainer" containerID="7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd" Feb 19 21:49:45 crc kubenswrapper[4771]: E0219 21:49:45.976052 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd\": container with ID starting with 7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd not found: ID does not exist" containerID="7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.976084 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd"} err="failed to get container status \"7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd\": rpc error: code = NotFound desc = could not find container \"7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd\": container with ID starting with 7d2551aedd775520b4f16b1c19250bdab301f2444b31d26b7eaf792106a194cd not found: ID does not exist" Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.990422 4771 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/neutron-79b4d4dd8f-2sfll"] Feb 19 21:49:45 crc kubenswrapper[4771]: I0219 21:49:45.994892 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79b4d4dd8f-2sfll"] Feb 19 21:49:46 crc kubenswrapper[4771]: I0219 21:49:46.449213 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" path="/var/lib/kubelet/pods/c8508bb1-418f-4cc7-8021-97c4a8e7e77f/volumes" Feb 19 21:49:46 crc kubenswrapper[4771]: I0219 21:49:46.804378 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 21:49:46 crc kubenswrapper[4771]: I0219 21:49:46.908085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e1653c4-41e5-403a-ae0d-9c1a5762f287","Type":"ContainerStarted","Data":"53dafb13a5c390196d3cb19b500110a7ca153d256650b8c84a399c53e300934e"} Feb 19 21:49:46 crc kubenswrapper[4771]: I0219 21:49:46.939882 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.939861085 podStartE2EDuration="3.939861085s" podCreationTimestamp="2026-02-19 21:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:46.93026926 +0000 UTC m=+1287.201711760" watchObservedRunningTime="2026-02-19 21:49:46.939861085 +0000 UTC m=+1287.211303555" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.213436 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.537605 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6c7696df7-bt9vd"] Feb 19 21:49:49 crc kubenswrapper[4771]: E0219 21:49:49.538293 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-api" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.538314 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-api" Feb 19 21:49:49 crc kubenswrapper[4771]: E0219 21:49:49.538323 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-httpd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.538330 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-httpd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.538531 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-api" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.538556 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8508bb1-418f-4cc7-8021-97c4a8e7e77f" containerName="neutron-httpd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.539526 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.542636 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.542875 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.543041 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.549476 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c7696df7-bt9vd"] Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.580432 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-internal-tls-certs\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.580499 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-public-tls-certs\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.580647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-config-data\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 
21:49:49.580686 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjhlb\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-kube-api-access-sjhlb\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.580727 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-log-httpd\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.580794 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-run-httpd\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.580841 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-etc-swift\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.581105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-combined-ca-bundle\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc 
kubenswrapper[4771]: I0219 21:49:49.682547 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-internal-tls-certs\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.682593 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-public-tls-certs\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.682662 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-config-data\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.682687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjhlb\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-kube-api-access-sjhlb\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.682712 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-log-httpd\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.682753 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-run-httpd\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.682799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-etc-swift\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.682853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-combined-ca-bundle\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.683948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-log-httpd\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.684074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-run-httpd\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.689123 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-internal-tls-certs\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.689242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-public-tls-certs\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.689465 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-etc-swift\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.700404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-combined-ca-bundle\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.702070 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-config-data\") pod \"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.709891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjhlb\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-kube-api-access-sjhlb\") pod 
\"swift-proxy-6c7696df7-bt9vd\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") " pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.821720 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xgm28"] Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.822742 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.838733 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xgm28"] Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.862618 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.887180 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/989cc18d-2aca-4d05-a256-8908d0c5ac60-operator-scripts\") pod \"nova-api-db-create-xgm28\" (UID: \"989cc18d-2aca-4d05-a256-8908d0c5ac60\") " pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.887486 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95lp6\" (UniqueName: \"kubernetes.io/projected/989cc18d-2aca-4d05-a256-8908d0c5ac60-kube-api-access-95lp6\") pod \"nova-api-db-create-xgm28\" (UID: \"989cc18d-2aca-4d05-a256-8908d0c5ac60\") " pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.930402 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zpprp"] Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.945142 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.953729 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zpprp"] Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.990582 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/989cc18d-2aca-4d05-a256-8908d0c5ac60-operator-scripts\") pod \"nova-api-db-create-xgm28\" (UID: \"989cc18d-2aca-4d05-a256-8908d0c5ac60\") " pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.990643 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95lp6\" (UniqueName: \"kubernetes.io/projected/989cc18d-2aca-4d05-a256-8908d0c5ac60-kube-api-access-95lp6\") pod \"nova-api-db-create-xgm28\" (UID: \"989cc18d-2aca-4d05-a256-8908d0c5ac60\") " pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.990679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/480beb32-325d-4cdd-bda8-428a55bcf4d4-operator-scripts\") pod \"nova-cell0-db-create-zpprp\" (UID: \"480beb32-325d-4cdd-bda8-428a55bcf4d4\") " pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.990718 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69mgp\" (UniqueName: \"kubernetes.io/projected/480beb32-325d-4cdd-bda8-428a55bcf4d4-kube-api-access-69mgp\") pod \"nova-cell0-db-create-zpprp\" (UID: \"480beb32-325d-4cdd-bda8-428a55bcf4d4\") " pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.991826 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/989cc18d-2aca-4d05-a256-8908d0c5ac60-operator-scripts\") pod \"nova-api-db-create-xgm28\" (UID: \"989cc18d-2aca-4d05-a256-8908d0c5ac60\") " pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.996981 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-71de-account-create-update-2djdv"] Feb 19 21:49:49 crc kubenswrapper[4771]: I0219 21:49:49.998499 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.004212 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-71de-account-create-update-2djdv"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.022927 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.034219 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.039713 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.042991 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.044901 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.047412 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-skhvw" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.047548 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.068178 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95lp6\" (UniqueName: \"kubernetes.io/projected/989cc18d-2aca-4d05-a256-8908d0c5ac60-kube-api-access-95lp6\") pod \"nova-api-db-create-xgm28\" (UID: \"989cc18d-2aca-4d05-a256-8908d0c5ac60\") " pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.092483 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2sln\" (UniqueName: \"kubernetes.io/projected/8a3d5744-034c-43b6-851e-ffc1fb4eca48-kube-api-access-g2sln\") pod \"nova-api-71de-account-create-update-2djdv\" (UID: \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\") " pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.092538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/480beb32-325d-4cdd-bda8-428a55bcf4d4-operator-scripts\") pod \"nova-cell0-db-create-zpprp\" (UID: \"480beb32-325d-4cdd-bda8-428a55bcf4d4\") " pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:50 crc kubenswrapper[4771]: 
I0219 21:49:50.092568 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.092597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69mgp\" (UniqueName: \"kubernetes.io/projected/480beb32-325d-4cdd-bda8-428a55bcf4d4-kube-api-access-69mgp\") pod \"nova-cell0-db-create-zpprp\" (UID: \"480beb32-325d-4cdd-bda8-428a55bcf4d4\") " pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.092624 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.092645 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a3d5744-034c-43b6-851e-ffc1fb4eca48-operator-scripts\") pod \"nova-api-71de-account-create-update-2djdv\" (UID: \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\") " pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.092692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5v8b\" (UniqueName: \"kubernetes.io/projected/8efa5dd0-e91b-456b-b295-e45608c03c36-kube-api-access-k5v8b\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc 
kubenswrapper[4771]: I0219 21:49:50.092748 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config-secret\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.102226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/480beb32-325d-4cdd-bda8-428a55bcf4d4-operator-scripts\") pod \"nova-cell0-db-create-zpprp\" (UID: \"480beb32-325d-4cdd-bda8-428a55bcf4d4\") " pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.125804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69mgp\" (UniqueName: \"kubernetes.io/projected/480beb32-325d-4cdd-bda8-428a55bcf4d4-kube-api-access-69mgp\") pod \"nova-cell0-db-create-zpprp\" (UID: \"480beb32-325d-4cdd-bda8-428a55bcf4d4\") " pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.144177 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qwjfh"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.144810 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.146276 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.165062 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-2zlx2"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.166184 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.172877 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.174601 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qwjfh"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.189073 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-2zlx2"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.194243 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2sln\" (UniqueName: \"kubernetes.io/projected/8a3d5744-034c-43b6-851e-ffc1fb4eca48-kube-api-access-g2sln\") pod \"nova-api-71de-account-create-update-2djdv\" (UID: \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\") " pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.195781 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.195894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63630cc-8a7e-4c82-a69b-3babe4a33e43-operator-scripts\") pod \"nova-cell0-c5fb-account-create-update-2zlx2\" (UID: \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\") " pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.195980 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.196074 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx4pq\" (UniqueName: \"kubernetes.io/projected/dcc226cf-f7e9-47c7-b553-b6645f504d4d-kube-api-access-vx4pq\") pod \"nova-cell1-db-create-qwjfh\" (UID: \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\") " pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.196156 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a3d5744-034c-43b6-851e-ffc1fb4eca48-operator-scripts\") pod \"nova-api-71de-account-create-update-2djdv\" (UID: \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\") " pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.196273 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5v8b\" (UniqueName: \"kubernetes.io/projected/8efa5dd0-e91b-456b-b295-e45608c03c36-kube-api-access-k5v8b\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.196353 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsq7j\" (UniqueName: \"kubernetes.io/projected/e63630cc-8a7e-4c82-a69b-3babe4a33e43-kube-api-access-hsq7j\") pod \"nova-cell0-c5fb-account-create-update-2zlx2\" (UID: \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\") " pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.196496 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config-secret\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.196583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc226cf-f7e9-47c7-b553-b6645f504d4d-operator-scripts\") pod \"nova-cell1-db-create-qwjfh\" (UID: \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\") " pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.197335 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.198060 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a3d5744-034c-43b6-851e-ffc1fb4eca48-operator-scripts\") pod \"nova-api-71de-account-create-update-2djdv\" (UID: \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\") " pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.207064 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config-secret\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.209985 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2sln\" (UniqueName: 
\"kubernetes.io/projected/8a3d5744-034c-43b6-851e-ffc1fb4eca48-kube-api-access-g2sln\") pod \"nova-api-71de-account-create-update-2djdv\" (UID: \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\") " pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.210520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.213998 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5v8b\" (UniqueName: \"kubernetes.io/projected/8efa5dd0-e91b-456b-b295-e45608c03c36-kube-api-access-k5v8b\") pod \"openstackclient\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.298935 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.299800 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc226cf-f7e9-47c7-b553-b6645f504d4d-operator-scripts\") pod \"nova-cell1-db-create-qwjfh\" (UID: \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\") " pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.300920 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc226cf-f7e9-47c7-b553-b6645f504d4d-operator-scripts\") pod \"nova-cell1-db-create-qwjfh\" (UID: \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\") " pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.301584 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63630cc-8a7e-4c82-a69b-3babe4a33e43-operator-scripts\") pod \"nova-cell0-c5fb-account-create-update-2zlx2\" (UID: \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\") " pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.301634 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63630cc-8a7e-4c82-a69b-3babe4a33e43-operator-scripts\") pod \"nova-cell0-c5fb-account-create-update-2zlx2\" (UID: \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\") " pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.301710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx4pq\" (UniqueName: \"kubernetes.io/projected/dcc226cf-f7e9-47c7-b553-b6645f504d4d-kube-api-access-vx4pq\") pod \"nova-cell1-db-create-qwjfh\" (UID: 
\"dcc226cf-f7e9-47c7-b553-b6645f504d4d\") " pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.302248 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsq7j\" (UniqueName: \"kubernetes.io/projected/e63630cc-8a7e-4c82-a69b-3babe4a33e43-kube-api-access-hsq7j\") pod \"nova-cell0-c5fb-account-create-update-2zlx2\" (UID: \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\") " pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.322150 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx4pq\" (UniqueName: \"kubernetes.io/projected/dcc226cf-f7e9-47c7-b553-b6645f504d4d-kube-api-access-vx4pq\") pod \"nova-cell1-db-create-qwjfh\" (UID: \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\") " pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.322960 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-jdsz9"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.323135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsq7j\" (UniqueName: \"kubernetes.io/projected/e63630cc-8a7e-4c82-a69b-3babe4a33e43-kube-api-access-hsq7j\") pod \"nova-cell0-c5fb-account-create-update-2zlx2\" (UID: \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\") " pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.324047 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.328072 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.332572 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.335811 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-jdsz9"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.369421 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.403787 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d98gt\" (UniqueName: \"kubernetes.io/projected/f68b419d-929c-4ea3-a95c-dd6a436248a3-kube-api-access-d98gt\") pod \"nova-cell1-70bf-account-create-update-jdsz9\" (UID: \"f68b419d-929c-4ea3-a95c-dd6a436248a3\") " pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.403948 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68b419d-929c-4ea3-a95c-dd6a436248a3-operator-scripts\") pod \"nova-cell1-70bf-account-create-update-jdsz9\" (UID: \"f68b419d-929c-4ea3-a95c-dd6a436248a3\") " pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.423971 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.424517 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="ceilometer-central-agent" containerID="cri-o://797328426e0aa2206ab417aa2383a42589dfff89b0e57599eb711c18a60a46ab" gracePeriod=30 Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.425167 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="proxy-httpd" containerID="cri-o://f004750bc4d1bf39d0aaab57fe1c4483dc09933c243421394310b3293f8fd926" gracePeriod=30 Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.425232 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="ceilometer-notification-agent" containerID="cri-o://489543f4c2581d9007a1cecc6d4a40185759151af2fd2cbb5dee52c8d835eb1b" gracePeriod=30 Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.425294 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="sg-core" containerID="cri-o://b97d6b46293a12fd9d5002169110b0ce2c18eb6532ec0575517600274e33c5ae" gracePeriod=30 Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.440614 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": EOF" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.483898 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.506307 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68b419d-929c-4ea3-a95c-dd6a436248a3-operator-scripts\") pod \"nova-cell1-70bf-account-create-update-jdsz9\" (UID: \"f68b419d-929c-4ea3-a95c-dd6a436248a3\") " pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.506374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d98gt\" (UniqueName: \"kubernetes.io/projected/f68b419d-929c-4ea3-a95c-dd6a436248a3-kube-api-access-d98gt\") pod \"nova-cell1-70bf-account-create-update-jdsz9\" (UID: \"f68b419d-929c-4ea3-a95c-dd6a436248a3\") " pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.507678 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.508179 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68b419d-929c-4ea3-a95c-dd6a436248a3-operator-scripts\") pod \"nova-cell1-70bf-account-create-update-jdsz9\" (UID: \"f68b419d-929c-4ea3-a95c-dd6a436248a3\") " pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.549109 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d98gt\" (UniqueName: \"kubernetes.io/projected/f68b419d-929c-4ea3-a95c-dd6a436248a3-kube-api-access-d98gt\") pod \"nova-cell1-70bf-account-create-update-jdsz9\" (UID: \"f68b419d-929c-4ea3-a95c-dd6a436248a3\") " pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.562092 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6c7696df7-bt9vd"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.648831 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.673330 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xgm28"] Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.863131 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zpprp"] Feb 19 21:49:50 crc kubenswrapper[4771]: W0219 21:49:50.881955 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod480beb32_325d_4cdd_bda8_428a55bcf4d4.slice/crio-8b6ba713c482aee8fe00fb7dbda184ae3fb77f90f44659a18537773d2fb6a393 WatchSource:0}: Error finding container 8b6ba713c482aee8fe00fb7dbda184ae3fb77f90f44659a18537773d2fb6a393: Status 404 returned error can't find the container with id 8b6ba713c482aee8fe00fb7dbda184ae3fb77f90f44659a18537773d2fb6a393 Feb 19 21:49:50 crc kubenswrapper[4771]: I0219 21:49:50.969737 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-71de-account-create-update-2djdv"] Feb 19 21:49:51 crc kubenswrapper[4771]: W0219 21:49:51.014227 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3d5744_034c_43b6_851e_ffc1fb4eca48.slice/crio-d5b5915e53f4ecec848e887c0b166e94420eb8b4ca4480307fa5ecb1d0d5bd5b WatchSource:0}: Error finding container d5b5915e53f4ecec848e887c0b166e94420eb8b4ca4480307fa5ecb1d0d5bd5b: Status 404 returned error can't find the container with id d5b5915e53f4ecec848e887c0b166e94420eb8b4ca4480307fa5ecb1d0d5bd5b Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.047406 4771 generic.go:334] "Generic (PLEG): container finished" podID="e456659d-0497-4509-b998-ee4b5de724d9" containerID="f004750bc4d1bf39d0aaab57fe1c4483dc09933c243421394310b3293f8fd926" exitCode=0 Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.047443 4771 
generic.go:334] "Generic (PLEG): container finished" podID="e456659d-0497-4509-b998-ee4b5de724d9" containerID="b97d6b46293a12fd9d5002169110b0ce2c18eb6532ec0575517600274e33c5ae" exitCode=2 Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.047454 4771 generic.go:334] "Generic (PLEG): container finished" podID="e456659d-0497-4509-b998-ee4b5de724d9" containerID="489543f4c2581d9007a1cecc6d4a40185759151af2fd2cbb5dee52c8d835eb1b" exitCode=0 Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.047461 4771 generic.go:334] "Generic (PLEG): container finished" podID="e456659d-0497-4509-b998-ee4b5de724d9" containerID="797328426e0aa2206ab417aa2383a42589dfff89b0e57599eb711c18a60a46ab" exitCode=0 Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.047561 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerDied","Data":"f004750bc4d1bf39d0aaab57fe1c4483dc09933c243421394310b3293f8fd926"} Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.047607 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerDied","Data":"b97d6b46293a12fd9d5002169110b0ce2c18eb6532ec0575517600274e33c5ae"} Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.047619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerDied","Data":"489543f4c2581d9007a1cecc6d4a40185759151af2fd2cbb5dee52c8d835eb1b"} Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.047628 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerDied","Data":"797328426e0aa2206ab417aa2383a42589dfff89b0e57599eb711c18a60a46ab"} Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.060288 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-db-create-zpprp" event={"ID":"480beb32-325d-4cdd-bda8-428a55bcf4d4","Type":"ContainerStarted","Data":"8b6ba713c482aee8fe00fb7dbda184ae3fb77f90f44659a18537773d2fb6a393"} Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.077851 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xgm28" event={"ID":"989cc18d-2aca-4d05-a256-8908d0c5ac60","Type":"ContainerStarted","Data":"0a75c774acf0bd8c3048724e044fb5b70968357a9b45d80d4f94c5a797b88577"} Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.089329 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c7696df7-bt9vd" event={"ID":"13181273-c19b-46fa-a087-da5695148b1f","Type":"ContainerStarted","Data":"615ad1804139378182b1a5e5ee653bd158cf7f61b79685e05b5fcf774e8f84e5"} Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.114561 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.130382 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qwjfh"] Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.262210 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-2zlx2"] Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.397143 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-jdsz9"] Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.553884 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.648547 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-sg-core-conf-yaml\") pod \"e456659d-0497-4509-b998-ee4b5de724d9\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.648798 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-combined-ca-bundle\") pod \"e456659d-0497-4509-b998-ee4b5de724d9\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.648886 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-config-data\") pod \"e456659d-0497-4509-b998-ee4b5de724d9\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.648988 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-log-httpd\") pod \"e456659d-0497-4509-b998-ee4b5de724d9\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.649355 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2gj9\" (UniqueName: \"kubernetes.io/projected/e456659d-0497-4509-b998-ee4b5de724d9-kube-api-access-s2gj9\") pod \"e456659d-0497-4509-b998-ee4b5de724d9\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.649439 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-run-httpd\") pod \"e456659d-0497-4509-b998-ee4b5de724d9\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.649550 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-scripts\") pod \"e456659d-0497-4509-b998-ee4b5de724d9\" (UID: \"e456659d-0497-4509-b998-ee4b5de724d9\") " Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.653562 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e456659d-0497-4509-b998-ee4b5de724d9" (UID: "e456659d-0497-4509-b998-ee4b5de724d9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.653973 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e456659d-0497-4509-b998-ee4b5de724d9" (UID: "e456659d-0497-4509-b998-ee4b5de724d9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.655868 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-scripts" (OuterVolumeSpecName: "scripts") pod "e456659d-0497-4509-b998-ee4b5de724d9" (UID: "e456659d-0497-4509-b998-ee4b5de724d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.661170 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e456659d-0497-4509-b998-ee4b5de724d9-kube-api-access-s2gj9" (OuterVolumeSpecName: "kube-api-access-s2gj9") pod "e456659d-0497-4509-b998-ee4b5de724d9" (UID: "e456659d-0497-4509-b998-ee4b5de724d9"). InnerVolumeSpecName "kube-api-access-s2gj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.751327 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.751350 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2gj9\" (UniqueName: \"kubernetes.io/projected/e456659d-0497-4509-b998-ee4b5de724d9-kube-api-access-s2gj9\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.751360 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e456659d-0497-4509-b998-ee4b5de724d9-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.751368 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.776439 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e456659d-0497-4509-b998-ee4b5de724d9" (UID: "e456659d-0497-4509-b998-ee4b5de724d9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.817206 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e456659d-0497-4509-b998-ee4b5de724d9" (UID: "e456659d-0497-4509-b998-ee4b5de724d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.856311 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.856334 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.895073 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-config-data" (OuterVolumeSpecName: "config-data") pod "e456659d-0497-4509-b998-ee4b5de724d9" (UID: "e456659d-0497-4509-b998-ee4b5de724d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:51 crc kubenswrapper[4771]: I0219 21:49:51.957513 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e456659d-0497-4509-b998-ee4b5de724d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:52 crc kubenswrapper[4771]: E0219 21:49:52.095300 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3d5744_034c_43b6_851e_ffc1fb4eca48.slice/crio-91cf6aa4394a77413c6ac8c75e4e0d70166f76d1cea8bfe757e7914bcec5a944.scope\": RecentStats: unable to find data in memory cache]" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.109771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" event={"ID":"e63630cc-8a7e-4c82-a69b-3babe4a33e43","Type":"ContainerStarted","Data":"82509bf1ab50929ddc715d7c46a7dac697c4c007fec020d2a38c870dfe0d48c8"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.109888 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" event={"ID":"e63630cc-8a7e-4c82-a69b-3babe4a33e43","Type":"ContainerStarted","Data":"3fc9caa01d916171d2a9ebe9ab6591e32e5ab9f9341daa38ec695144afe109ca"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.122348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e456659d-0497-4509-b998-ee4b5de724d9","Type":"ContainerDied","Data":"846a465caafaf697f57b70498e9d8c31aea9ef7f01267453cf36fce35d029aab"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.122398 4771 scope.go:117] "RemoveContainer" containerID="f004750bc4d1bf39d0aaab57fe1c4483dc09933c243421394310b3293f8fd926" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.122546 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.131984 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" podStartSLOduration=2.131967276 podStartE2EDuration="2.131967276s" podCreationTimestamp="2026-02-19 21:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:52.127405645 +0000 UTC m=+1292.398848105" watchObservedRunningTime="2026-02-19 21:49:52.131967276 +0000 UTC m=+1292.403409746" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.133561 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" event={"ID":"f68b419d-929c-4ea3-a95c-dd6a436248a3","Type":"ContainerStarted","Data":"71cc7d9442ba94c1444da4fb61fedc176b18a5f13ddee418fad83e7fc53e74bd"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.133606 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" event={"ID":"f68b419d-929c-4ea3-a95c-dd6a436248a3","Type":"ContainerStarted","Data":"36a156bc9ac40f6951ec533c83e5f4283964a051010a7344f532d633896a4d0e"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.152312 4771 generic.go:334] "Generic (PLEG): container finished" podID="989cc18d-2aca-4d05-a256-8908d0c5ac60" containerID="40144a8e451dbe1a572950c52be0b02a654d805e4c7b97aeb0e77cd2fef2d350" exitCode=0 Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.152381 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xgm28" event={"ID":"989cc18d-2aca-4d05-a256-8908d0c5ac60","Type":"ContainerDied","Data":"40144a8e451dbe1a572950c52be0b02a654d805e4c7b97aeb0e77cd2fef2d350"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.152539 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" podStartSLOduration=2.152520884 podStartE2EDuration="2.152520884s" podCreationTimestamp="2026-02-19 21:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:52.146307698 +0000 UTC m=+1292.417750168" watchObservedRunningTime="2026-02-19 21:49:52.152520884 +0000 UTC m=+1292.423963354" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.157871 4771 generic.go:334] "Generic (PLEG): container finished" podID="dcc226cf-f7e9-47c7-b553-b6645f504d4d" containerID="6c7e9318c5ddd5d08bdec6168a67f0d518ff8348344d2f64f96c844314ef2153" exitCode=0 Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.157942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qwjfh" event={"ID":"dcc226cf-f7e9-47c7-b553-b6645f504d4d","Type":"ContainerDied","Data":"6c7e9318c5ddd5d08bdec6168a67f0d518ff8348344d2f64f96c844314ef2153"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.157962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qwjfh" event={"ID":"dcc226cf-f7e9-47c7-b553-b6645f504d4d","Type":"ContainerStarted","Data":"6e4c3a1024630e1124335501d40b172e37fce8e50b4fba7b944c6abdb2464d14"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.159355 4771 generic.go:334] "Generic (PLEG): container finished" podID="8a3d5744-034c-43b6-851e-ffc1fb4eca48" containerID="91cf6aa4394a77413c6ac8c75e4e0d70166f76d1cea8bfe757e7914bcec5a944" exitCode=0 Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.159394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-71de-account-create-update-2djdv" event={"ID":"8a3d5744-034c-43b6-851e-ffc1fb4eca48","Type":"ContainerDied","Data":"91cf6aa4394a77413c6ac8c75e4e0d70166f76d1cea8bfe757e7914bcec5a944"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.159408 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-71de-account-create-update-2djdv" event={"ID":"8a3d5744-034c-43b6-851e-ffc1fb4eca48","Type":"ContainerStarted","Data":"d5b5915e53f4ecec848e887c0b166e94420eb8b4ca4480307fa5ecb1d0d5bd5b"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.160893 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c7696df7-bt9vd" event={"ID":"13181273-c19b-46fa-a087-da5695148b1f","Type":"ContainerStarted","Data":"4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.160915 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c7696df7-bt9vd" event={"ID":"13181273-c19b-46fa-a087-da5695148b1f","Type":"ContainerStarted","Data":"4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.161307 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.161450 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.163995 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8efa5dd0-e91b-456b-b295-e45608c03c36","Type":"ContainerStarted","Data":"4f6d8e90bd1f7c121485611e130e25fd49c1330210d07a0a974cd4cc5949a26e"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.168501 4771 generic.go:334] "Generic (PLEG): container finished" podID="480beb32-325d-4cdd-bda8-428a55bcf4d4" containerID="6459c1d0eae8011e51f373038f7426ac5702af28aa9315bf859efd56d18e4be8" exitCode=0 Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.168563 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zpprp" 
event={"ID":"480beb32-325d-4cdd-bda8-428a55bcf4d4","Type":"ContainerDied","Data":"6459c1d0eae8011e51f373038f7426ac5702af28aa9315bf859efd56d18e4be8"} Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.223583 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6c7696df7-bt9vd" podStartSLOduration=3.223562555 podStartE2EDuration="3.223562555s" podCreationTimestamp="2026-02-19 21:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:52.208849403 +0000 UTC m=+1292.480291893" watchObservedRunningTime="2026-02-19 21:49:52.223562555 +0000 UTC m=+1292.495005025" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.254233 4771 scope.go:117] "RemoveContainer" containerID="b97d6b46293a12fd9d5002169110b0ce2c18eb6532ec0575517600274e33c5ae" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.288084 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.338101 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.355750 4771 scope.go:117] "RemoveContainer" containerID="489543f4c2581d9007a1cecc6d4a40185759151af2fd2cbb5dee52c8d835eb1b" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.355876 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:52 crc kubenswrapper[4771]: E0219 21:49:52.356240 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="sg-core" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.356255 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="sg-core" Feb 19 21:49:52 crc kubenswrapper[4771]: E0219 21:49:52.356277 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="ceilometer-central-agent" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.356285 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="ceilometer-central-agent" Feb 19 21:49:52 crc kubenswrapper[4771]: E0219 21:49:52.356311 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="ceilometer-notification-agent" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.356317 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="ceilometer-notification-agent" Feb 19 21:49:52 crc kubenswrapper[4771]: E0219 21:49:52.356327 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="proxy-httpd" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.356333 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="proxy-httpd" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.356485 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="ceilometer-notification-agent" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.356500 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="ceilometer-central-agent" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.356515 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="proxy-httpd" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.356529 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e456659d-0497-4509-b998-ee4b5de724d9" containerName="sg-core" Feb 19 21:49:52 crc 
kubenswrapper[4771]: I0219 21:49:52.357996 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.362926 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.363110 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.423989 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.480852 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-log-httpd\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.480910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9ttg\" (UniqueName: \"kubernetes.io/projected/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-kube-api-access-v9ttg\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.480935 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.480976 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-run-httpd\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.481228 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-scripts\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.481686 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.482532 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-config-data\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.496189 4771 scope.go:117] "RemoveContainer" containerID="797328426e0aa2206ab417aa2383a42589dfff89b0e57599eb711c18a60a46ab" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.498861 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e456659d-0497-4509-b998-ee4b5de724d9" path="/var/lib/kubelet/pods/e456659d-0497-4509-b998-ee4b5de724d9/volumes" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.586783 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-config-data\") pod \"ceilometer-0\" (UID: 
\"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.586873 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-log-httpd\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.586895 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9ttg\" (UniqueName: \"kubernetes.io/projected/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-kube-api-access-v9ttg\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.586912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.586932 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-run-httpd\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.586956 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-scripts\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.586988 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.590706 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-run-httpd\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.591431 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-log-httpd\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.594459 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-scripts\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.601682 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.602059 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-config-data\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.606624 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9ttg\" (UniqueName: \"kubernetes.io/projected/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-kube-api-access-v9ttg\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.607281 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " pod="openstack/ceilometer-0" Feb 19 21:49:52 crc kubenswrapper[4771]: I0219 21:49:52.734806 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.183863 4771 generic.go:334] "Generic (PLEG): container finished" podID="f68b419d-929c-4ea3-a95c-dd6a436248a3" containerID="71cc7d9442ba94c1444da4fb61fedc176b18a5f13ddee418fad83e7fc53e74bd" exitCode=0 Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.183934 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" event={"ID":"f68b419d-929c-4ea3-a95c-dd6a436248a3","Type":"ContainerDied","Data":"71cc7d9442ba94c1444da4fb61fedc176b18a5f13ddee418fad83e7fc53e74bd"} Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.185920 4771 generic.go:334] "Generic (PLEG): container finished" podID="e63630cc-8a7e-4c82-a69b-3babe4a33e43" containerID="82509bf1ab50929ddc715d7c46a7dac697c4c007fec020d2a38c870dfe0d48c8" exitCode=0 Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.187215 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" event={"ID":"e63630cc-8a7e-4c82-a69b-3babe4a33e43","Type":"ContainerDied","Data":"82509bf1ab50929ddc715d7c46a7dac697c4c007fec020d2a38c870dfe0d48c8"} Feb 19 21:49:53 
crc kubenswrapper[4771]: I0219 21:49:53.264686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.574208 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.712974 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/989cc18d-2aca-4d05-a256-8908d0c5ac60-operator-scripts\") pod \"989cc18d-2aca-4d05-a256-8908d0c5ac60\" (UID: \"989cc18d-2aca-4d05-a256-8908d0c5ac60\") " Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.713244 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95lp6\" (UniqueName: \"kubernetes.io/projected/989cc18d-2aca-4d05-a256-8908d0c5ac60-kube-api-access-95lp6\") pod \"989cc18d-2aca-4d05-a256-8908d0c5ac60\" (UID: \"989cc18d-2aca-4d05-a256-8908d0c5ac60\") " Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.713522 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/989cc18d-2aca-4d05-a256-8908d0c5ac60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "989cc18d-2aca-4d05-a256-8908d0c5ac60" (UID: "989cc18d-2aca-4d05-a256-8908d0c5ac60"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.714202 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/989cc18d-2aca-4d05-a256-8908d0c5ac60-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.720096 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989cc18d-2aca-4d05-a256-8908d0c5ac60-kube-api-access-95lp6" (OuterVolumeSpecName: "kube-api-access-95lp6") pod "989cc18d-2aca-4d05-a256-8908d0c5ac60" (UID: "989cc18d-2aca-4d05-a256-8908d0c5ac60"). InnerVolumeSpecName "kube-api-access-95lp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.815821 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95lp6\" (UniqueName: \"kubernetes.io/projected/989cc18d-2aca-4d05-a256-8908d0c5ac60-kube-api-access-95lp6\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.855542 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.863776 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:53 crc kubenswrapper[4771]: I0219 21:49:53.900481 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.019250 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2sln\" (UniqueName: \"kubernetes.io/projected/8a3d5744-034c-43b6-851e-ffc1fb4eca48-kube-api-access-g2sln\") pod \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\" (UID: \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.019327 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a3d5744-034c-43b6-851e-ffc1fb4eca48-operator-scripts\") pod \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\" (UID: \"8a3d5744-034c-43b6-851e-ffc1fb4eca48\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.019436 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc226cf-f7e9-47c7-b553-b6645f504d4d-operator-scripts\") pod \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\" (UID: \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.019507 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/480beb32-325d-4cdd-bda8-428a55bcf4d4-operator-scripts\") pod \"480beb32-325d-4cdd-bda8-428a55bcf4d4\" (UID: \"480beb32-325d-4cdd-bda8-428a55bcf4d4\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.019540 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx4pq\" (UniqueName: \"kubernetes.io/projected/dcc226cf-f7e9-47c7-b553-b6645f504d4d-kube-api-access-vx4pq\") pod \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\" (UID: \"dcc226cf-f7e9-47c7-b553-b6645f504d4d\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.019680 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-69mgp\" (UniqueName: \"kubernetes.io/projected/480beb32-325d-4cdd-bda8-428a55bcf4d4-kube-api-access-69mgp\") pod \"480beb32-325d-4cdd-bda8-428a55bcf4d4\" (UID: \"480beb32-325d-4cdd-bda8-428a55bcf4d4\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.021182 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc226cf-f7e9-47c7-b553-b6645f504d4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcc226cf-f7e9-47c7-b553-b6645f504d4d" (UID: "dcc226cf-f7e9-47c7-b553-b6645f504d4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.022569 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3d5744-034c-43b6-851e-ffc1fb4eca48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a3d5744-034c-43b6-851e-ffc1fb4eca48" (UID: "8a3d5744-034c-43b6-851e-ffc1fb4eca48"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.022919 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480beb32-325d-4cdd-bda8-428a55bcf4d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "480beb32-325d-4cdd-bda8-428a55bcf4d4" (UID: "480beb32-325d-4cdd-bda8-428a55bcf4d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.025653 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480beb32-325d-4cdd-bda8-428a55bcf4d4-kube-api-access-69mgp" (OuterVolumeSpecName: "kube-api-access-69mgp") pod "480beb32-325d-4cdd-bda8-428a55bcf4d4" (UID: "480beb32-325d-4cdd-bda8-428a55bcf4d4"). 
InnerVolumeSpecName "kube-api-access-69mgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.028258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3d5744-034c-43b6-851e-ffc1fb4eca48-kube-api-access-g2sln" (OuterVolumeSpecName: "kube-api-access-g2sln") pod "8a3d5744-034c-43b6-851e-ffc1fb4eca48" (UID: "8a3d5744-034c-43b6-851e-ffc1fb4eca48"). InnerVolumeSpecName "kube-api-access-g2sln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.043704 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc226cf-f7e9-47c7-b553-b6645f504d4d-kube-api-access-vx4pq" (OuterVolumeSpecName: "kube-api-access-vx4pq") pod "dcc226cf-f7e9-47c7-b553-b6645f504d4d" (UID: "dcc226cf-f7e9-47c7-b553-b6645f504d4d"). InnerVolumeSpecName "kube-api-access-vx4pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.122460 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2sln\" (UniqueName: \"kubernetes.io/projected/8a3d5744-034c-43b6-851e-ffc1fb4eca48-kube-api-access-g2sln\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.122493 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a3d5744-034c-43b6-851e-ffc1fb4eca48-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.122505 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc226cf-f7e9-47c7-b553-b6645f504d4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.122518 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/480beb32-325d-4cdd-bda8-428a55bcf4d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.122529 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx4pq\" (UniqueName: \"kubernetes.io/projected/dcc226cf-f7e9-47c7-b553-b6645f504d4d-kube-api-access-vx4pq\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.122542 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69mgp\" (UniqueName: \"kubernetes.io/projected/480beb32-325d-4cdd-bda8-428a55bcf4d4-kube-api-access-69mgp\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.204689 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zpprp" event={"ID":"480beb32-325d-4cdd-bda8-428a55bcf4d4","Type":"ContainerDied","Data":"8b6ba713c482aee8fe00fb7dbda184ae3fb77f90f44659a18537773d2fb6a393"} Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.204719 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zpprp" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.204731 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b6ba713c482aee8fe00fb7dbda184ae3fb77f90f44659a18537773d2fb6a393" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.206273 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xgm28" event={"ID":"989cc18d-2aca-4d05-a256-8908d0c5ac60","Type":"ContainerDied","Data":"0a75c774acf0bd8c3048724e044fb5b70968357a9b45d80d4f94c5a797b88577"} Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.206295 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a75c774acf0bd8c3048724e044fb5b70968357a9b45d80d4f94c5a797b88577" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.206339 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xgm28" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.212033 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerStarted","Data":"2976085c0a9901ab236c8e293ea95d3467ba07a24308a85fb2a3e0840818206e"} Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.212075 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerStarted","Data":"bdfb790dcfe3fef4d576ddd73ffeaba795fd728c94a2ee095e54027f80fb7c42"} Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.213475 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qwjfh" event={"ID":"dcc226cf-f7e9-47c7-b553-b6645f504d4d","Type":"ContainerDied","Data":"6e4c3a1024630e1124335501d40b172e37fce8e50b4fba7b944c6abdb2464d14"} Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.213504 4771 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e4c3a1024630e1124335501d40b172e37fce8e50b4fba7b944c6abdb2464d14" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.213504 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qwjfh" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.215992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-71de-account-create-update-2djdv" event={"ID":"8a3d5744-034c-43b6-851e-ffc1fb4eca48","Type":"ContainerDied","Data":"d5b5915e53f4ecec848e887c0b166e94420eb8b4ca4480307fa5ecb1d0d5bd5b"} Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.216196 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5b5915e53f4ecec848e887c0b166e94420eb8b4ca4480307fa5ecb1d0d5bd5b" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.216167 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-71de-account-create-update-2djdv" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.470420 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.566602 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.613570 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.632920 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68b419d-929c-4ea3-a95c-dd6a436248a3-operator-scripts\") pod \"f68b419d-929c-4ea3-a95c-dd6a436248a3\" (UID: \"f68b419d-929c-4ea3-a95c-dd6a436248a3\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.633258 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsq7j\" (UniqueName: \"kubernetes.io/projected/e63630cc-8a7e-4c82-a69b-3babe4a33e43-kube-api-access-hsq7j\") pod \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\" (UID: \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.633293 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d98gt\" (UniqueName: \"kubernetes.io/projected/f68b419d-929c-4ea3-a95c-dd6a436248a3-kube-api-access-d98gt\") pod \"f68b419d-929c-4ea3-a95c-dd6a436248a3\" (UID: \"f68b419d-929c-4ea3-a95c-dd6a436248a3\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.633342 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63630cc-8a7e-4c82-a69b-3babe4a33e43-operator-scripts\") pod \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\" (UID: \"e63630cc-8a7e-4c82-a69b-3babe4a33e43\") " Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.633828 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68b419d-929c-4ea3-a95c-dd6a436248a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f68b419d-929c-4ea3-a95c-dd6a436248a3" (UID: "f68b419d-929c-4ea3-a95c-dd6a436248a3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.633917 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63630cc-8a7e-4c82-a69b-3babe4a33e43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e63630cc-8a7e-4c82-a69b-3babe4a33e43" (UID: "e63630cc-8a7e-4c82-a69b-3babe4a33e43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.634099 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f68b419d-929c-4ea3-a95c-dd6a436248a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.639300 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68b419d-929c-4ea3-a95c-dd6a436248a3-kube-api-access-d98gt" (OuterVolumeSpecName: "kube-api-access-d98gt") pod "f68b419d-929c-4ea3-a95c-dd6a436248a3" (UID: "f68b419d-929c-4ea3-a95c-dd6a436248a3"). InnerVolumeSpecName "kube-api-access-d98gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.639370 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63630cc-8a7e-4c82-a69b-3babe4a33e43-kube-api-access-hsq7j" (OuterVolumeSpecName: "kube-api-access-hsq7j") pod "e63630cc-8a7e-4c82-a69b-3babe4a33e43" (UID: "e63630cc-8a7e-4c82-a69b-3babe4a33e43"). InnerVolumeSpecName "kube-api-access-hsq7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.734854 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsq7j\" (UniqueName: \"kubernetes.io/projected/e63630cc-8a7e-4c82-a69b-3babe4a33e43-kube-api-access-hsq7j\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.734889 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d98gt\" (UniqueName: \"kubernetes.io/projected/f68b419d-929c-4ea3-a95c-dd6a436248a3-kube-api-access-d98gt\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:54 crc kubenswrapper[4771]: I0219 21:49:54.734900 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63630cc-8a7e-4c82-a69b-3babe4a33e43-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:55 crc kubenswrapper[4771]: I0219 21:49:55.224695 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" event={"ID":"f68b419d-929c-4ea3-a95c-dd6a436248a3","Type":"ContainerDied","Data":"36a156bc9ac40f6951ec533c83e5f4283964a051010a7344f532d633896a4d0e"} Feb 19 21:49:55 crc kubenswrapper[4771]: I0219 21:49:55.225041 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-70bf-account-create-update-jdsz9" Feb 19 21:49:55 crc kubenswrapper[4771]: I0219 21:49:55.225067 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36a156bc9ac40f6951ec533c83e5f4283964a051010a7344f532d633896a4d0e" Feb 19 21:49:55 crc kubenswrapper[4771]: I0219 21:49:55.226943 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerStarted","Data":"3bc9ee82370b7d43ac92c2ccc19ef7adb6d4f482a45e4d45c2d41ab33459a0f7"} Feb 19 21:49:55 crc kubenswrapper[4771]: I0219 21:49:55.227041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerStarted","Data":"bed5675c06a13ba4658feab34e1ebe6d3f010abae04a8f01aeb5d2cc0059fcf1"} Feb 19 21:49:55 crc kubenswrapper[4771]: I0219 21:49:55.228298 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" event={"ID":"e63630cc-8a7e-4c82-a69b-3babe4a33e43","Type":"ContainerDied","Data":"3fc9caa01d916171d2a9ebe9ab6591e32e5ab9f9341daa38ec695144afe109ca"} Feb 19 21:49:55 crc kubenswrapper[4771]: I0219 21:49:55.228315 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fc9caa01d916171d2a9ebe9ab6591e32e5ab9f9341daa38ec695144afe109ca" Feb 19 21:49:55 crc kubenswrapper[4771]: I0219 21:49:55.228371 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c5fb-account-create-update-2zlx2" Feb 19 21:49:57 crc kubenswrapper[4771]: I0219 21:49:57.255488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerStarted","Data":"fc705f1e614abc0d148ca4ef70a8053dd124f77ea8fdaa8709214e39ebc2b49f"} Feb 19 21:49:57 crc kubenswrapper[4771]: I0219 21:49:57.256082 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:49:57 crc kubenswrapper[4771]: I0219 21:49:57.280790 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.319251585 podStartE2EDuration="5.280772729s" podCreationTimestamp="2026-02-19 21:49:52 +0000 UTC" firstStartedPulling="2026-02-19 21:49:53.289134056 +0000 UTC m=+1293.560576526" lastFinishedPulling="2026-02-19 21:49:56.2506552 +0000 UTC m=+1296.522097670" observedRunningTime="2026-02-19 21:49:57.280138283 +0000 UTC m=+1297.551580753" watchObservedRunningTime="2026-02-19 21:49:57.280772729 +0000 UTC m=+1297.552215199" Feb 19 21:49:59 crc kubenswrapper[4771]: I0219 21:49:59.166183 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:59 crc kubenswrapper[4771]: I0219 21:49:59.273496 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="sg-core" containerID="cri-o://3bc9ee82370b7d43ac92c2ccc19ef7adb6d4f482a45e4d45c2d41ab33459a0f7" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4771]: I0219 21:49:59.273458 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="ceilometer-central-agent" containerID="cri-o://2976085c0a9901ab236c8e293ea95d3467ba07a24308a85fb2a3e0840818206e" gracePeriod=30 Feb 
19 21:49:59 crc kubenswrapper[4771]: I0219 21:49:59.273508 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="proxy-httpd" containerID="cri-o://fc705f1e614abc0d148ca4ef70a8053dd124f77ea8fdaa8709214e39ebc2b49f" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4771]: I0219 21:49:59.273508 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="ceilometer-notification-agent" containerID="cri-o://bed5675c06a13ba4658feab34e1ebe6d3f010abae04a8f01aeb5d2cc0059fcf1" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4771]: I0219 21:49:59.878067 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:49:59 crc kubenswrapper[4771]: I0219 21:49:59.878410 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.284474 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerID="fc705f1e614abc0d148ca4ef70a8053dd124f77ea8fdaa8709214e39ebc2b49f" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.284511 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerID="3bc9ee82370b7d43ac92c2ccc19ef7adb6d4f482a45e4d45c2d41ab33459a0f7" exitCode=2 Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.284521 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerID="bed5675c06a13ba4658feab34e1ebe6d3f010abae04a8f01aeb5d2cc0059fcf1" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.284531 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" 
containerID="2976085c0a9901ab236c8e293ea95d3467ba07a24308a85fb2a3e0840818206e" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.284555 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerDied","Data":"fc705f1e614abc0d148ca4ef70a8053dd124f77ea8fdaa8709214e39ebc2b49f"} Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.284584 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerDied","Data":"3bc9ee82370b7d43ac92c2ccc19ef7adb6d4f482a45e4d45c2d41ab33459a0f7"} Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.284597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerDied","Data":"bed5675c06a13ba4658feab34e1ebe6d3f010abae04a8f01aeb5d2cc0059fcf1"} Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.284607 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerDied","Data":"2976085c0a9901ab236c8e293ea95d3467ba07a24308a85fb2a3e0840818206e"} Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.305835 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rl7qm"] Feb 19 21:50:00 crc kubenswrapper[4771]: E0219 21:50:00.307500 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480beb32-325d-4cdd-bda8-428a55bcf4d4" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307519 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="480beb32-325d-4cdd-bda8-428a55bcf4d4" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: E0219 21:50:00.307540 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8a3d5744-034c-43b6-851e-ffc1fb4eca48" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307546 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3d5744-034c-43b6-851e-ffc1fb4eca48" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc kubenswrapper[4771]: E0219 21:50:00.307569 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63630cc-8a7e-4c82-a69b-3babe4a33e43" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307574 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63630cc-8a7e-4c82-a69b-3babe4a33e43" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc kubenswrapper[4771]: E0219 21:50:00.307582 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc226cf-f7e9-47c7-b553-b6645f504d4d" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307588 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc226cf-f7e9-47c7-b553-b6645f504d4d" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: E0219 21:50:00.307598 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989cc18d-2aca-4d05-a256-8908d0c5ac60" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307604 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="989cc18d-2aca-4d05-a256-8908d0c5ac60" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: E0219 21:50:00.307617 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68b419d-929c-4ea3-a95c-dd6a436248a3" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307622 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68b419d-929c-4ea3-a95c-dd6a436248a3" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc 
kubenswrapper[4771]: I0219 21:50:00.307775 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3d5744-034c-43b6-851e-ffc1fb4eca48" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307783 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="989cc18d-2aca-4d05-a256-8908d0c5ac60" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307791 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc226cf-f7e9-47c7-b553-b6645f504d4d" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307803 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63630cc-8a7e-4c82-a69b-3babe4a33e43" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307812 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="480beb32-325d-4cdd-bda8-428a55bcf4d4" containerName="mariadb-database-create" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.307822 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68b419d-929c-4ea3-a95c-dd6a436248a3" containerName="mariadb-account-create-update" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.308383 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.311668 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.311752 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.312092 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5thxh" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.317151 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rl7qm"] Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.441941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.442123 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24td\" (UniqueName: \"kubernetes.io/projected/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-kube-api-access-j24td\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.442155 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-config-data\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " 
pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.442192 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-scripts\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.543983 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j24td\" (UniqueName: \"kubernetes.io/projected/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-kube-api-access-j24td\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.544032 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-config-data\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.544070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-scripts\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.544127 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " 
pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.550069 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.551753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-config-data\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.552073 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-scripts\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.560695 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j24td\" (UniqueName: \"kubernetes.io/projected/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-kube-api-access-j24td\") pod \"nova-cell0-conductor-db-sync-rl7qm\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:00 crc kubenswrapper[4771]: I0219 21:50:00.630082 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.311866 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.321864 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d6dcaf3-6abb-4a6d-b006-f53dfe753121","Type":"ContainerDied","Data":"bdfb790dcfe3fef4d576ddd73ffeaba795fd728c94a2ee095e54027f80fb7c42"} Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.321911 4771 scope.go:117] "RemoveContainer" containerID="fc705f1e614abc0d148ca4ef70a8053dd124f77ea8fdaa8709214e39ebc2b49f" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.322075 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.346979 4771 scope.go:117] "RemoveContainer" containerID="3bc9ee82370b7d43ac92c2ccc19ef7adb6d4f482a45e4d45c2d41ab33459a0f7" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.376820 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.470345316 podStartE2EDuration="14.376804049s" podCreationTimestamp="2026-02-19 21:49:49 +0000 UTC" firstStartedPulling="2026-02-19 21:49:51.170626929 +0000 UTC m=+1291.442069399" lastFinishedPulling="2026-02-19 21:50:03.077085662 +0000 UTC m=+1303.348528132" observedRunningTime="2026-02-19 21:50:03.370203154 +0000 UTC m=+1303.641645624" watchObservedRunningTime="2026-02-19 21:50:03.376804049 +0000 UTC m=+1303.648246519" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.385540 4771 scope.go:117] "RemoveContainer" containerID="bed5675c06a13ba4658feab34e1ebe6d3f010abae04a8f01aeb5d2cc0059fcf1" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.393659 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9ttg\" (UniqueName: \"kubernetes.io/projected/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-kube-api-access-v9ttg\") pod \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\" (UID: 
\"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.393694 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-sg-core-conf-yaml\") pod \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.393713 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-config-data\") pod \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.393747 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-run-httpd\") pod \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.393778 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-log-httpd\") pod \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.393830 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-scripts\") pod \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.393885 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-combined-ca-bundle\") pod \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.397160 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0d6dcaf3-6abb-4a6d-b006-f53dfe753121" (UID: "0d6dcaf3-6abb-4a6d-b006-f53dfe753121"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.397175 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0d6dcaf3-6abb-4a6d-b006-f53dfe753121" (UID: "0d6dcaf3-6abb-4a6d-b006-f53dfe753121"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.401862 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-scripts" (OuterVolumeSpecName: "scripts") pod "0d6dcaf3-6abb-4a6d-b006-f53dfe753121" (UID: "0d6dcaf3-6abb-4a6d-b006-f53dfe753121"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.404097 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-kube-api-access-v9ttg" (OuterVolumeSpecName: "kube-api-access-v9ttg") pod "0d6dcaf3-6abb-4a6d-b006-f53dfe753121" (UID: "0d6dcaf3-6abb-4a6d-b006-f53dfe753121"). InnerVolumeSpecName "kube-api-access-v9ttg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.407936 4771 scope.go:117] "RemoveContainer" containerID="2976085c0a9901ab236c8e293ea95d3467ba07a24308a85fb2a3e0840818206e" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.425919 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0d6dcaf3-6abb-4a6d-b006-f53dfe753121" (UID: "0d6dcaf3-6abb-4a6d-b006-f53dfe753121"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.468187 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d6dcaf3-6abb-4a6d-b006-f53dfe753121" (UID: "0d6dcaf3-6abb-4a6d-b006-f53dfe753121"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.494956 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-config-data" (OuterVolumeSpecName: "config-data") pod "0d6dcaf3-6abb-4a6d-b006-f53dfe753121" (UID: "0d6dcaf3-6abb-4a6d-b006-f53dfe753121"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.495687 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-config-data\") pod \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\" (UID: \"0d6dcaf3-6abb-4a6d-b006-f53dfe753121\") " Feb 19 21:50:03 crc kubenswrapper[4771]: W0219 21:50:03.495776 4771 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0d6dcaf3-6abb-4a6d-b006-f53dfe753121/volumes/kubernetes.io~secret/config-data Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.495796 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-config-data" (OuterVolumeSpecName: "config-data") pod "0d6dcaf3-6abb-4a6d-b006-f53dfe753121" (UID: "0d6dcaf3-6abb-4a6d-b006-f53dfe753121"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.496450 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.496470 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.496478 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.496489 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9ttg\" (UniqueName: \"kubernetes.io/projected/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-kube-api-access-v9ttg\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.496498 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.496506 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.496514 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d6dcaf3-6abb-4a6d-b006-f53dfe753121-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4771]: W0219 21:50:03.498618 4771 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2cfedeb_5cb5_4fc7_9bcf_eedd63c5bd02.slice/crio-82876872656d7c8e6760d9625d2ac75dcc364b82135514d874c849b40abef29f WatchSource:0}: Error finding container 82876872656d7c8e6760d9625d2ac75dcc364b82135514d874c849b40abef29f: Status 404 returned error can't find the container with id 82876872656d7c8e6760d9625d2ac75dcc364b82135514d874c849b40abef29f Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.502370 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rl7qm"] Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.659555 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.674824 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.682944 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:03 crc kubenswrapper[4771]: E0219 21:50:03.683296 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="ceilometer-notification-agent" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.683315 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="ceilometer-notification-agent" Feb 19 21:50:03 crc kubenswrapper[4771]: E0219 21:50:03.683332 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="sg-core" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.683338 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="sg-core" Feb 19 21:50:03 crc kubenswrapper[4771]: E0219 21:50:03.683362 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="ceilometer-central-agent" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.683368 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="ceilometer-central-agent" Feb 19 21:50:03 crc kubenswrapper[4771]: E0219 21:50:03.683381 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="proxy-httpd" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.683388 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="proxy-httpd" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.683534 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="ceilometer-central-agent" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.683547 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="sg-core" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.683558 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="proxy-httpd" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.683574 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" containerName="ceilometer-notification-agent" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.684998 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.688673 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.688693 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.699946 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.700194 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7s2l\" (UniqueName: \"kubernetes.io/projected/f7a38a20-14af-4bb6-837b-ff8d963a23db-kube-api-access-r7s2l\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.708364 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-log-httpd\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.708528 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-scripts\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.708622 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-run-httpd\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.708815 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.708845 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-config-data\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.767193 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.810144 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-run-httpd\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.810234 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.810268 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-config-data\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.810301 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.810341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7s2l\" (UniqueName: \"kubernetes.io/projected/f7a38a20-14af-4bb6-837b-ff8d963a23db-kube-api-access-r7s2l\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.810376 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-log-httpd\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.810428 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-scripts\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.811413 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-log-httpd\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " 
pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.811467 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-run-httpd\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.814176 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.814776 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.815932 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-scripts\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.816526 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-config-data\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.826477 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7s2l\" (UniqueName: 
\"kubernetes.io/projected/f7a38a20-14af-4bb6-837b-ff8d963a23db-kube-api-access-r7s2l\") pod \"ceilometer-0\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " pod="openstack/ceilometer-0" Feb 19 21:50:03 crc kubenswrapper[4771]: I0219 21:50:03.998895 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:04 crc kubenswrapper[4771]: I0219 21:50:04.343934 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:04 crc kubenswrapper[4771]: I0219 21:50:04.365303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8efa5dd0-e91b-456b-b295-e45608c03c36","Type":"ContainerStarted","Data":"da66b35c116d28899b782ad55eca69eeed804e2fa899b61d3bf29d45b676830f"} Feb 19 21:50:04 crc kubenswrapper[4771]: I0219 21:50:04.367680 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" event={"ID":"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02","Type":"ContainerStarted","Data":"82876872656d7c8e6760d9625d2ac75dcc364b82135514d874c849b40abef29f"} Feb 19 21:50:04 crc kubenswrapper[4771]: I0219 21:50:04.434721 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:04 crc kubenswrapper[4771]: I0219 21:50:04.446980 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d6dcaf3-6abb-4a6d-b006-f53dfe753121" path="/var/lib/kubelet/pods/0d6dcaf3-6abb-4a6d-b006-f53dfe753121/volumes" Feb 19 21:50:04 crc kubenswrapper[4771]: W0219 21:50:04.449960 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7a38a20_14af_4bb6_837b_ff8d963a23db.slice/crio-9c2352c4685f5630245538848dc98e41f83addb66901dd74f723837b09aa52de WatchSource:0}: Error finding container 9c2352c4685f5630245538848dc98e41f83addb66901dd74f723837b09aa52de: Status 404 returned error can't find the container with 
id 9c2352c4685f5630245538848dc98e41f83addb66901dd74f723837b09aa52de Feb 19 21:50:05 crc kubenswrapper[4771]: I0219 21:50:05.382411 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerStarted","Data":"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d"} Feb 19 21:50:05 crc kubenswrapper[4771]: I0219 21:50:05.382666 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerStarted","Data":"9c2352c4685f5630245538848dc98e41f83addb66901dd74f723837b09aa52de"} Feb 19 21:50:06 crc kubenswrapper[4771]: I0219 21:50:06.391047 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerStarted","Data":"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60"} Feb 19 21:50:07 crc kubenswrapper[4771]: I0219 21:50:07.402092 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerStarted","Data":"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa"} Feb 19 21:50:09 crc kubenswrapper[4771]: I0219 21:50:09.985607 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:50:10 crc kubenswrapper[4771]: I0219 21:50:10.053141 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bd457cb8d-5vbsm"] Feb 19 21:50:10 crc kubenswrapper[4771]: I0219 21:50:10.053351 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bd457cb8d-5vbsm" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerName="neutron-api" containerID="cri-o://81928f25ef40f7c96ea048cc2dcbe4ec0c56577d24a6353cac6b73cbbf4e6ed5" gracePeriod=30 Feb 19 21:50:10 crc kubenswrapper[4771]: 
I0219 21:50:10.053687 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bd457cb8d-5vbsm" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerName="neutron-httpd" containerID="cri-o://56ec40b5a72abb5cddcb68f7c06683a0c85a8037fde988353754a66196601dda" gracePeriod=30 Feb 19 21:50:10 crc kubenswrapper[4771]: I0219 21:50:10.427245 4771 generic.go:334] "Generic (PLEG): container finished" podID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerID="56ec40b5a72abb5cddcb68f7c06683a0c85a8037fde988353754a66196601dda" exitCode=0 Feb 19 21:50:10 crc kubenswrapper[4771]: I0219 21:50:10.427461 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd457cb8d-5vbsm" event={"ID":"fcb1436d-c34f-4211-b6f9-72a78630334d","Type":"ContainerDied","Data":"56ec40b5a72abb5cddcb68f7c06683a0c85a8037fde988353754a66196601dda"} Feb 19 21:50:11 crc kubenswrapper[4771]: I0219 21:50:11.104620 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:11 crc kubenswrapper[4771]: I0219 21:50:11.105096 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerName="glance-log" containerID="cri-o://3bd7ebdb5d9120c17dd8d1df15d7fd4beb48ccd3ef7ec08acaf67e67d0447b11" gracePeriod=30 Feb 19 21:50:11 crc kubenswrapper[4771]: I0219 21:50:11.105205 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerName="glance-httpd" containerID="cri-o://cf44c0f0303ceeed0b2c620f4aa0d0b8e0d088bfb5d009114b9968c51752a1a7" gracePeriod=30 Feb 19 21:50:11 crc kubenswrapper[4771]: I0219 21:50:11.444618 4771 generic.go:334] "Generic (PLEG): container finished" podID="20b982fd-3b6b-487a-9ec1-956360ee92d9" 
containerID="3bd7ebdb5d9120c17dd8d1df15d7fd4beb48ccd3ef7ec08acaf67e67d0447b11" exitCode=143 Feb 19 21:50:11 crc kubenswrapper[4771]: I0219 21:50:11.444787 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"20b982fd-3b6b-487a-9ec1-956360ee92d9","Type":"ContainerDied","Data":"3bd7ebdb5d9120c17dd8d1df15d7fd4beb48ccd3ef7ec08acaf67e67d0447b11"} Feb 19 21:50:11 crc kubenswrapper[4771]: I0219 21:50:11.796392 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:11 crc kubenswrapper[4771]: I0219 21:50:11.796852 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerName="glance-log" containerID="cri-o://f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756" gracePeriod=30 Feb 19 21:50:11 crc kubenswrapper[4771]: I0219 21:50:11.796979 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerName="glance-httpd" containerID="cri-o://b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb" gracePeriod=30 Feb 19 21:50:12 crc kubenswrapper[4771]: I0219 21:50:12.453238 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" event={"ID":"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02","Type":"ContainerStarted","Data":"70ccbec8c9a37573cbcbf07ef46b413724db3b6ad9381cf0ff63195ae6ffb18a"} Feb 19 21:50:12 crc kubenswrapper[4771]: I0219 21:50:12.455031 4771 generic.go:334] "Generic (PLEG): container finished" podID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerID="f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756" exitCode=143 Feb 19 21:50:12 crc kubenswrapper[4771]: I0219 21:50:12.455099 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"01893f8d-263f-4c95-a1f9-864e9e655ee8","Type":"ContainerDied","Data":"f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756"} Feb 19 21:50:12 crc kubenswrapper[4771]: I0219 21:50:12.472607 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" podStartSLOduration=4.271248976 podStartE2EDuration="12.47259273s" podCreationTimestamp="2026-02-19 21:50:00 +0000 UTC" firstStartedPulling="2026-02-19 21:50:03.500612127 +0000 UTC m=+1303.772054597" lastFinishedPulling="2026-02-19 21:50:11.701955881 +0000 UTC m=+1311.973398351" observedRunningTime="2026-02-19 21:50:12.469373293 +0000 UTC m=+1312.740815763" watchObservedRunningTime="2026-02-19 21:50:12.47259273 +0000 UTC m=+1312.744035200" Feb 19 21:50:13 crc kubenswrapper[4771]: I0219 21:50:13.465897 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="ceilometer-central-agent" containerID="cri-o://dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d" gracePeriod=30 Feb 19 21:50:13 crc kubenswrapper[4771]: I0219 21:50:13.466292 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerStarted","Data":"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3"} Feb 19 21:50:13 crc kubenswrapper[4771]: I0219 21:50:13.466357 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:50:13 crc kubenswrapper[4771]: I0219 21:50:13.466447 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="proxy-httpd" containerID="cri-o://759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3" gracePeriod=30 Feb 19 
21:50:13 crc kubenswrapper[4771]: I0219 21:50:13.466486 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="sg-core" containerID="cri-o://ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa" gracePeriod=30 Feb 19 21:50:13 crc kubenswrapper[4771]: I0219 21:50:13.466465 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="ceilometer-notification-agent" containerID="cri-o://e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60" gracePeriod=30 Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.203818 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.330646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-scripts\") pod \"f7a38a20-14af-4bb6-837b-ff8d963a23db\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.330760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-log-httpd\") pod \"f7a38a20-14af-4bb6-837b-ff8d963a23db\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.330826 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-sg-core-conf-yaml\") pod \"f7a38a20-14af-4bb6-837b-ff8d963a23db\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.330868 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-run-httpd\") pod \"f7a38a20-14af-4bb6-837b-ff8d963a23db\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.330907 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7s2l\" (UniqueName: \"kubernetes.io/projected/f7a38a20-14af-4bb6-837b-ff8d963a23db-kube-api-access-r7s2l\") pod \"f7a38a20-14af-4bb6-837b-ff8d963a23db\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.330942 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-config-data\") pod \"f7a38a20-14af-4bb6-837b-ff8d963a23db\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.331063 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-combined-ca-bundle\") pod \"f7a38a20-14af-4bb6-837b-ff8d963a23db\" (UID: \"f7a38a20-14af-4bb6-837b-ff8d963a23db\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.331353 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f7a38a20-14af-4bb6-837b-ff8d963a23db" (UID: "f7a38a20-14af-4bb6-837b-ff8d963a23db"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.331606 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.331724 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f7a38a20-14af-4bb6-837b-ff8d963a23db" (UID: "f7a38a20-14af-4bb6-837b-ff8d963a23db"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.339662 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a38a20-14af-4bb6-837b-ff8d963a23db-kube-api-access-r7s2l" (OuterVolumeSpecName: "kube-api-access-r7s2l") pod "f7a38a20-14af-4bb6-837b-ff8d963a23db" (UID: "f7a38a20-14af-4bb6-837b-ff8d963a23db"). InnerVolumeSpecName "kube-api-access-r7s2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.345365 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-scripts" (OuterVolumeSpecName: "scripts") pod "f7a38a20-14af-4bb6-837b-ff8d963a23db" (UID: "f7a38a20-14af-4bb6-837b-ff8d963a23db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.370275 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f7a38a20-14af-4bb6-837b-ff8d963a23db" (UID: "f7a38a20-14af-4bb6-837b-ff8d963a23db"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.433249 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.433281 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7s2l\" (UniqueName: \"kubernetes.io/projected/f7a38a20-14af-4bb6-837b-ff8d963a23db-kube-api-access-r7s2l\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.433291 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.433300 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7a38a20-14af-4bb6-837b-ff8d963a23db-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.457234 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-config-data" (OuterVolumeSpecName: "config-data") pod "f7a38a20-14af-4bb6-837b-ff8d963a23db" (UID: "f7a38a20-14af-4bb6-837b-ff8d963a23db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.471210 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7a38a20-14af-4bb6-837b-ff8d963a23db" (UID: "f7a38a20-14af-4bb6-837b-ff8d963a23db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.483144 4771 generic.go:334] "Generic (PLEG): container finished" podID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerID="cf44c0f0303ceeed0b2c620f4aa0d0b8e0d088bfb5d009114b9968c51752a1a7" exitCode=0 Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.483462 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"20b982fd-3b6b-487a-9ec1-956360ee92d9","Type":"ContainerDied","Data":"cf44c0f0303ceeed0b2c620f4aa0d0b8e0d088bfb5d009114b9968c51752a1a7"} Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.491927 4771 generic.go:334] "Generic (PLEG): container finished" podID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerID="81928f25ef40f7c96ea048cc2dcbe4ec0c56577d24a6353cac6b73cbbf4e6ed5" exitCode=0 Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.491992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd457cb8d-5vbsm" event={"ID":"fcb1436d-c34f-4211-b6f9-72a78630334d","Type":"ContainerDied","Data":"81928f25ef40f7c96ea048cc2dcbe4ec0c56577d24a6353cac6b73cbbf4e6ed5"} Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501788 4771 generic.go:334] "Generic (PLEG): container finished" podID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerID="759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3" exitCode=0 Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501816 4771 generic.go:334] "Generic (PLEG): container finished" podID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerID="ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa" exitCode=2 Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501824 4771 generic.go:334] "Generic (PLEG): container finished" podID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerID="e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60" exitCode=0 Feb 19 21:50:14 crc kubenswrapper[4771]: 
I0219 21:50:14.501831 4771 generic.go:334] "Generic (PLEG): container finished" podID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerID="dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d" exitCode=0 Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501849 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerDied","Data":"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3"} Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501872 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerDied","Data":"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa"} Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501882 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerDied","Data":"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60"} Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501890 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerDied","Data":"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d"} Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7a38a20-14af-4bb6-837b-ff8d963a23db","Type":"ContainerDied","Data":"9c2352c4685f5630245538848dc98e41f83addb66901dd74f723837b09aa52de"} Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.501916 4771 scope.go:117] "RemoveContainer" containerID="759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.502062 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.527408 4771 scope.go:117] "RemoveContainer" containerID="ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.534642 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.534674 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a38a20-14af-4bb6-837b-ff8d963a23db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.538116 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.553145 4771 scope.go:117] "RemoveContainer" containerID="e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.565623 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.580738 4771 scope.go:117] "RemoveContainer" containerID="dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.598071 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.614716 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.615160 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="ceilometer-notification-agent" Feb 19 21:50:14 crc 
kubenswrapper[4771]: I0219 21:50:14.615179 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="ceilometer-notification-agent" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.615194 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerName="neutron-httpd" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615201 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerName="neutron-httpd" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.615223 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="proxy-httpd" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615229 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="proxy-httpd" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.615238 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerName="neutron-api" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615244 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerName="neutron-api" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.615255 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="ceilometer-central-agent" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615261 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="ceilometer-central-agent" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.615274 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="sg-core" Feb 19 21:50:14 crc 
kubenswrapper[4771]: I0219 21:50:14.615282 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="sg-core" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615472 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerName="neutron-httpd" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615489 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="ceilometer-central-agent" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615501 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="sg-core" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615511 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="proxy-httpd" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615525 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" containerName="neutron-api" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.615536 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" containerName="ceilometer-notification-agent" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.623693 4771 scope.go:117] "RemoveContainer" containerID="759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.624226 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": container with ID starting with 759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3 not found: ID does not exist" 
containerID="759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.624256 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3"} err="failed to get container status \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": rpc error: code = NotFound desc = could not find container \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": container with ID starting with 759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3 not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.624287 4771 scope.go:117] "RemoveContainer" containerID="ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.625504 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": container with ID starting with ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa not found: ID does not exist" containerID="ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.625635 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa"} err="failed to get container status \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": rpc error: code = NotFound desc = could not find container \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": container with ID starting with ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.625722 4771 scope.go:117] 
"RemoveContainer" containerID="e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.626087 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": container with ID starting with e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60 not found: ID does not exist" containerID="e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.626171 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60"} err="failed to get container status \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": rpc error: code = NotFound desc = could not find container \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": container with ID starting with e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60 not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.626271 4771 scope.go:117] "RemoveContainer" containerID="dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d" Feb 19 21:50:14 crc kubenswrapper[4771]: E0219 21:50:14.626613 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": container with ID starting with dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d not found: ID does not exist" containerID="dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.626743 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d"} err="failed to get container status \"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": rpc error: code = NotFound desc = could not find container \"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": container with ID starting with dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.626805 4771 scope.go:117] "RemoveContainer" containerID="759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.627128 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3"} err="failed to get container status \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": rpc error: code = NotFound desc = could not find container \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": container with ID starting with 759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3 not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.627208 4771 scope.go:117] "RemoveContainer" containerID="ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.627566 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.627565 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa"} err="failed to get container status \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": rpc error: code = NotFound desc = could not find container \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": container with ID starting with ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.627702 4771 scope.go:117] "RemoveContainer" containerID="e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.627986 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60"} err="failed to get container status \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": rpc error: code = NotFound desc = could not find container \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": container with ID starting with e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60 not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.628039 4771 scope.go:117] "RemoveContainer" containerID="dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.628381 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d"} err="failed to get container status \"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": rpc error: code = NotFound desc = could not find container 
\"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": container with ID starting with dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.628463 4771 scope.go:117] "RemoveContainer" containerID="759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.628688 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3"} err="failed to get container status \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": rpc error: code = NotFound desc = could not find container \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": container with ID starting with 759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3 not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.628755 4771 scope.go:117] "RemoveContainer" containerID="ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.629729 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa"} err="failed to get container status \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": rpc error: code = NotFound desc = could not find container \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": container with ID starting with ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.629772 4771 scope.go:117] "RemoveContainer" containerID="e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.630121 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60"} err="failed to get container status \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": rpc error: code = NotFound desc = could not find container \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": container with ID starting with e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60 not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.630188 4771 scope.go:117] "RemoveContainer" containerID="dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.630473 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d"} err="failed to get container status \"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": rpc error: code = NotFound desc = could not find container \"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": container with ID starting with dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.630542 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.630580 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.630551 4771 scope.go:117] "RemoveContainer" containerID="759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.631292 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3"} err="failed to get container status \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": rpc error: code = NotFound desc = could not find container \"759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3\": container with ID starting with 759582f562b9fc862bebd27cb86518acfce60e247216f0c3a6396f73b59ff6d3 not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.631337 4771 scope.go:117] "RemoveContainer" containerID="ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.631533 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa"} err="failed to get container status \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": rpc error: code = NotFound desc = could not find container \"ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa\": container with ID starting with ccb707028fff4a7f8ccc5a06ad98554a2dac2197739eed89ba82fdc5f20d4aaa not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.631602 4771 scope.go:117] "RemoveContainer" containerID="e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.631869 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60"} err="failed to get container status \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": rpc error: code = NotFound desc = could not find container \"e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60\": container with ID starting with e36699547564fbf6f040cc7e9f3f8754a9963a512e00e853dd09284fcd5fcb60 not found: ID does not 
exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.631904 4771 scope.go:117] "RemoveContainer" containerID="dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.632121 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d"} err="failed to get container status \"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": rpc error: code = NotFound desc = could not find container \"dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d\": container with ID starting with dd7b81ffed5fc1978058535aa93543d1b575348b95bff182bde4b1f812d70f5d not found: ID does not exist" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.649656 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.739572 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf2rg\" (UniqueName: \"kubernetes.io/projected/fcb1436d-c34f-4211-b6f9-72a78630334d-kube-api-access-rf2rg\") pod \"fcb1436d-c34f-4211-b6f9-72a78630334d\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.739628 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-httpd-config\") pod \"fcb1436d-c34f-4211-b6f9-72a78630334d\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.739733 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-ovndb-tls-certs\") pod \"fcb1436d-c34f-4211-b6f9-72a78630334d\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " Feb 
19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.739824 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-combined-ca-bundle\") pod \"fcb1436d-c34f-4211-b6f9-72a78630334d\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.739884 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-config\") pod \"fcb1436d-c34f-4211-b6f9-72a78630334d\" (UID: \"fcb1436d-c34f-4211-b6f9-72a78630334d\") " Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.740118 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-config-data\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.740165 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dlhz\" (UniqueName: \"kubernetes.io/projected/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-kube-api-access-2dlhz\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.740215 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-log-httpd\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.740234 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.740277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-scripts\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.740322 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.740348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-run-httpd\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.761078 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fcb1436d-c34f-4211-b6f9-72a78630334d" (UID: "fcb1436d-c34f-4211-b6f9-72a78630334d"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.761210 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb1436d-c34f-4211-b6f9-72a78630334d-kube-api-access-rf2rg" (OuterVolumeSpecName: "kube-api-access-rf2rg") pod "fcb1436d-c34f-4211-b6f9-72a78630334d" (UID: "fcb1436d-c34f-4211-b6f9-72a78630334d"). InnerVolumeSpecName "kube-api-access-rf2rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.805559 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcb1436d-c34f-4211-b6f9-72a78630334d" (UID: "fcb1436d-c34f-4211-b6f9-72a78630334d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.808135 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-config" (OuterVolumeSpecName: "config") pod "fcb1436d-c34f-4211-b6f9-72a78630334d" (UID: "fcb1436d-c34f-4211-b6f9-72a78630334d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.832627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fcb1436d-c34f-4211-b6f9-72a78630334d" (UID: "fcb1436d-c34f-4211-b6f9-72a78630334d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.841763 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-log-httpd\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.841819 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.841871 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-scripts\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.841941 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.841982 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-run-httpd\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842048 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-config-data\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842102 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dlhz\" (UniqueName: \"kubernetes.io/projected/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-kube-api-access-2dlhz\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842199 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842219 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842231 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf2rg\" (UniqueName: \"kubernetes.io/projected/fcb1436d-c34f-4211-b6f9-72a78630334d-kube-api-access-rf2rg\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842243 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842256 4771 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb1436d-c34f-4211-b6f9-72a78630334d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842237 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-log-httpd\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.842469 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-run-httpd\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.848276 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.849399 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.852752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-config-data\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.858187 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-scripts\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 
21:50:14.862171 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dlhz\" (UniqueName: \"kubernetes.io/projected/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-kube-api-access-2dlhz\") pod \"ceilometer-0\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " pod="openstack/ceilometer-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.933549 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:50:14 crc kubenswrapper[4771]: I0219 21:50:14.947229 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.045636 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-public-tls-certs\") pod \"20b982fd-3b6b-487a-9ec1-956360ee92d9\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.045929 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-logs\") pod \"20b982fd-3b6b-487a-9ec1-956360ee92d9\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.045983 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-httpd-run\") pod \"20b982fd-3b6b-487a-9ec1-956360ee92d9\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.046030 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-config-data\") pod 
\"20b982fd-3b6b-487a-9ec1-956360ee92d9\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.046139 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-scripts\") pod \"20b982fd-3b6b-487a-9ec1-956360ee92d9\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.046160 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42cmd\" (UniqueName: \"kubernetes.io/projected/20b982fd-3b6b-487a-9ec1-956360ee92d9-kube-api-access-42cmd\") pod \"20b982fd-3b6b-487a-9ec1-956360ee92d9\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.046182 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"20b982fd-3b6b-487a-9ec1-956360ee92d9\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.046217 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-combined-ca-bundle\") pod \"20b982fd-3b6b-487a-9ec1-956360ee92d9\" (UID: \"20b982fd-3b6b-487a-9ec1-956360ee92d9\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.055644 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-logs" (OuterVolumeSpecName: "logs") pod "20b982fd-3b6b-487a-9ec1-956360ee92d9" (UID: "20b982fd-3b6b-487a-9ec1-956360ee92d9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.055889 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "20b982fd-3b6b-487a-9ec1-956360ee92d9" (UID: "20b982fd-3b6b-487a-9ec1-956360ee92d9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.096127 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-scripts" (OuterVolumeSpecName: "scripts") pod "20b982fd-3b6b-487a-9ec1-956360ee92d9" (UID: "20b982fd-3b6b-487a-9ec1-956360ee92d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.096265 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b982fd-3b6b-487a-9ec1-956360ee92d9-kube-api-access-42cmd" (OuterVolumeSpecName: "kube-api-access-42cmd") pod "20b982fd-3b6b-487a-9ec1-956360ee92d9" (UID: "20b982fd-3b6b-487a-9ec1-956360ee92d9"). InnerVolumeSpecName "kube-api-access-42cmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.096763 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "20b982fd-3b6b-487a-9ec1-956360ee92d9" (UID: "20b982fd-3b6b-487a-9ec1-956360ee92d9"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.125711 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20b982fd-3b6b-487a-9ec1-956360ee92d9" (UID: "20b982fd-3b6b-487a-9ec1-956360ee92d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.148348 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.148585 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42cmd\" (UniqueName: \"kubernetes.io/projected/20b982fd-3b6b-487a-9ec1-956360ee92d9-kube-api-access-42cmd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.148680 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.148784 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.148876 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.148959 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/20b982fd-3b6b-487a-9ec1-956360ee92d9-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.153057 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "20b982fd-3b6b-487a-9ec1-956360ee92d9" (UID: "20b982fd-3b6b-487a-9ec1-956360ee92d9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.168132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-config-data" (OuterVolumeSpecName: "config-data") pod "20b982fd-3b6b-487a-9ec1-956360ee92d9" (UID: "20b982fd-3b6b-487a-9ec1-956360ee92d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.210330 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.250984 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.251215 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.251293 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b982fd-3b6b-487a-9ec1-956360ee92d9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc 
kubenswrapper[4771]: I0219 21:50:15.436490 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:50:15 crc kubenswrapper[4771]: W0219 21:50:15.479359 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod442a4e73_a4e7_4ea3_8df3_b8d90c84088e.slice/crio-03b5d407c5483b0bfb8d137a8bd07afc8486462373b6c91c1b363ededa48221f WatchSource:0}: Error finding container 03b5d407c5483b0bfb8d137a8bd07afc8486462373b6c91c1b363ededa48221f: Status 404 returned error can't find the container with id 03b5d407c5483b0bfb8d137a8bd07afc8486462373b6c91c1b363ededa48221f Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.479518 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.544843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerStarted","Data":"03b5d407c5483b0bfb8d137a8bd07afc8486462373b6c91c1b363ededa48221f"} Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.545578 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.551835 4771 generic.go:334] "Generic (PLEG): container finished" podID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerID="b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb" exitCode=0 Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.551926 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01893f8d-263f-4c95-a1f9-864e9e655ee8","Type":"ContainerDied","Data":"b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb"} Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.551951 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"01893f8d-263f-4c95-a1f9-864e9e655ee8","Type":"ContainerDied","Data":"f83bd29ceaa6412f47ff847d6746725a4cdebb458b5005b52f86dee15d9ca078"} Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.551968 4771 scope.go:117] "RemoveContainer" containerID="b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.574482 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"20b982fd-3b6b-487a-9ec1-956360ee92d9","Type":"ContainerDied","Data":"b5ccfc745f69f321ce59b9a775fda3e5f2d7f9f91ff2d5fe8e8d4a6d648a82cc"} Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.574511 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.604680 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd457cb8d-5vbsm" event={"ID":"fcb1436d-c34f-4211-b6f9-72a78630334d","Type":"ContainerDied","Data":"abc55d25efe64fd10842ab56b937310df4758ae815d2789b9adf92e464d8444d"} Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.604861 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd457cb8d-5vbsm" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.605795 4771 scope.go:117] "RemoveContainer" containerID="f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.655160 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.661513 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-internal-tls-certs\") pod \"01893f8d-263f-4c95-a1f9-864e9e655ee8\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.661580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"01893f8d-263f-4c95-a1f9-864e9e655ee8\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.661601 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-logs\") pod \"01893f8d-263f-4c95-a1f9-864e9e655ee8\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.661628 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2j47\" (UniqueName: \"kubernetes.io/projected/01893f8d-263f-4c95-a1f9-864e9e655ee8-kube-api-access-j2j47\") pod \"01893f8d-263f-4c95-a1f9-864e9e655ee8\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.661704 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-combined-ca-bundle\") pod \"01893f8d-263f-4c95-a1f9-864e9e655ee8\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.661752 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-scripts\") pod \"01893f8d-263f-4c95-a1f9-864e9e655ee8\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.661782 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-config-data\") pod \"01893f8d-263f-4c95-a1f9-864e9e655ee8\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.661867 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-httpd-run\") pod \"01893f8d-263f-4c95-a1f9-864e9e655ee8\" (UID: \"01893f8d-263f-4c95-a1f9-864e9e655ee8\") " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.662768 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "01893f8d-263f-4c95-a1f9-864e9e655ee8" (UID: 
"01893f8d-263f-4c95-a1f9-864e9e655ee8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.667663 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-logs" (OuterVolumeSpecName: "logs") pod "01893f8d-263f-4c95-a1f9-864e9e655ee8" (UID: "01893f8d-263f-4c95-a1f9-864e9e655ee8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.672047 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-scripts" (OuterVolumeSpecName: "scripts") pod "01893f8d-263f-4c95-a1f9-864e9e655ee8" (UID: "01893f8d-263f-4c95-a1f9-864e9e655ee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.672564 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "01893f8d-263f-4c95-a1f9-864e9e655ee8" (UID: "01893f8d-263f-4c95-a1f9-864e9e655ee8"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.673138 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01893f8d-263f-4c95-a1f9-864e9e655ee8-kube-api-access-j2j47" (OuterVolumeSpecName: "kube-api-access-j2j47") pod "01893f8d-263f-4c95-a1f9-864e9e655ee8" (UID: "01893f8d-263f-4c95-a1f9-864e9e655ee8"). InnerVolumeSpecName "kube-api-access-j2j47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.685127 4771 scope.go:117] "RemoveContainer" containerID="b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.689425 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:15 crc kubenswrapper[4771]: E0219 21:50:15.689633 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb\": container with ID starting with b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb not found: ID does not exist" containerID="b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.689670 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb"} err="failed to get container status \"b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb\": rpc error: code = NotFound desc = could not find container \"b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb\": container with ID starting with b714fdd472ba7dc1fae850dc8e4864270471a43849a047bf0d11fd700d856efb not found: ID does not exist" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.689695 4771 scope.go:117] "RemoveContainer" containerID="f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756" Feb 19 21:50:15 crc kubenswrapper[4771]: E0219 21:50:15.691585 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756\": container with ID starting with f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756 not 
found: ID does not exist" containerID="f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.691727 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756"} err="failed to get container status \"f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756\": rpc error: code = NotFound desc = could not find container \"f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756\": container with ID starting with f67bb05c8266cc62e7079a7a0823b4a278f9878da3bf9f96337e4a3dcbae7756 not found: ID does not exist" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.691853 4771 scope.go:117] "RemoveContainer" containerID="cf44c0f0303ceeed0b2c620f4aa0d0b8e0d088bfb5d009114b9968c51752a1a7" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.696155 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01893f8d-263f-4c95-a1f9-864e9e655ee8" (UID: "01893f8d-263f-4c95-a1f9-864e9e655ee8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.723053 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:15 crc kubenswrapper[4771]: E0219 21:50:15.723540 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerName="glance-log" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.723565 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerName="glance-log" Feb 19 21:50:15 crc kubenswrapper[4771]: E0219 21:50:15.723590 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerName="glance-httpd" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.723600 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerName="glance-httpd" Feb 19 21:50:15 crc kubenswrapper[4771]: E0219 21:50:15.723625 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerName="glance-log" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.723632 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerName="glance-log" Feb 19 21:50:15 crc kubenswrapper[4771]: E0219 21:50:15.723641 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerName="glance-httpd" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.723647 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerName="glance-httpd" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.723796 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerName="glance-httpd" Feb 19 21:50:15 crc 
kubenswrapper[4771]: I0219 21:50:15.723816 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerName="glance-log" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.723839 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" containerName="glance-log" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.723852 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" containerName="glance-httpd" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.724716 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.728592 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.728929 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.745087 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.767263 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.767298 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2j47\" (UniqueName: \"kubernetes.io/projected/01893f8d-263f-4c95-a1f9-864e9e655ee8-kube-api-access-j2j47\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.767310 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.767322 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.767333 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01893f8d-263f-4c95-a1f9-864e9e655ee8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.767360 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.777212 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-config-data" (OuterVolumeSpecName: "config-data") pod "01893f8d-263f-4c95-a1f9-864e9e655ee8" (UID: "01893f8d-263f-4c95-a1f9-864e9e655ee8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.794070 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.840424 4771 scope.go:117] "RemoveContainer" containerID="3bd7ebdb5d9120c17dd8d1df15d7fd4beb48ccd3ef7ec08acaf67e67d0447b11" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.844496 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869080 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-scripts\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869151 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-logs\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869180 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869235 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869266 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp9jm\" (UniqueName: \"kubernetes.io/projected/306067f7-88de-4cdb-8ca2-3540ada9b006-kube-api-access-pp9jm\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869328 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869352 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869399 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-config-data\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869462 4771 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.869475 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.871900 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "01893f8d-263f-4c95-a1f9-864e9e655ee8" (UID: "01893f8d-263f-4c95-a1f9-864e9e655ee8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.928785 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-64d6c64fc4-hcgn4"] Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.938303 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-64d6c64fc4-hcgn4" podUID="eafa04ef-6985-4909-9228-1da911369751" containerName="placement-log" containerID="cri-o://5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06" gracePeriod=30 Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.938774 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-64d6c64fc4-hcgn4" podUID="eafa04ef-6985-4909-9228-1da911369751" containerName="placement-api" containerID="cri-o://00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67" gracePeriod=30 Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.939093 4771 scope.go:117] "RemoveContainer" containerID="56ec40b5a72abb5cddcb68f7c06683a0c85a8037fde988353754a66196601dda" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.949752 4771 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-6bd457cb8d-5vbsm"] Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.968235 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bd457cb8d-5vbsm"] Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.971670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.971738 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp9jm\" (UniqueName: \"kubernetes.io/projected/306067f7-88de-4cdb-8ca2-3540ada9b006-kube-api-access-pp9jm\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.971804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.971828 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.972312 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-config-data\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.972523 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-scripts\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.972571 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-logs\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.972706 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.972759 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01893f8d-263f-4c95-a1f9-864e9e655ee8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.972884 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 
21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.974632 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-logs\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.975087 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.977538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-scripts\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.978520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.979311 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.985437 4771 scope.go:117] "RemoveContainer" 
containerID="81928f25ef40f7c96ea048cc2dcbe4ec0c56577d24a6353cac6b73cbbf4e6ed5" Feb 19 21:50:15 crc kubenswrapper[4771]: I0219 21:50:15.988841 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp9jm\" (UniqueName: \"kubernetes.io/projected/306067f7-88de-4cdb-8ca2-3540ada9b006-kube-api-access-pp9jm\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.003565 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-config-data\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.010155 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " pod="openstack/glance-default-external-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.219701 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.418618 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.457864 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b982fd-3b6b-487a-9ec1-956360ee92d9" path="/var/lib/kubelet/pods/20b982fd-3b6b-487a-9ec1-956360ee92d9/volumes" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.458965 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a38a20-14af-4bb6-837b-ff8d963a23db" path="/var/lib/kubelet/pods/f7a38a20-14af-4bb6-837b-ff8d963a23db/volumes" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.461141 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb1436d-c34f-4211-b6f9-72a78630334d" path="/var/lib/kubelet/pods/fcb1436d-c34f-4211-b6f9-72a78630334d/volumes" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.617750 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.620074 4771 generic.go:334] "Generic (PLEG): container finished" podID="eafa04ef-6985-4909-9228-1da911369751" containerID="5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06" exitCode=143 Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.620141 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d6c64fc4-hcgn4" event={"ID":"eafa04ef-6985-4909-9228-1da911369751","Type":"ContainerDied","Data":"5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06"} Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.646501 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.658373 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.696828 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.698346 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.703812 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.704213 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.705052 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.781005 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:16 crc kubenswrapper[4771]: W0219 21:50:16.783372 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod306067f7_88de_4cdb_8ca2_3540ada9b006.slice/crio-fe0314b3b1991ef8347004b1f018c7add23f262bf102c2bbac1322abfbbc647e WatchSource:0}: Error finding container fe0314b3b1991ef8347004b1f018c7add23f262bf102c2bbac1322abfbbc647e: Status 404 returned error can't find the container with id fe0314b3b1991ef8347004b1f018c7add23f262bf102c2bbac1322abfbbc647e Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.793889 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.793982 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.794065 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.794156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.794193 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.794255 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz9vt\" (UniqueName: \"kubernetes.io/projected/b64cc52a-4f20-4e05-b444-c46f97727527-kube-api-access-vz9vt\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.794285 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.794312 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.895753 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-logs\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.895847 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.895904 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.895935 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.895991 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz9vt\" (UniqueName: \"kubernetes.io/projected/b64cc52a-4f20-4e05-b444-c46f97727527-kube-api-access-vz9vt\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.896037 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.896065 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.896122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.896220 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-logs\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 
21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.896593 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.897303 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.914703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.921641 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.922564 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.925650 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.936690 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz9vt\" (UniqueName: \"kubernetes.io/projected/b64cc52a-4f20-4e05-b444-c46f97727527-kube-api-access-vz9vt\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:16 crc kubenswrapper[4771]: I0219 21:50:16.980684 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:50:17 crc kubenswrapper[4771]: I0219 21:50:17.028418 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:17 crc kubenswrapper[4771]: I0219 21:50:17.640364 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"306067f7-88de-4cdb-8ca2-3540ada9b006","Type":"ContainerStarted","Data":"fe0314b3b1991ef8347004b1f018c7add23f262bf102c2bbac1322abfbbc647e"} Feb 19 21:50:17 crc kubenswrapper[4771]: I0219 21:50:17.647407 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:17 crc kubenswrapper[4771]: W0219 21:50:17.663527 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb64cc52a_4f20_4e05_b444_c46f97727527.slice/crio-148169a0c855d492d63ceb6c713da40443055b3a4aa0abada920d121f7ae1f20 WatchSource:0}: Error finding container 148169a0c855d492d63ceb6c713da40443055b3a4aa0abada920d121f7ae1f20: Status 404 returned error can't find the container with id 148169a0c855d492d63ceb6c713da40443055b3a4aa0abada920d121f7ae1f20 Feb 19 21:50:18 crc kubenswrapper[4771]: I0219 21:50:18.447584 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01893f8d-263f-4c95-a1f9-864e9e655ee8" path="/var/lib/kubelet/pods/01893f8d-263f-4c95-a1f9-864e9e655ee8/volumes" Feb 19 21:50:18 crc kubenswrapper[4771]: I0219 21:50:18.649334 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b64cc52a-4f20-4e05-b444-c46f97727527","Type":"ContainerStarted","Data":"148169a0c855d492d63ceb6c713da40443055b3a4aa0abada920d121f7ae1f20"} Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.466868 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.550867 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h2z9\" (UniqueName: \"kubernetes.io/projected/eafa04ef-6985-4909-9228-1da911369751-kube-api-access-7h2z9\") pod \"eafa04ef-6985-4909-9228-1da911369751\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.550959 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa04ef-6985-4909-9228-1da911369751-logs\") pod \"eafa04ef-6985-4909-9228-1da911369751\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.551003 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-internal-tls-certs\") pod \"eafa04ef-6985-4909-9228-1da911369751\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.551134 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-public-tls-certs\") pod \"eafa04ef-6985-4909-9228-1da911369751\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.551162 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-scripts\") pod \"eafa04ef-6985-4909-9228-1da911369751\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.551185 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-combined-ca-bundle\") pod \"eafa04ef-6985-4909-9228-1da911369751\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.551236 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-config-data\") pod \"eafa04ef-6985-4909-9228-1da911369751\" (UID: \"eafa04ef-6985-4909-9228-1da911369751\") " Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.556736 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eafa04ef-6985-4909-9228-1da911369751-logs" (OuterVolumeSpecName: "logs") pod "eafa04ef-6985-4909-9228-1da911369751" (UID: "eafa04ef-6985-4909-9228-1da911369751"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.574160 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-scripts" (OuterVolumeSpecName: "scripts") pod "eafa04ef-6985-4909-9228-1da911369751" (UID: "eafa04ef-6985-4909-9228-1da911369751"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.576839 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafa04ef-6985-4909-9228-1da911369751-kube-api-access-7h2z9" (OuterVolumeSpecName: "kube-api-access-7h2z9") pod "eafa04ef-6985-4909-9228-1da911369751" (UID: "eafa04ef-6985-4909-9228-1da911369751"). InnerVolumeSpecName "kube-api-access-7h2z9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.648121 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eafa04ef-6985-4909-9228-1da911369751" (UID: "eafa04ef-6985-4909-9228-1da911369751"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.650595 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-config-data" (OuterVolumeSpecName: "config-data") pod "eafa04ef-6985-4909-9228-1da911369751" (UID: "eafa04ef-6985-4909-9228-1da911369751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.652947 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.652971 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.652982 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.652991 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h2z9\" (UniqueName: \"kubernetes.io/projected/eafa04ef-6985-4909-9228-1da911369751-kube-api-access-7h2z9\") on node \"crc\" DevicePath \"\"" Feb 
19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.653001 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafa04ef-6985-4909-9228-1da911369751-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.682323 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b64cc52a-4f20-4e05-b444-c46f97727527","Type":"ContainerStarted","Data":"8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb"} Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.682371 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b64cc52a-4f20-4e05-b444-c46f97727527","Type":"ContainerStarted","Data":"4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d"} Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.687128 4771 generic.go:334] "Generic (PLEG): container finished" podID="eafa04ef-6985-4909-9228-1da911369751" containerID="00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67" exitCode=0 Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.687178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d6c64fc4-hcgn4" event={"ID":"eafa04ef-6985-4909-9228-1da911369751","Type":"ContainerDied","Data":"00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67"} Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.687203 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d6c64fc4-hcgn4" event={"ID":"eafa04ef-6985-4909-9228-1da911369751","Type":"ContainerDied","Data":"cc88aab17c914b0b17ceabccae14c5de86203d6e91ddee7dc43a97254a95a585"} Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.687233 4771 scope.go:117] "RemoveContainer" containerID="00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.687347 4771 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64d6c64fc4-hcgn4" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.697447 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"306067f7-88de-4cdb-8ca2-3540ada9b006","Type":"ContainerStarted","Data":"3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0"} Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.697485 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"306067f7-88de-4cdb-8ca2-3540ada9b006","Type":"ContainerStarted","Data":"286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267"} Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.699635 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerStarted","Data":"4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3"} Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.736781 4771 scope.go:117] "RemoveContainer" containerID="5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.739445 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.7393966770000002 podStartE2EDuration="3.739396677s" podCreationTimestamp="2026-02-19 21:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:50:19.70583891 +0000 UTC m=+1319.977281380" watchObservedRunningTime="2026-02-19 21:50:19.739396677 +0000 UTC m=+1320.010839157" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.740681 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eafa04ef-6985-4909-9228-1da911369751" (UID: "eafa04ef-6985-4909-9228-1da911369751"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.742399 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eafa04ef-6985-4909-9228-1da911369751" (UID: "eafa04ef-6985-4909-9228-1da911369751"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.745205 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.745193511 podStartE2EDuration="4.745193511s" podCreationTimestamp="2026-02-19 21:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:50:19.73280342 +0000 UTC m=+1320.004245900" watchObservedRunningTime="2026-02-19 21:50:19.745193511 +0000 UTC m=+1320.016635971" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.754203 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.754447 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafa04ef-6985-4909-9228-1da911369751-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.761237 4771 scope.go:117] "RemoveContainer" 
containerID="00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67" Feb 19 21:50:19 crc kubenswrapper[4771]: E0219 21:50:19.763612 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67\": container with ID starting with 00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67 not found: ID does not exist" containerID="00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.763673 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67"} err="failed to get container status \"00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67\": rpc error: code = NotFound desc = could not find container \"00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67\": container with ID starting with 00cbfaee9d7cda601b3c13c1047171d6c7c738127bd3b70f0889541dd1534b67 not found: ID does not exist" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.763699 4771 scope.go:117] "RemoveContainer" containerID="5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06" Feb 19 21:50:19 crc kubenswrapper[4771]: E0219 21:50:19.764300 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06\": container with ID starting with 5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06 not found: ID does not exist" containerID="5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06" Feb 19 21:50:19 crc kubenswrapper[4771]: I0219 21:50:19.764327 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06"} err="failed to get container status \"5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06\": rpc error: code = NotFound desc = could not find container \"5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06\": container with ID starting with 5f2c6be42272d0a5a443b8ef08b3aa1f5acd8ec3d631d294a6b69b83b4facc06 not found: ID does not exist" Feb 19 21:50:20 crc kubenswrapper[4771]: I0219 21:50:20.017676 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-64d6c64fc4-hcgn4"] Feb 19 21:50:20 crc kubenswrapper[4771]: I0219 21:50:20.024955 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-64d6c64fc4-hcgn4"] Feb 19 21:50:20 crc kubenswrapper[4771]: I0219 21:50:20.450971 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafa04ef-6985-4909-9228-1da911369751" path="/var/lib/kubelet/pods/eafa04ef-6985-4909-9228-1da911369751/volumes" Feb 19 21:50:20 crc kubenswrapper[4771]: I0219 21:50:20.713861 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerStarted","Data":"cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f"} Feb 19 21:50:20 crc kubenswrapper[4771]: I0219 21:50:20.713902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerStarted","Data":"7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a"} Feb 19 21:50:22 crc kubenswrapper[4771]: I0219 21:50:22.753490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerStarted","Data":"4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664"} Feb 19 21:50:22 crc kubenswrapper[4771]: I0219 21:50:22.754071 
4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="ceilometer-central-agent" containerID="cri-o://4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3" gracePeriod=30 Feb 19 21:50:22 crc kubenswrapper[4771]: I0219 21:50:22.755003 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="sg-core" containerID="cri-o://cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f" gracePeriod=30 Feb 19 21:50:22 crc kubenswrapper[4771]: I0219 21:50:22.755150 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="proxy-httpd" containerID="cri-o://4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664" gracePeriod=30 Feb 19 21:50:22 crc kubenswrapper[4771]: I0219 21:50:22.755209 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:50:22 crc kubenswrapper[4771]: I0219 21:50:22.755232 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="ceilometer-notification-agent" containerID="cri-o://7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a" gracePeriod=30 Feb 19 21:50:22 crc kubenswrapper[4771]: I0219 21:50:22.777391 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.23226468 podStartE2EDuration="8.777377267s" podCreationTimestamp="2026-02-19 21:50:14 +0000 UTC" firstStartedPulling="2026-02-19 21:50:15.482587183 +0000 UTC m=+1315.754029643" lastFinishedPulling="2026-02-19 21:50:22.02769975 +0000 UTC m=+1322.299142230" observedRunningTime="2026-02-19 21:50:22.774637694 +0000 
UTC m=+1323.046080154" watchObservedRunningTime="2026-02-19 21:50:22.777377267 +0000 UTC m=+1323.048819737" Feb 19 21:50:23 crc kubenswrapper[4771]: I0219 21:50:23.765373 4771 generic.go:334] "Generic (PLEG): container finished" podID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerID="4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664" exitCode=0 Feb 19 21:50:23 crc kubenswrapper[4771]: I0219 21:50:23.765690 4771 generic.go:334] "Generic (PLEG): container finished" podID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerID="cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f" exitCode=2 Feb 19 21:50:23 crc kubenswrapper[4771]: I0219 21:50:23.765698 4771 generic.go:334] "Generic (PLEG): container finished" podID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerID="7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a" exitCode=0 Feb 19 21:50:23 crc kubenswrapper[4771]: I0219 21:50:23.765439 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerDied","Data":"4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664"} Feb 19 21:50:23 crc kubenswrapper[4771]: I0219 21:50:23.765727 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerDied","Data":"cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f"} Feb 19 21:50:23 crc kubenswrapper[4771]: I0219 21:50:23.765742 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerDied","Data":"7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a"} Feb 19 21:50:24 crc kubenswrapper[4771]: I0219 21:50:24.776236 4771 generic.go:334] "Generic (PLEG): container finished" podID="d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" 
containerID="70ccbec8c9a37573cbcbf07ef46b413724db3b6ad9381cf0ff63195ae6ffb18a" exitCode=0 Feb 19 21:50:24 crc kubenswrapper[4771]: I0219 21:50:24.776286 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" event={"ID":"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02","Type":"ContainerDied","Data":"70ccbec8c9a37573cbcbf07ef46b413724db3b6ad9381cf0ff63195ae6ffb18a"} Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.221137 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.221736 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.241692 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.272301 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.273132 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.285550 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-combined-ca-bundle\") pod \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.285631 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-config-data\") pod \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\" (UID: 
\"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.285674 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-scripts\") pod \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.285730 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j24td\" (UniqueName: \"kubernetes.io/projected/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-kube-api-access-j24td\") pod \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\" (UID: \"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02\") " Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.310678 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-kube-api-access-j24td" (OuterVolumeSpecName: "kube-api-access-j24td") pod "d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" (UID: "d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02"). InnerVolumeSpecName "kube-api-access-j24td". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.311231 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-scripts" (OuterVolumeSpecName: "scripts") pod "d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" (UID: "d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.343392 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" (UID: "d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.346706 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-config-data" (OuterVolumeSpecName: "config-data") pod "d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" (UID: "d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.388355 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.388533 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.388619 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.388687 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j24td\" (UniqueName: \"kubernetes.io/projected/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02-kube-api-access-j24td\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.828740 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" event={"ID":"d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02","Type":"ContainerDied","Data":"82876872656d7c8e6760d9625d2ac75dcc364b82135514d874c849b40abef29f"} Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.828804 4771 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82876872656d7c8e6760d9625d2ac75dcc364b82135514d874c849b40abef29f" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.828769 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rl7qm" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.829436 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.829486 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.929558 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:50:26 crc kubenswrapper[4771]: E0219 21:50:26.930408 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" containerName="nova-cell0-conductor-db-sync" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.930437 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" containerName="nova-cell0-conductor-db-sync" Feb 19 21:50:26 crc kubenswrapper[4771]: E0219 21:50:26.930473 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafa04ef-6985-4909-9228-1da911369751" containerName="placement-api" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.930487 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafa04ef-6985-4909-9228-1da911369751" containerName="placement-api" Feb 19 21:50:26 crc kubenswrapper[4771]: E0219 21:50:26.930507 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafa04ef-6985-4909-9228-1da911369751" containerName="placement-log" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.930520 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eafa04ef-6985-4909-9228-1da911369751" containerName="placement-log" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.930820 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eafa04ef-6985-4909-9228-1da911369751" containerName="placement-api" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.930869 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eafa04ef-6985-4909-9228-1da911369751" containerName="placement-log" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.930888 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" containerName="nova-cell0-conductor-db-sync" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.931776 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.934370 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5thxh" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.935247 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 21:50:26 crc kubenswrapper[4771]: I0219 21:50:26.938705 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.000709 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.000812 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57vfd\" (UniqueName: 
\"kubernetes.io/projected/a3597602-2a6d-4a26-87d7-77696a25cef5-kube-api-access-57vfd\") pod \"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.000924 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.029709 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.029797 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.071237 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.080871 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.105254 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.105650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-config-data\") pod 
\"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.105982 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57vfd\" (UniqueName: \"kubernetes.io/projected/a3597602-2a6d-4a26-87d7-77696a25cef5-kube-api-access-57vfd\") pod \"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.113230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.118379 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.130487 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57vfd\" (UniqueName: \"kubernetes.io/projected/a3597602-2a6d-4a26-87d7-77696a25cef5-kube-api-access-57vfd\") pod \"nova-cell0-conductor-0\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.252363 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.428636 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.615352 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-combined-ca-bundle\") pod \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.615738 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-run-httpd\") pod \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.615897 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-config-data\") pod \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.616061 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-scripts\") pod \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.616213 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dlhz\" (UniqueName: \"kubernetes.io/projected/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-kube-api-access-2dlhz\") pod \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.616240 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "442a4e73-a4e7-4ea3-8df3-b8d90c84088e" (UID: "442a4e73-a4e7-4ea3-8df3-b8d90c84088e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.616629 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-log-httpd\") pod \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.618456 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-sg-core-conf-yaml\") pod \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\" (UID: \"442a4e73-a4e7-4ea3-8df3-b8d90c84088e\") " Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.619265 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.617247 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "442a4e73-a4e7-4ea3-8df3-b8d90c84088e" (UID: "442a4e73-a4e7-4ea3-8df3-b8d90c84088e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.620470 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-kube-api-access-2dlhz" (OuterVolumeSpecName: "kube-api-access-2dlhz") pod "442a4e73-a4e7-4ea3-8df3-b8d90c84088e" (UID: "442a4e73-a4e7-4ea3-8df3-b8d90c84088e"). InnerVolumeSpecName "kube-api-access-2dlhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.621038 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-scripts" (OuterVolumeSpecName: "scripts") pod "442a4e73-a4e7-4ea3-8df3-b8d90c84088e" (UID: "442a4e73-a4e7-4ea3-8df3-b8d90c84088e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.644269 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "442a4e73-a4e7-4ea3-8df3-b8d90c84088e" (UID: "442a4e73-a4e7-4ea3-8df3-b8d90c84088e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.689701 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "442a4e73-a4e7-4ea3-8df3-b8d90c84088e" (UID: "442a4e73-a4e7-4ea3-8df3-b8d90c84088e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.721374 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.721710 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.721851 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dlhz\" (UniqueName: \"kubernetes.io/projected/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-kube-api-access-2dlhz\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.721962 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.722074 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.723871 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-config-data" (OuterVolumeSpecName: "config-data") pod "442a4e73-a4e7-4ea3-8df3-b8d90c84088e" (UID: "442a4e73-a4e7-4ea3-8df3-b8d90c84088e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.759949 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.823134 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442a4e73-a4e7-4ea3-8df3-b8d90c84088e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.839695 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3597602-2a6d-4a26-87d7-77696a25cef5","Type":"ContainerStarted","Data":"2ca0ecb5780daad37971ca8c53acf54d14efed1c7a0a304c934b2054c8dbeccb"} Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.842490 4771 generic.go:334] "Generic (PLEG): container finished" podID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerID="4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3" exitCode=0 Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.842558 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerDied","Data":"4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3"} Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.842579 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"442a4e73-a4e7-4ea3-8df3-b8d90c84088e","Type":"ContainerDied","Data":"03b5d407c5483b0bfb8d137a8bd07afc8486462373b6c91c1b363ededa48221f"} Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.842595 4771 scope.go:117] "RemoveContainer" containerID="4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.843059 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:27 
crc kubenswrapper[4771]: I0219 21:50:27.843099 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.844374 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.903367 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.910747 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.924370 4771 scope.go:117] "RemoveContainer" containerID="cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.943436 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:27 crc kubenswrapper[4771]: E0219 21:50:27.943964 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="ceilometer-notification-agent" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.944046 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="ceilometer-notification-agent" Feb 19 21:50:27 crc kubenswrapper[4771]: E0219 21:50:27.944117 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="ceilometer-central-agent" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.944167 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="ceilometer-central-agent" Feb 19 21:50:27 crc kubenswrapper[4771]: E0219 21:50:27.944230 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="proxy-httpd" 
Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.944285 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="proxy-httpd" Feb 19 21:50:27 crc kubenswrapper[4771]: E0219 21:50:27.944411 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="sg-core" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.944469 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="sg-core" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.944683 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="ceilometer-central-agent" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.944745 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="proxy-httpd" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.944805 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="ceilometer-notification-agent" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.944873 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" containerName="sg-core" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.946522 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.949012 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:50:27 crc kubenswrapper[4771]: I0219 21:50:27.949473 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.001399 4771 scope.go:117] "RemoveContainer" containerID="7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.001551 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.086821 4771 scope.go:117] "RemoveContainer" containerID="4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.136140 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-run-httpd\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.136237 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.136266 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " 
pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.136292 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-scripts\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.136314 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-config-data\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.136346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-log-httpd\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.136361 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlhj4\" (UniqueName: \"kubernetes.io/projected/425ee156-5d76-44b9-be17-19470181ee6e-kube-api-access-nlhj4\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.151224 4771 scope.go:117] "RemoveContainer" containerID="4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664" Feb 19 21:50:28 crc kubenswrapper[4771]: E0219 21:50:28.158162 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664\": container with ID starting with 
4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664 not found: ID does not exist" containerID="4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.158213 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664"} err="failed to get container status \"4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664\": rpc error: code = NotFound desc = could not find container \"4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664\": container with ID starting with 4c9832f673d30bfe88e2510fc23d4ec54eed2425f7a8bc38f35e5e6e44b30664 not found: ID does not exist" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.158244 4771 scope.go:117] "RemoveContainer" containerID="cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f" Feb 19 21:50:28 crc kubenswrapper[4771]: E0219 21:50:28.162206 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f\": container with ID starting with cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f not found: ID does not exist" containerID="cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.162262 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f"} err="failed to get container status \"cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f\": rpc error: code = NotFound desc = could not find container \"cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f\": container with ID starting with cc3b9f65bfe84781422a70d2f59e8c520f6dd6654894443863a91ed4920c255f not found: ID does not 
exist" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.162288 4771 scope.go:117] "RemoveContainer" containerID="7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a" Feb 19 21:50:28 crc kubenswrapper[4771]: E0219 21:50:28.164974 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a\": container with ID starting with 7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a not found: ID does not exist" containerID="7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.165009 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a"} err="failed to get container status \"7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a\": rpc error: code = NotFound desc = could not find container \"7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a\": container with ID starting with 7d0c538460a794ef7401e0d73ee47f12d0886c7129353048385c627efda2eb8a not found: ID does not exist" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.165033 4771 scope.go:117] "RemoveContainer" containerID="4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3" Feb 19 21:50:28 crc kubenswrapper[4771]: E0219 21:50:28.174196 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3\": container with ID starting with 4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3 not found: ID does not exist" containerID="4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.174249 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3"} err="failed to get container status \"4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3\": rpc error: code = NotFound desc = could not find container \"4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3\": container with ID starting with 4da5a5e9a3130ee62f31197c94bec43753c35b102a232d22e9befbe8938376c3 not found: ID does not exist" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.238250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.238294 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.238324 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-scripts\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.238350 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-config-data\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.238384 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-log-httpd\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.238402 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlhj4\" (UniqueName: \"kubernetes.io/projected/425ee156-5d76-44b9-be17-19470181ee6e-kube-api-access-nlhj4\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.238443 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-run-httpd\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.238823 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-run-httpd\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.239037 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-log-httpd\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.242971 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " 
pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.243815 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-config-data\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.248699 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.252745 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-scripts\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.260449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlhj4\" (UniqueName: \"kubernetes.io/projected/425ee156-5d76-44b9-be17-19470181ee6e-kube-api-access-nlhj4\") pod \"ceilometer-0\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.288595 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.292755 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.466915 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="442a4e73-a4e7-4ea3-8df3-b8d90c84088e" path="/var/lib/kubelet/pods/442a4e73-a4e7-4ea3-8df3-b8d90c84088e/volumes" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.725912 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.846163 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.856613 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3597602-2a6d-4a26-87d7-77696a25cef5","Type":"ContainerStarted","Data":"894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423"} Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.856672 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.857089 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:28 crc kubenswrapper[4771]: I0219 21:50:28.872669 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.872651427 podStartE2EDuration="2.872651427s" podCreationTimestamp="2026-02-19 21:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:50:28.871838575 +0000 UTC m=+1329.143281045" watchObservedRunningTime="2026-02-19 21:50:28.872651427 +0000 UTC m=+1329.144093907" Feb 19 21:50:28 crc 
kubenswrapper[4771]: I0219 21:50:28.937492 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:50:29 crc kubenswrapper[4771]: I0219 21:50:29.785074 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:29 crc kubenswrapper[4771]: I0219 21:50:29.786962 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:29 crc kubenswrapper[4771]: I0219 21:50:29.867713 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerStarted","Data":"a5796140ce3930d7b94fb683cb19c12cc003c704cf8e67532f634e9d756889c5"} Feb 19 21:50:29 crc kubenswrapper[4771]: I0219 21:50:29.867965 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerStarted","Data":"fcdf3e51cf007b9b682471f5cc2e29000ecada729be903bfdd691acd7fb697c2"} Feb 19 21:50:29 crc kubenswrapper[4771]: I0219 21:50:29.868091 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" containerName="nova-cell0-conductor-conductor" containerID="cri-o://894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" gracePeriod=30 Feb 19 21:50:30 crc kubenswrapper[4771]: I0219 21:50:30.194294 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:30 crc kubenswrapper[4771]: I0219 21:50:30.875494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerStarted","Data":"61e8ff3a16ec07649a86fd400b5feb97eb45b4f9d4b821ff90889910db873526"} Feb 19 21:50:30 crc kubenswrapper[4771]: I0219 
21:50:30.875919 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerStarted","Data":"90910b92ad5d25595ba949dfa25fee5c2d747bc5169c6de1b10663990fa5d88d"} Feb 19 21:50:32 crc kubenswrapper[4771]: I0219 21:50:32.915627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerStarted","Data":"fa19b0f7844363da2985f827a1a7fb435bcfc32e30dc3debb4d0f287ad46a165"} Feb 19 21:50:32 crc kubenswrapper[4771]: I0219 21:50:32.917259 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:50:32 crc kubenswrapper[4771]: I0219 21:50:32.916660 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="proxy-httpd" containerID="cri-o://fa19b0f7844363da2985f827a1a7fb435bcfc32e30dc3debb4d0f287ad46a165" gracePeriod=30 Feb 19 21:50:32 crc kubenswrapper[4771]: I0219 21:50:32.916687 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="sg-core" containerID="cri-o://61e8ff3a16ec07649a86fd400b5feb97eb45b4f9d4b821ff90889910db873526" gracePeriod=30 Feb 19 21:50:32 crc kubenswrapper[4771]: I0219 21:50:32.916704 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="ceilometer-notification-agent" containerID="cri-o://90910b92ad5d25595ba949dfa25fee5c2d747bc5169c6de1b10663990fa5d88d" gracePeriod=30 Feb 19 21:50:32 crc kubenswrapper[4771]: I0219 21:50:32.915975 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425ee156-5d76-44b9-be17-19470181ee6e" 
containerName="ceilometer-central-agent" containerID="cri-o://a5796140ce3930d7b94fb683cb19c12cc003c704cf8e67532f634e9d756889c5" gracePeriod=30 Feb 19 21:50:32 crc kubenswrapper[4771]: I0219 21:50:32.953953 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.111839783 podStartE2EDuration="5.953935851s" podCreationTimestamp="2026-02-19 21:50:27 +0000 UTC" firstStartedPulling="2026-02-19 21:50:28.852402236 +0000 UTC m=+1329.123844716" lastFinishedPulling="2026-02-19 21:50:31.694498274 +0000 UTC m=+1331.965940784" observedRunningTime="2026-02-19 21:50:32.943611115 +0000 UTC m=+1333.215053615" watchObservedRunningTime="2026-02-19 21:50:32.953935851 +0000 UTC m=+1333.225378321" Feb 19 21:50:33 crc kubenswrapper[4771]: I0219 21:50:33.932067 4771 generic.go:334] "Generic (PLEG): container finished" podID="425ee156-5d76-44b9-be17-19470181ee6e" containerID="fa19b0f7844363da2985f827a1a7fb435bcfc32e30dc3debb4d0f287ad46a165" exitCode=0 Feb 19 21:50:33 crc kubenswrapper[4771]: I0219 21:50:33.932327 4771 generic.go:334] "Generic (PLEG): container finished" podID="425ee156-5d76-44b9-be17-19470181ee6e" containerID="61e8ff3a16ec07649a86fd400b5feb97eb45b4f9d4b821ff90889910db873526" exitCode=2 Feb 19 21:50:33 crc kubenswrapper[4771]: I0219 21:50:33.932342 4771 generic.go:334] "Generic (PLEG): container finished" podID="425ee156-5d76-44b9-be17-19470181ee6e" containerID="90910b92ad5d25595ba949dfa25fee5c2d747bc5169c6de1b10663990fa5d88d" exitCode=0 Feb 19 21:50:33 crc kubenswrapper[4771]: I0219 21:50:33.932123 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerDied","Data":"fa19b0f7844363da2985f827a1a7fb435bcfc32e30dc3debb4d0f287ad46a165"} Feb 19 21:50:33 crc kubenswrapper[4771]: I0219 21:50:33.932390 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerDied","Data":"61e8ff3a16ec07649a86fd400b5feb97eb45b4f9d4b821ff90889910db873526"} Feb 19 21:50:33 crc kubenswrapper[4771]: I0219 21:50:33.932409 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerDied","Data":"90910b92ad5d25595ba949dfa25fee5c2d747bc5169c6de1b10663990fa5d88d"} Feb 19 21:50:36 crc kubenswrapper[4771]: I0219 21:50:36.972637 4771 generic.go:334] "Generic (PLEG): container finished" podID="425ee156-5d76-44b9-be17-19470181ee6e" containerID="a5796140ce3930d7b94fb683cb19c12cc003c704cf8e67532f634e9d756889c5" exitCode=0 Feb 19 21:50:36 crc kubenswrapper[4771]: I0219 21:50:36.972746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerDied","Data":"a5796140ce3930d7b94fb683cb19c12cc003c704cf8e67532f634e9d756889c5"} Feb 19 21:50:37 crc kubenswrapper[4771]: E0219 21:50:37.258088 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:37 crc kubenswrapper[4771]: E0219 21:50:37.260155 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:37 crc kubenswrapper[4771]: E0219 21:50:37.262074 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:37 crc kubenswrapper[4771]: E0219 21:50:37.262116 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" containerName="nova-cell0-conductor-conductor" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.398206 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.548292 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-sg-core-conf-yaml\") pod \"425ee156-5d76-44b9-be17-19470181ee6e\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.548337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-scripts\") pod \"425ee156-5d76-44b9-be17-19470181ee6e\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.548411 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlhj4\" (UniqueName: \"kubernetes.io/projected/425ee156-5d76-44b9-be17-19470181ee6e-kube-api-access-nlhj4\") pod \"425ee156-5d76-44b9-be17-19470181ee6e\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.548456 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-config-data\") pod \"425ee156-5d76-44b9-be17-19470181ee6e\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.548489 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-log-httpd\") pod \"425ee156-5d76-44b9-be17-19470181ee6e\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.548551 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-combined-ca-bundle\") pod \"425ee156-5d76-44b9-be17-19470181ee6e\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.548651 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-run-httpd\") pod \"425ee156-5d76-44b9-be17-19470181ee6e\" (UID: \"425ee156-5d76-44b9-be17-19470181ee6e\") " Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.549079 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "425ee156-5d76-44b9-be17-19470181ee6e" (UID: "425ee156-5d76-44b9-be17-19470181ee6e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.549102 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "425ee156-5d76-44b9-be17-19470181ee6e" (UID: "425ee156-5d76-44b9-be17-19470181ee6e"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.549319 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.549335 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425ee156-5d76-44b9-be17-19470181ee6e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.553581 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-scripts" (OuterVolumeSpecName: "scripts") pod "425ee156-5d76-44b9-be17-19470181ee6e" (UID: "425ee156-5d76-44b9-be17-19470181ee6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.559434 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425ee156-5d76-44b9-be17-19470181ee6e-kube-api-access-nlhj4" (OuterVolumeSpecName: "kube-api-access-nlhj4") pod "425ee156-5d76-44b9-be17-19470181ee6e" (UID: "425ee156-5d76-44b9-be17-19470181ee6e"). InnerVolumeSpecName "kube-api-access-nlhj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.585225 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "425ee156-5d76-44b9-be17-19470181ee6e" (UID: "425ee156-5d76-44b9-be17-19470181ee6e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.614720 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "425ee156-5d76-44b9-be17-19470181ee6e" (UID: "425ee156-5d76-44b9-be17-19470181ee6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.649314 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-config-data" (OuterVolumeSpecName: "config-data") pod "425ee156-5d76-44b9-be17-19470181ee6e" (UID: "425ee156-5d76-44b9-be17-19470181ee6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.650781 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.650815 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.650833 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlhj4\" (UniqueName: \"kubernetes.io/projected/425ee156-5d76-44b9-be17-19470181ee6e-kube-api-access-nlhj4\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.650885 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 
21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.650902 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425ee156-5d76-44b9-be17-19470181ee6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.990957 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.990850 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425ee156-5d76-44b9-be17-19470181ee6e","Type":"ContainerDied","Data":"fcdf3e51cf007b9b682471f5cc2e29000ecada729be903bfdd691acd7fb697c2"} Feb 19 21:50:37 crc kubenswrapper[4771]: I0219 21:50:37.991523 4771 scope.go:117] "RemoveContainer" containerID="fa19b0f7844363da2985f827a1a7fb435bcfc32e30dc3debb4d0f287ad46a165" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.032936 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.036420 4771 scope.go:117] "RemoveContainer" containerID="61e8ff3a16ec07649a86fd400b5feb97eb45b4f9d4b821ff90889910db873526" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.044716 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.067275 4771 scope.go:117] "RemoveContainer" containerID="90910b92ad5d25595ba949dfa25fee5c2d747bc5169c6de1b10663990fa5d88d" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.085278 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:38 crc kubenswrapper[4771]: E0219 21:50:38.085758 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="ceilometer-central-agent" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.085778 4771 
state_mem.go:107] "Deleted CPUSet assignment" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="ceilometer-central-agent" Feb 19 21:50:38 crc kubenswrapper[4771]: E0219 21:50:38.085800 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="proxy-httpd" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.085809 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="proxy-httpd" Feb 19 21:50:38 crc kubenswrapper[4771]: E0219 21:50:38.085826 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="sg-core" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.085835 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="sg-core" Feb 19 21:50:38 crc kubenswrapper[4771]: E0219 21:50:38.085850 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="ceilometer-notification-agent" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.085859 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="ceilometer-notification-agent" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.086227 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="proxy-httpd" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.086252 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="ceilometer-notification-agent" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.086267 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="sg-core" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 
21:50:38.086308 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="425ee156-5d76-44b9-be17-19470181ee6e" containerName="ceilometer-central-agent" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.088201 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.105260 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.105484 4771 scope.go:117] "RemoveContainer" containerID="a5796140ce3930d7b94fb683cb19c12cc003c704cf8e67532f634e9d756889c5" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.106693 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.106887 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.160186 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-log-httpd\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.160229 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48xjc\" (UniqueName: \"kubernetes.io/projected/23c80346-4fcf-4c8c-b674-85b242187b94-kube-api-access-48xjc\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.160299 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-scripts\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.160355 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-config-data\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.160402 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.160450 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-run-httpd\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.160489 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.262202 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-log-httpd\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " 
pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.262238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48xjc\" (UniqueName: \"kubernetes.io/projected/23c80346-4fcf-4c8c-b674-85b242187b94-kube-api-access-48xjc\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.262280 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-scripts\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.262329 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-config-data\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.262356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.262391 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-run-httpd\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.262416 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.263367 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-log-httpd\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.263636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-run-httpd\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.267550 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-scripts\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.274707 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-config-data\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.275263 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.280537 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.280891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48xjc\" (UniqueName: \"kubernetes.io/projected/23c80346-4fcf-4c8c-b674-85b242187b94-kube-api-access-48xjc\") pod \"ceilometer-0\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.437839 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.451859 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425ee156-5d76-44b9-be17-19470181ee6e" path="/var/lib/kubelet/pods/425ee156-5d76-44b9-be17-19470181ee6e/volumes" Feb 19 21:50:38 crc kubenswrapper[4771]: I0219 21:50:38.927621 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:39 crc kubenswrapper[4771]: I0219 21:50:39.002514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerStarted","Data":"32cd8454dbdc6d53698456f364ba5185271ba48e9ad9c4928df989dcc3ef13e4"} Feb 19 21:50:40 crc kubenswrapper[4771]: I0219 21:50:40.017422 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerStarted","Data":"28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9"} Feb 19 21:50:41 crc kubenswrapper[4771]: I0219 21:50:41.030931 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerStarted","Data":"56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4"} Feb 19 21:50:42 crc kubenswrapper[4771]: I0219 21:50:42.045315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerStarted","Data":"1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669"} Feb 19 21:50:42 crc kubenswrapper[4771]: E0219 21:50:42.256092 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:42 crc kubenswrapper[4771]: E0219 21:50:42.259233 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:42 crc kubenswrapper[4771]: E0219 21:50:42.261247 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:42 crc kubenswrapper[4771]: E0219 21:50:42.261305 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" 
containerName="nova-cell0-conductor-conductor" Feb 19 21:50:43 crc kubenswrapper[4771]: I0219 21:50:43.058831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerStarted","Data":"457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb"} Feb 19 21:50:43 crc kubenswrapper[4771]: I0219 21:50:43.059249 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:50:43 crc kubenswrapper[4771]: I0219 21:50:43.089830 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.307189641 podStartE2EDuration="5.089810987s" podCreationTimestamp="2026-02-19 21:50:38 +0000 UTC" firstStartedPulling="2026-02-19 21:50:38.93318883 +0000 UTC m=+1339.204631310" lastFinishedPulling="2026-02-19 21:50:42.715810146 +0000 UTC m=+1342.987252656" observedRunningTime="2026-02-19 21:50:43.081422593 +0000 UTC m=+1343.352865093" watchObservedRunningTime="2026-02-19 21:50:43.089810987 +0000 UTC m=+1343.361253457" Feb 19 21:50:47 crc kubenswrapper[4771]: E0219 21:50:47.255758 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:47 crc kubenswrapper[4771]: E0219 21:50:47.264727 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:47 crc kubenswrapper[4771]: E0219 21:50:47.266917 4771 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:47 crc kubenswrapper[4771]: E0219 21:50:47.267007 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" containerName="nova-cell0-conductor-conductor" Feb 19 21:50:52 crc kubenswrapper[4771]: E0219 21:50:52.255606 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:52 crc kubenswrapper[4771]: E0219 21:50:52.258510 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:52 crc kubenswrapper[4771]: E0219 21:50:52.260474 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:52 crc kubenswrapper[4771]: E0219 21:50:52.260582 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" containerName="nova-cell0-conductor-conductor" Feb 19 21:50:57 crc kubenswrapper[4771]: E0219 21:50:57.256474 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:57 crc kubenswrapper[4771]: E0219 21:50:57.258489 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:57 crc kubenswrapper[4771]: E0219 21:50:57.259774 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:57 crc kubenswrapper[4771]: E0219 21:50:57.259824 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" containerName="nova-cell0-conductor-conductor" Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.271804 4771 generic.go:334] "Generic (PLEG): container finished" podID="a3597602-2a6d-4a26-87d7-77696a25cef5" 
containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" exitCode=137 Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.271932 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3597602-2a6d-4a26-87d7-77696a25cef5","Type":"ContainerDied","Data":"894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423"} Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.272600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3597602-2a6d-4a26-87d7-77696a25cef5","Type":"ContainerDied","Data":"2ca0ecb5780daad37971ca8c53acf54d14efed1c7a0a304c934b2054c8dbeccb"} Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.272683 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca0ecb5780daad37971ca8c53acf54d14efed1c7a0a304c934b2054c8dbeccb" Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.351728 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.498417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-combined-ca-bundle\") pod \"a3597602-2a6d-4a26-87d7-77696a25cef5\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.498543 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-config-data\") pod \"a3597602-2a6d-4a26-87d7-77696a25cef5\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.498661 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57vfd\" (UniqueName: \"kubernetes.io/projected/a3597602-2a6d-4a26-87d7-77696a25cef5-kube-api-access-57vfd\") pod \"a3597602-2a6d-4a26-87d7-77696a25cef5\" (UID: \"a3597602-2a6d-4a26-87d7-77696a25cef5\") " Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.518169 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3597602-2a6d-4a26-87d7-77696a25cef5-kube-api-access-57vfd" (OuterVolumeSpecName: "kube-api-access-57vfd") pod "a3597602-2a6d-4a26-87d7-77696a25cef5" (UID: "a3597602-2a6d-4a26-87d7-77696a25cef5"). InnerVolumeSpecName "kube-api-access-57vfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.556892 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3597602-2a6d-4a26-87d7-77696a25cef5" (UID: "a3597602-2a6d-4a26-87d7-77696a25cef5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.557104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-config-data" (OuterVolumeSpecName: "config-data") pod "a3597602-2a6d-4a26-87d7-77696a25cef5" (UID: "a3597602-2a6d-4a26-87d7-77696a25cef5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.601610 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.601654 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3597602-2a6d-4a26-87d7-77696a25cef5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:00 crc kubenswrapper[4771]: I0219 21:51:00.601665 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57vfd\" (UniqueName: \"kubernetes.io/projected/a3597602-2a6d-4a26-87d7-77696a25cef5-kube-api-access-57vfd\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.285692 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.350722 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.371344 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.384562 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:51:01 crc kubenswrapper[4771]: E0219 21:51:01.385286 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" containerName="nova-cell0-conductor-conductor" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.385362 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" containerName="nova-cell0-conductor-conductor" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.385818 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" containerName="nova-cell0-conductor-conductor" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.387148 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.390563 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5thxh" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.392717 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.394165 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.536563 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.536646 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnwjq\" (UniqueName: \"kubernetes.io/projected/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-kube-api-access-lnwjq\") pod \"nova-cell0-conductor-0\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.536776 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.638704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.638818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnwjq\" (UniqueName: \"kubernetes.io/projected/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-kube-api-access-lnwjq\") pod \"nova-cell0-conductor-0\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.638873 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.644855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.646302 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.659844 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnwjq\" (UniqueName: \"kubernetes.io/projected/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-kube-api-access-lnwjq\") pod \"nova-cell0-conductor-0\" (UID: 
\"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:01 crc kubenswrapper[4771]: I0219 21:51:01.723348 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:02 crc kubenswrapper[4771]: I0219 21:51:02.230356 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:51:02 crc kubenswrapper[4771]: W0219 21:51:02.239856 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4938b4a1_32f8_4e7a_b334_8ce3b649fb46.slice/crio-2ac11aa5ae2dfb868022f8538bffff387954bd82a4f1bd15cbccd53efc9ff707 WatchSource:0}: Error finding container 2ac11aa5ae2dfb868022f8538bffff387954bd82a4f1bd15cbccd53efc9ff707: Status 404 returned error can't find the container with id 2ac11aa5ae2dfb868022f8538bffff387954bd82a4f1bd15cbccd53efc9ff707 Feb 19 21:51:02 crc kubenswrapper[4771]: I0219 21:51:02.296832 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4938b4a1-32f8-4e7a-b334-8ce3b649fb46","Type":"ContainerStarted","Data":"2ac11aa5ae2dfb868022f8538bffff387954bd82a4f1bd15cbccd53efc9ff707"} Feb 19 21:51:02 crc kubenswrapper[4771]: I0219 21:51:02.455236 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3597602-2a6d-4a26-87d7-77696a25cef5" path="/var/lib/kubelet/pods/a3597602-2a6d-4a26-87d7-77696a25cef5/volumes" Feb 19 21:51:03 crc kubenswrapper[4771]: I0219 21:51:03.314560 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4938b4a1-32f8-4e7a-b334-8ce3b649fb46","Type":"ContainerStarted","Data":"8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc"} Feb 19 21:51:03 crc kubenswrapper[4771]: I0219 21:51:03.314824 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" 
Feb 19 21:51:08 crc kubenswrapper[4771]: I0219 21:51:08.455731 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 21:51:08 crc kubenswrapper[4771]: I0219 21:51:08.528389 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=7.528353183 podStartE2EDuration="7.528353183s" podCreationTimestamp="2026-02-19 21:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:03.339725477 +0000 UTC m=+1363.611167987" watchObservedRunningTime="2026-02-19 21:51:08.528353183 +0000 UTC m=+1368.799795733" Feb 19 21:51:11 crc kubenswrapper[4771]: I0219 21:51:11.757940 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.487703 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-25p7k"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.488774 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.492253 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.494495 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.499545 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-25p7k"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.572443 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwdxv\" (UniqueName: \"kubernetes.io/projected/76514a37-a198-4afe-b306-46a87ed1f21f-kube-api-access-fwdxv\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.572798 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-scripts\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.572885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-config-data\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.572952 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.679043 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-config-data\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.679137 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.679173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwdxv\" (UniqueName: \"kubernetes.io/projected/76514a37-a198-4afe-b306-46a87ed1f21f-kube-api-access-fwdxv\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.679269 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-scripts\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.689216 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.690983 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.692263 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-config-data\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.694701 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.698625 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-scripts\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.701555 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.705562 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.716264 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.717798 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.722920 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.750282 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.775652 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwdxv\" (UniqueName: \"kubernetes.io/projected/76514a37-a198-4afe-b306-46a87ed1f21f-kube-api-access-fwdxv\") pod \"nova-cell0-cell-mapping-25p7k\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.781731 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.782911 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.784234 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-logs\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.784273 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-config-data\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.784294 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.784331 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-config-data\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.784353 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.784396 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j9r9\" (UniqueName: \"kubernetes.io/projected/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-kube-api-access-6j9r9\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.784415 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5w9w\" (UniqueName: \"kubernetes.io/projected/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-kube-api-access-v5w9w\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.784462 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-logs\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.785503 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.804718 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.825810 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.826289 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-qvsck"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.827703 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.845343 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-qvsck"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.897888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.897936 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-logs\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.897960 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmf97\" (UniqueName: \"kubernetes.io/projected/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-kube-api-access-wmf97\") pod \"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.897983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6bcd\" (UniqueName: \"kubernetes.io/projected/49e59cf4-4766-4548-9eb8-cd04f34153ef-kube-api-access-s6bcd\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898005 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-logs\") 
pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898041 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898082 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-config\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-config-data\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898121 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " 
pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898140 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-config-data\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898206 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-svc\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898256 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:12 crc kubenswrapper[4771]: 
I0219 21:51:12.898291 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j9r9\" (UniqueName: \"kubernetes.io/projected/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-kube-api-access-6j9r9\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898310 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5w9w\" (UniqueName: \"kubernetes.io/projected/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-kube-api-access-v5w9w\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.898903 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-logs\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.899192 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-logs\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.909664 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-config-data\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.910834 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.911501 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.936369 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.937512 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.941205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-config-data\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.941468 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.948294 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.953535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5w9w\" (UniqueName: \"kubernetes.io/projected/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-kube-api-access-v5w9w\") pod \"nova-metadata-0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") " pod="openstack/nova-metadata-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.957098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j9r9\" (UniqueName: 
\"kubernetes.io/projected/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-kube-api-access-6j9r9\") pod \"nova-api-0\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") " pod="openstack/nova-api-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.999820 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-svc\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.999888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2xsw\" (UniqueName: \"kubernetes.io/projected/23c35c42-227c-4e14-bf15-0d931a9442ae-kube-api-access-b2xsw\") pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:12 crc kubenswrapper[4771]: I0219 21:51:12.999917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:12.999991 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.000034 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmf97\" (UniqueName: \"kubernetes.io/projected/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-kube-api-access-wmf97\") pod 
\"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.000053 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6bcd\" (UniqueName: \"kubernetes.io/projected/49e59cf4-4766-4548-9eb8-cd04f34153ef-kube-api-access-s6bcd\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.000086 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.000110 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.000127 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.000148 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-config\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: 
\"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.000173 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.000211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.001127 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.001878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-svc\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.002701 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 
19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.003266 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.003328 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-config\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.005483 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.008912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.027730 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6bcd\" (UniqueName: \"kubernetes.io/projected/49e59cf4-4766-4548-9eb8-cd04f34153ef-kube-api-access-s6bcd\") pod \"dnsmasq-dns-6bc699f5c5-qvsck\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.041106 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wmf97\" (UniqueName: \"kubernetes.io/projected/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-kube-api-access-wmf97\") pod \"nova-scheduler-0\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") " pod="openstack/nova-scheduler-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.101384 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.101460 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.101547 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2xsw\" (UniqueName: \"kubernetes.io/projected/23c35c42-227c-4e14-bf15-0d931a9442ae-kube-api-access-b2xsw\") pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.105147 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.105685 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-combined-ca-bundle\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.117707 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2xsw\" (UniqueName: \"kubernetes.io/projected/23c35c42-227c-4e14-bf15-0d931a9442ae-kube-api-access-b2xsw\") pod \"nova-cell1-novncproxy-0\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.154012 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.181356 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.278307 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.296402 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.338583 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.363326 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-25p7k"] Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.442970 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-25p7k" event={"ID":"76514a37-a198-4afe-b306-46a87ed1f21f","Type":"ContainerStarted","Data":"191d181ee90c69088fb3e7bb731e6dae968ce81340f50aa7fa68d8aa84caacfa"} Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.598823 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jpvrl"] Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.600475 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jpvrl" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.605765 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.607451 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.619354 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.635284 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jpvrl"] Feb 19 21:51:13 crc kubenswrapper[4771]: W0219 21:51:13.715656 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e5074c0_269d_4ef0_851b_a6d9e51d4de0.slice/crio-0b376f9aa9b94e7fbb9d8b19e5f8bbfac59de17fa8d48d01474a59400c5c9731 WatchSource:0}: Error finding container 0b376f9aa9b94e7fbb9d8b19e5f8bbfac59de17fa8d48d01474a59400c5c9731: Status 404 
returned error can't find the container with id 0b376f9aa9b94e7fbb9d8b19e5f8bbfac59de17fa8d48d01474a59400c5c9731
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.717471    4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.719278    4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rcml\" (UniqueName: \"kubernetes.io/projected/fc115457-bb0d-4eb1-accb-43b6c88c79ef-kube-api-access-6rcml\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.719396    4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-scripts\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.719439    4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-config-data\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.719462    4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.806220    4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:51:13 crc kubenswrapper[4771]: W0219 21:51:13.812077    4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07f26ae3_5e1a_40fa_a9a6_01d0338a0b2e.slice/crio-454b4aef9e4de165a98937aa7e594688925c802e318ea7967ba8687b6b54c8c2 WatchSource:0}: Error finding container 454b4aef9e4de165a98937aa7e594688925c802e318ea7967ba8687b6b54c8c2: Status 404 returned error can't find the container with id 454b4aef9e4de165a98937aa7e594688925c802e318ea7967ba8687b6b54c8c2
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.835529    4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-scripts\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.835608    4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-config-data\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.835649    4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.835762    4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rcml\" (UniqueName: \"kubernetes.io/projected/fc115457-bb0d-4eb1-accb-43b6c88c79ef-kube-api-access-6rcml\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.843510    4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.843599    4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-config-data\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.843905    4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-scripts\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.852446    4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rcml\" (UniqueName: \"kubernetes.io/projected/fc115457-bb0d-4eb1-accb-43b6c88c79ef-kube-api-access-6rcml\") pod \"nova-cell1-conductor-db-sync-jpvrl\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.917010    4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-qvsck"]
Feb 19 21:51:13 crc kubenswrapper[4771]: W0219 21:51:13.924414    4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49e59cf4_4766_4548_9eb8_cd04f34153ef.slice/crio-a1b747e41cc4c2ec78d1fc260f26a8d2f0eaf6d59caa4bb75df6c73eacc0b00c WatchSource:0}: Error finding container a1b747e41cc4c2ec78d1fc260f26a8d2f0eaf6d59caa4bb75df6c73eacc0b00c: Status 404 returned error can't find the container with id a1b747e41cc4c2ec78d1fc260f26a8d2f0eaf6d59caa4bb75df6c73eacc0b00c
Feb 19 21:51:13 crc kubenswrapper[4771]: I0219 21:51:13.936792    4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jpvrl"
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.076398    4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.404209    4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jpvrl"]
Feb 19 21:51:14 crc kubenswrapper[4771]: W0219 21:51:14.408392    4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc115457_bb0d_4eb1_accb_43b6c88c79ef.slice/crio-6d0a265a4baba5eb280e3c2a85602bad47b986432acfb835c4a4bb635f36bc1f WatchSource:0}: Error finding container 6d0a265a4baba5eb280e3c2a85602bad47b986432acfb835c4a4bb635f36bc1f: Status 404 returned error can't find the container with id 6d0a265a4baba5eb280e3c2a85602bad47b986432acfb835c4a4bb635f36bc1f
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.454040    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e","Type":"ContainerStarted","Data":"454b4aef9e4de165a98937aa7e594688925c802e318ea7967ba8687b6b54c8c2"}
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.463706    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-25p7k" event={"ID":"76514a37-a198-4afe-b306-46a87ed1f21f","Type":"ContainerStarted","Data":"b8ca1cd5e8fb62c9444684cc61027938add1231dfcf2c180029a45a58ab67818"}
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.466206    4771 generic.go:334] "Generic (PLEG): container finished" podID="49e59cf4-4766-4548-9eb8-cd04f34153ef" containerID="24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7" exitCode=0
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.466252    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" event={"ID":"49e59cf4-4766-4548-9eb8-cd04f34153ef","Type":"ContainerDied","Data":"24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7"}
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.466269    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" event={"ID":"49e59cf4-4766-4548-9eb8-cd04f34153ef","Type":"ContainerStarted","Data":"a1b747e41cc4c2ec78d1fc260f26a8d2f0eaf6d59caa4bb75df6c73eacc0b00c"}
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.468462    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e5074c0-269d-4ef0-851b-a6d9e51d4de0","Type":"ContainerStarted","Data":"0b376f9aa9b94e7fbb9d8b19e5f8bbfac59de17fa8d48d01474a59400c5c9731"}
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.471189    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"23c35c42-227c-4e14-bf15-0d931a9442ae","Type":"ContainerStarted","Data":"9961ef09a452078269baae0a3fed5c6cb07026b7058fae2a87a16241f5031c31"}
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.472366    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jpvrl" event={"ID":"fc115457-bb0d-4eb1-accb-43b6c88c79ef","Type":"ContainerStarted","Data":"6d0a265a4baba5eb280e3c2a85602bad47b986432acfb835c4a4bb635f36bc1f"}
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.473571    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4eb7e7d0-7664-4858-af7c-04377b1ea3fe","Type":"ContainerStarted","Data":"5c1558267285c0c2364f0082cd49953062c0399f9359ef04b607fbc45d66444a"}
Feb 19 21:51:14 crc kubenswrapper[4771]: I0219 21:51:14.487706    4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-25p7k" podStartSLOduration=2.487692472 podStartE2EDuration="2.487692472s" podCreationTimestamp="2026-02-19 21:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:14.481087605 +0000 UTC m=+1374.752530075" watchObservedRunningTime="2026-02-19 21:51:14.487692472 +0000 UTC m=+1374.759134942"
Feb 19 21:51:15 crc kubenswrapper[4771]: I0219 21:51:15.485271    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jpvrl" event={"ID":"fc115457-bb0d-4eb1-accb-43b6c88c79ef","Type":"ContainerStarted","Data":"572dcb283fd1c97c3eac3d2d950a2cb097579be9167f01910d217a90b523c5b4"}
Feb 19 21:51:15 crc kubenswrapper[4771]: I0219 21:51:15.495423    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" event={"ID":"49e59cf4-4766-4548-9eb8-cd04f34153ef","Type":"ContainerStarted","Data":"02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de"}
Feb 19 21:51:15 crc kubenswrapper[4771]: I0219 21:51:15.495462    4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck"
Feb 19 21:51:15 crc kubenswrapper[4771]: I0219 21:51:15.510697    4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jpvrl" podStartSLOduration=2.5106776010000003 podStartE2EDuration="2.510677601s" podCreationTimestamp="2026-02-19 21:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:15.503817977 +0000 UTC m=+1375.775260447" watchObservedRunningTime="2026-02-19 21:51:15.510677601 +0000 UTC m=+1375.782120071"
Feb 19 21:51:15 crc kubenswrapper[4771]: I0219 21:51:15.531480    4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" podStartSLOduration=3.531457366 podStartE2EDuration="3.531457366s" podCreationTimestamp="2026-02-19 21:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:15.52411262 +0000 UTC m=+1375.795555100" watchObservedRunningTime="2026-02-19 21:51:15.531457366 +0000 UTC m=+1375.802899836"
Feb 19 21:51:16 crc kubenswrapper[4771]: I0219 21:51:16.718882    4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:51:16 crc kubenswrapper[4771]: I0219 21:51:16.742917    4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:51:16 crc kubenswrapper[4771]: I0219 21:51:16.754978    4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:51:16 crc kubenswrapper[4771]: I0219 21:51:16.755755    4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b8ee33e7-850d-465e-9add-481bfcbdd6b4" containerName="kube-state-metrics" containerID="cri-o://902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968" gracePeriod=30
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.118251    4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.207591    4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb96l\" (UniqueName: \"kubernetes.io/projected/b8ee33e7-850d-465e-9add-481bfcbdd6b4-kube-api-access-lb96l\") pod \"b8ee33e7-850d-465e-9add-481bfcbdd6b4\" (UID: \"b8ee33e7-850d-465e-9add-481bfcbdd6b4\") "
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.212938    4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ee33e7-850d-465e-9add-481bfcbdd6b4-kube-api-access-lb96l" (OuterVolumeSpecName: "kube-api-access-lb96l") pod "b8ee33e7-850d-465e-9add-481bfcbdd6b4" (UID: "b8ee33e7-850d-465e-9add-481bfcbdd6b4"). InnerVolumeSpecName "kube-api-access-lb96l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.312508    4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb96l\" (UniqueName: \"kubernetes.io/projected/b8ee33e7-850d-465e-9add-481bfcbdd6b4-kube-api-access-lb96l\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.511799    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"23c35c42-227c-4e14-bf15-0d931a9442ae","Type":"ContainerStarted","Data":"d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245"}
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.514745    4771 generic.go:334] "Generic (PLEG): container finished" podID="b8ee33e7-850d-465e-9add-481bfcbdd6b4" containerID="902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968" exitCode=2
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.514810    4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.514855    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b8ee33e7-850d-465e-9add-481bfcbdd6b4","Type":"ContainerDied","Data":"902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968"}
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.514941    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b8ee33e7-850d-465e-9add-481bfcbdd6b4","Type":"ContainerDied","Data":"2192e5a5bc14fa5d1baeecb5533857cd4e28b57a9357bf67150f8409c7bb91ea"}
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.514969    4771 scope.go:117] "RemoveContainer" containerID="902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.529072    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4eb7e7d0-7664-4858-af7c-04377b1ea3fe","Type":"ContainerStarted","Data":"920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013"}
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.529134    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4eb7e7d0-7664-4858-af7c-04377b1ea3fe","Type":"ContainerStarted","Data":"bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c"}
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.531141    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e","Type":"ContainerStarted","Data":"97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9"}
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.534007    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e5074c0-269d-4ef0-851b-a6d9e51d4de0","Type":"ContainerStarted","Data":"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b"}
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.534059    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e5074c0-269d-4ef0-851b-a6d9e51d4de0","Type":"ContainerStarted","Data":"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d"}
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.534145    4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerName="nova-metadata-log" containerID="cri-o://04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d" gracePeriod=30
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.534327    4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerName="nova-metadata-metadata" containerID="cri-o://8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b" gracePeriod=30
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.537949    4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.42007568 podStartE2EDuration="5.5379298s" podCreationTimestamp="2026-02-19 21:51:12 +0000 UTC" firstStartedPulling="2026-02-19 21:51:14.094937139 +0000 UTC m=+1374.366379609" lastFinishedPulling="2026-02-19 21:51:16.212791259 +0000 UTC m=+1376.484233729" observedRunningTime="2026-02-19 21:51:17.532859014 +0000 UTC m=+1377.804301494" watchObservedRunningTime="2026-02-19 21:51:17.5379298 +0000 UTC m=+1377.809372270"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.557773    4771 scope.go:117] "RemoveContainer" containerID="902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968"
Feb 19 21:51:17 crc kubenswrapper[4771]: E0219 21:51:17.558445    4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968\": container with ID starting with 902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968 not found: ID does not exist" containerID="902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.558489    4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968"} err="failed to get container status \"902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968\": rpc error: code = NotFound desc = could not find container \"902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968\": container with ID starting with 902d154019808b1c3e8b563e276cb0674d5f1fdc5465e590ebe17707ebff9968 not found: ID does not exist"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.560232    4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.570409    4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.580558    4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:51:17 crc kubenswrapper[4771]: E0219 21:51:17.580904    4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ee33e7-850d-465e-9add-481bfcbdd6b4" containerName="kube-state-metrics"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.580920    4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ee33e7-850d-465e-9add-481bfcbdd6b4" containerName="kube-state-metrics"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.581117    4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ee33e7-850d-465e-9add-481bfcbdd6b4" containerName="kube-state-metrics"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.581684    4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.590482    4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.590536    4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.592787    4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.194190005 podStartE2EDuration="5.592772895s" podCreationTimestamp="2026-02-19 21:51:12 +0000 UTC" firstStartedPulling="2026-02-19 21:51:13.813748186 +0000 UTC m=+1374.085190656" lastFinishedPulling="2026-02-19 21:51:16.212331076 +0000 UTC m=+1376.483773546" observedRunningTime="2026-02-19 21:51:17.57010804 +0000 UTC m=+1377.841550540" watchObservedRunningTime="2026-02-19 21:51:17.592772895 +0000 UTC m=+1377.864215365"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.607838    4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.634483    4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.033019109 podStartE2EDuration="5.634463729s" podCreationTimestamp="2026-02-19 21:51:12 +0000 UTC" firstStartedPulling="2026-02-19 21:51:13.616807165 +0000 UTC m=+1373.888249635" lastFinishedPulling="2026-02-19 21:51:16.218251765 +0000 UTC m=+1376.489694255" observedRunningTime="2026-02-19 21:51:17.620474335 +0000 UTC m=+1377.891916825" watchObservedRunningTime="2026-02-19 21:51:17.634463729 +0000 UTC m=+1377.905906199"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.654402    4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.159471678 podStartE2EDuration="5.654382771s" podCreationTimestamp="2026-02-19 21:51:12 +0000 UTC" firstStartedPulling="2026-02-19 21:51:13.717855965 +0000 UTC m=+1373.989298435" lastFinishedPulling="2026-02-19 21:51:16.212767058 +0000 UTC m=+1376.484209528" observedRunningTime="2026-02-19 21:51:17.652002568 +0000 UTC m=+1377.923445048" watchObservedRunningTime="2026-02-19 21:51:17.654382771 +0000 UTC m=+1377.925825241"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.725320    4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.725525    4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.725654    4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.725698    4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxmcw\" (UniqueName: \"kubernetes.io/projected/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-api-access-jxmcw\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.827304    4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.827380    4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.827407    4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxmcw\" (UniqueName: \"kubernetes.io/projected/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-api-access-jxmcw\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.827451    4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.832837    4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.833430    4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.833918    4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.846572    4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxmcw\" (UniqueName: \"kubernetes.io/projected/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-api-access-jxmcw\") pod \"kube-state-metrics-0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " pod="openstack/kube-state-metrics-0"
Feb 19 21:51:17 crc kubenswrapper[4771]: I0219 21:51:17.904533    4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.182104    4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.182157    4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.279232    4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.321763    4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:51:18 crc kubenswrapper[4771]: W0219 21:51:18.325881    4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9c230c6_1af8_440e_806d_b3b1e98544c0.slice/crio-b73186e1941ca07d3ecc1fb16194d53e4a327da4e24c68bbbb6283d86aef3fa3 WatchSource:0}: Error finding container b73186e1941ca07d3ecc1fb16194d53e4a327da4e24c68bbbb6283d86aef3fa3: Status 404 returned error can't find the container with id b73186e1941ca07d3ecc1fb16194d53e4a327da4e24c68bbbb6283d86aef3fa3
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.340847    4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.414139    4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.469245    4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ee33e7-850d-465e-9add-481bfcbdd6b4" path="/var/lib/kubelet/pods/b8ee33e7-850d-465e-9add-481bfcbdd6b4/volumes"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.537438    4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-config-data\") pod \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") "
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.537494    4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5w9w\" (UniqueName: \"kubernetes.io/projected/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-kube-api-access-v5w9w\") pod \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") "
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.537536    4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-logs\") pod \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") "
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.537628    4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-combined-ca-bundle\") pod \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\" (UID: \"2e5074c0-269d-4ef0-851b-a6d9e51d4de0\") "
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.538079    4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-logs" (OuterVolumeSpecName: "logs") pod "2e5074c0-269d-4ef0-851b-a6d9e51d4de0" (UID: "2e5074c0-269d-4ef0-851b-a6d9e51d4de0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.538936    4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.549283    4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-kube-api-access-v5w9w" (OuterVolumeSpecName: "kube-api-access-v5w9w") pod "2e5074c0-269d-4ef0-851b-a6d9e51d4de0" (UID: "2e5074c0-269d-4ef0-851b-a6d9e51d4de0"). InnerVolumeSpecName "kube-api-access-v5w9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.553220    4771 generic.go:334] "Generic (PLEG): container finished" podID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerID="8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b" exitCode=0
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.553261    4771 generic.go:334] "Generic (PLEG): container finished" podID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerID="04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d" exitCode=143
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.553288    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e5074c0-269d-4ef0-851b-a6d9e51d4de0","Type":"ContainerDied","Data":"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b"}
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.553306    4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.553331    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e5074c0-269d-4ef0-851b-a6d9e51d4de0","Type":"ContainerDied","Data":"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d"}
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.553388    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e5074c0-269d-4ef0-851b-a6d9e51d4de0","Type":"ContainerDied","Data":"0b376f9aa9b94e7fbb9d8b19e5f8bbfac59de17fa8d48d01474a59400c5c9731"}
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.553404    4771 scope.go:117] "RemoveContainer" containerID="8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.556634    4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a9c230c6-1af8-440e-806d-b3b1e98544c0","Type":"ContainerStarted","Data":"b73186e1941ca07d3ecc1fb16194d53e4a327da4e24c68bbbb6283d86aef3fa3"}
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.557240    4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="23c35c42-227c-4e14-bf15-0d931a9442ae" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245" gracePeriod=30
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.566358    4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-config-data" (OuterVolumeSpecName: "config-data") pod "2e5074c0-269d-4ef0-851b-a6d9e51d4de0" (UID: "2e5074c0-269d-4ef0-851b-a6d9e51d4de0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.574647    4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e5074c0-269d-4ef0-851b-a6d9e51d4de0" (UID: "2e5074c0-269d-4ef0-851b-a6d9e51d4de0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.640716    4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.640895    4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5w9w\" (UniqueName: \"kubernetes.io/projected/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-kube-api-access-v5w9w\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.640984    4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5074c0-269d-4ef0-851b-a6d9e51d4de0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.660146    4771 scope.go:117] "RemoveContainer" containerID="04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.680084    4771 scope.go:117] "RemoveContainer" containerID="8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b"
Feb 19 21:51:18 crc kubenswrapper[4771]: E0219 21:51:18.680701    4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b\": container with ID starting with 8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b not found: ID does not exist" containerID="8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.680751    4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b"} err="failed to get container status \"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b\": rpc error: code = NotFound desc = could not find container \"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b\": container with ID starting with 8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b not found: ID does not exist"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.680780    4771 scope.go:117] "RemoveContainer" containerID="04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d"
Feb 19 21:51:18 crc kubenswrapper[4771]: E0219 21:51:18.681620    4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d\": container with ID starting with 04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d not found: ID does not exist" containerID="04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d"
Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.681661    4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d"} err="failed to get container status \"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d\": rpc error: code = NotFound desc = could not find container \"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d\": container with ID starting with 04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d not found: ID does not
exist" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.681688 4771 scope.go:117] "RemoveContainer" containerID="8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.681943 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b"} err="failed to get container status \"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b\": rpc error: code = NotFound desc = could not find container \"8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b\": container with ID starting with 8ac554c7235709dbbcb98d261c40aaf97e1066a28059c0c25fc2328f6d1b867b not found: ID does not exist" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.681966 4771 scope.go:117] "RemoveContainer" containerID="04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.682236 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d"} err="failed to get container status \"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d\": rpc error: code = NotFound desc = could not find container \"04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d\": container with ID starting with 04f0c671d947a568426ec1e5f8b55940562fffdaba4dfad65496632f573bf96d not found: ID does not exist" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.834367 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.835746 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="ceilometer-central-agent" 
containerID="cri-o://28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9" gracePeriod=30 Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.836193 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="proxy-httpd" containerID="cri-o://457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb" gracePeriod=30 Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.836317 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="sg-core" containerID="cri-o://1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669" gracePeriod=30 Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.836409 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="ceilometer-notification-agent" containerID="cri-o://56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4" gracePeriod=30 Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.896143 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.930416 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.939225 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:18 crc kubenswrapper[4771]: E0219 21:51:18.939601 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerName="nova-metadata-log" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.939617 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerName="nova-metadata-log" Feb 
19 21:51:18 crc kubenswrapper[4771]: E0219 21:51:18.939655 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerName="nova-metadata-metadata" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.939662 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerName="nova-metadata-metadata" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.939806 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerName="nova-metadata-metadata" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.939841 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" containerName="nova-metadata-log" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.940855 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.946334 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.955740 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:51:18 crc kubenswrapper[4771]: I0219 21:51:18.956163 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.052049 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-config-data\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.052106 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxgl8\" (UniqueName: \"kubernetes.io/projected/c61c43ff-92a4-4297-87b5-67ee379cff2a-kube-api-access-fxgl8\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.052147 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.052198 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61c43ff-92a4-4297-87b5-67ee379cff2a-logs\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.052237 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.154104 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.154217 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-config-data\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.154253 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxgl8\" (UniqueName: \"kubernetes.io/projected/c61c43ff-92a4-4297-87b5-67ee379cff2a-kube-api-access-fxgl8\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.154284 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.154322 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61c43ff-92a4-4297-87b5-67ee379cff2a-logs\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.154860 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61c43ff-92a4-4297-87b5-67ee379cff2a-logs\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.159488 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc 
kubenswrapper[4771]: I0219 21:51:19.161488 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.161950 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-config-data\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.170682 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxgl8\" (UniqueName: \"kubernetes.io/projected/c61c43ff-92a4-4297-87b5-67ee379cff2a-kube-api-access-fxgl8\") pod \"nova-metadata-0\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.304595 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.572993 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a9c230c6-1af8-440e-806d-b3b1e98544c0","Type":"ContainerStarted","Data":"e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92"} Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.574097 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.578119 4771 generic.go:334] "Generic (PLEG): container finished" podID="23c80346-4fcf-4c8c-b674-85b242187b94" containerID="457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb" exitCode=0 Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.578142 4771 generic.go:334] "Generic (PLEG): container finished" podID="23c80346-4fcf-4c8c-b674-85b242187b94" containerID="1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669" exitCode=2 Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.578149 4771 generic.go:334] "Generic (PLEG): container finished" podID="23c80346-4fcf-4c8c-b674-85b242187b94" containerID="28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9" exitCode=0 Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.578179 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerDied","Data":"457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb"} Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.578196 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerDied","Data":"1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669"} Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.578207 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerDied","Data":"28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9"} Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.597090 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.208714057 podStartE2EDuration="2.597072982s" podCreationTimestamp="2026-02-19 21:51:17 +0000 UTC" firstStartedPulling="2026-02-19 21:51:18.331177933 +0000 UTC m=+1378.602620403" lastFinishedPulling="2026-02-19 21:51:18.719536858 +0000 UTC m=+1378.990979328" observedRunningTime="2026-02-19 21:51:19.589122339 +0000 UTC m=+1379.860564809" watchObservedRunningTime="2026-02-19 21:51:19.597072982 +0000 UTC m=+1379.868515452" Feb 19 21:51:19 crc kubenswrapper[4771]: I0219 21:51:19.753646 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:20 crc kubenswrapper[4771]: I0219 21:51:20.458981 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5074c0-269d-4ef0-851b-a6d9e51d4de0" path="/var/lib/kubelet/pods/2e5074c0-269d-4ef0-851b-a6d9e51d4de0/volumes" Feb 19 21:51:20 crc kubenswrapper[4771]: I0219 21:51:20.588719 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c61c43ff-92a4-4297-87b5-67ee379cff2a","Type":"ContainerStarted","Data":"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526"} Feb 19 21:51:20 crc kubenswrapper[4771]: I0219 21:51:20.588768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c61c43ff-92a4-4297-87b5-67ee379cff2a","Type":"ContainerStarted","Data":"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e"} Feb 19 21:51:20 crc kubenswrapper[4771]: I0219 21:51:20.588783 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c61c43ff-92a4-4297-87b5-67ee379cff2a","Type":"ContainerStarted","Data":"3e486eac98668ce62ec76e34344280b59a880a451c6451997048e8459e49beb9"} Feb 19 21:51:20 crc kubenswrapper[4771]: I0219 21:51:20.613145 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.613121885 podStartE2EDuration="2.613121885s" podCreationTimestamp="2026-02-19 21:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:20.605558344 +0000 UTC m=+1380.877000824" watchObservedRunningTime="2026-02-19 21:51:20.613121885 +0000 UTC m=+1380.884564365" Feb 19 21:51:21 crc kubenswrapper[4771]: I0219 21:51:21.611166 4771 generic.go:334] "Generic (PLEG): container finished" podID="76514a37-a198-4afe-b306-46a87ed1f21f" containerID="b8ca1cd5e8fb62c9444684cc61027938add1231dfcf2c180029a45a58ab67818" exitCode=0 Feb 19 21:51:21 crc kubenswrapper[4771]: I0219 21:51:21.615334 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-25p7k" event={"ID":"76514a37-a198-4afe-b306-46a87ed1f21f","Type":"ContainerDied","Data":"b8ca1cd5e8fb62c9444684cc61027938add1231dfcf2c180029a45a58ab67818"} Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.533089 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.619950 4771 generic.go:334] "Generic (PLEG): container finished" podID="fc115457-bb0d-4eb1-accb-43b6c88c79ef" containerID="572dcb283fd1c97c3eac3d2d950a2cb097579be9167f01910d217a90b523c5b4" exitCode=0 Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.620031 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jpvrl" event={"ID":"fc115457-bb0d-4eb1-accb-43b6c88c79ef","Type":"ContainerDied","Data":"572dcb283fd1c97c3eac3d2d950a2cb097579be9167f01910d217a90b523c5b4"} Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.627337 4771 generic.go:334] "Generic (PLEG): container finished" podID="23c80346-4fcf-4c8c-b674-85b242187b94" containerID="56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4" exitCode=0 Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.627394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerDied","Data":"56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4"} Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.627445 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.627480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23c80346-4fcf-4c8c-b674-85b242187b94","Type":"ContainerDied","Data":"32cd8454dbdc6d53698456f364ba5185271ba48e9ad9c4928df989dcc3ef13e4"} Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.627518 4771 scope.go:117] "RemoveContainer" containerID="457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.639861 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-config-data\") pod \"23c80346-4fcf-4c8c-b674-85b242187b94\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.639959 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-scripts\") pod \"23c80346-4fcf-4c8c-b674-85b242187b94\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.640048 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-combined-ca-bundle\") pod \"23c80346-4fcf-4c8c-b674-85b242187b94\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.640093 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48xjc\" (UniqueName: \"kubernetes.io/projected/23c80346-4fcf-4c8c-b674-85b242187b94-kube-api-access-48xjc\") pod \"23c80346-4fcf-4c8c-b674-85b242187b94\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 
21:51:22.640155 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-run-httpd\") pod \"23c80346-4fcf-4c8c-b674-85b242187b94\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.640209 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-sg-core-conf-yaml\") pod \"23c80346-4fcf-4c8c-b674-85b242187b94\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.640236 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-log-httpd\") pod \"23c80346-4fcf-4c8c-b674-85b242187b94\" (UID: \"23c80346-4fcf-4c8c-b674-85b242187b94\") " Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.641667 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "23c80346-4fcf-4c8c-b674-85b242187b94" (UID: "23c80346-4fcf-4c8c-b674-85b242187b94"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.641710 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "23c80346-4fcf-4c8c-b674-85b242187b94" (UID: "23c80346-4fcf-4c8c-b674-85b242187b94"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.652340 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-scripts" (OuterVolumeSpecName: "scripts") pod "23c80346-4fcf-4c8c-b674-85b242187b94" (UID: "23c80346-4fcf-4c8c-b674-85b242187b94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.652375 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c80346-4fcf-4c8c-b674-85b242187b94-kube-api-access-48xjc" (OuterVolumeSpecName: "kube-api-access-48xjc") pod "23c80346-4fcf-4c8c-b674-85b242187b94" (UID: "23c80346-4fcf-4c8c-b674-85b242187b94"). InnerVolumeSpecName "kube-api-access-48xjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.690790 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "23c80346-4fcf-4c8c-b674-85b242187b94" (UID: "23c80346-4fcf-4c8c-b674-85b242187b94"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.742911 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48xjc\" (UniqueName: \"kubernetes.io/projected/23c80346-4fcf-4c8c-b674-85b242187b94-kube-api-access-48xjc\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.743285 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.743366 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.743498 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c80346-4fcf-4c8c-b674-85b242187b94-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.743646 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.760172 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23c80346-4fcf-4c8c-b674-85b242187b94" (UID: "23c80346-4fcf-4c8c-b674-85b242187b94"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.768212 4771 scope.go:117] "RemoveContainer" containerID="1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.808502 4771 scope.go:117] "RemoveContainer" containerID="56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.824248 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-config-data" (OuterVolumeSpecName: "config-data") pod "23c80346-4fcf-4c8c-b674-85b242187b94" (UID: "23c80346-4fcf-4c8c-b674-85b242187b94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.826622 4771 scope.go:117] "RemoveContainer" containerID="28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.845140 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.845209 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c80346-4fcf-4c8c-b674-85b242187b94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.864679 4771 scope.go:117] "RemoveContainer" containerID="457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb" Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.865173 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb\": container 
with ID starting with 457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb not found: ID does not exist" containerID="457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.865224 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb"} err="failed to get container status \"457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb\": rpc error: code = NotFound desc = could not find container \"457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb\": container with ID starting with 457db7a865dbfd1381087e931238218dad2bf76580e11eb2012ef3aed3050bbb not found: ID does not exist" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.865248 4771 scope.go:117] "RemoveContainer" containerID="1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669" Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.865545 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669\": container with ID starting with 1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669 not found: ID does not exist" containerID="1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.865564 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669"} err="failed to get container status \"1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669\": rpc error: code = NotFound desc = could not find container \"1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669\": container with ID starting with 1f08a0b29eef2ae11e595aeec4ffa8738b3e92b574a97d28def3a8a1c09b5669 not 
found: ID does not exist" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.865577 4771 scope.go:117] "RemoveContainer" containerID="56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4" Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.865872 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4\": container with ID starting with 56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4 not found: ID does not exist" containerID="56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.865892 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4"} err="failed to get container status \"56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4\": rpc error: code = NotFound desc = could not find container \"56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4\": container with ID starting with 56179dd78cde97cd4b78e457cf629810f54dc39dd74bac0b7903b215887c26d4 not found: ID does not exist" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.865904 4771 scope.go:117] "RemoveContainer" containerID="28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9" Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.866332 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9\": container with ID starting with 28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9 not found: ID does not exist" containerID="28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.866351 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9"} err="failed to get container status \"28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9\": rpc error: code = NotFound desc = could not find container \"28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9\": container with ID starting with 28b31984ab824db4706a1664ae30969a9cb51347f4c1a4e14588ce3ab85adee9 not found: ID does not exist" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.957728 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.968678 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.991366 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993171 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.993489 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="proxy-httpd" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993505 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="proxy-httpd" Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.993529 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="sg-core" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993535 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="sg-core" Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.993551 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76514a37-a198-4afe-b306-46a87ed1f21f" containerName="nova-manage" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993557 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="76514a37-a198-4afe-b306-46a87ed1f21f" containerName="nova-manage" Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.993566 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="ceilometer-central-agent" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993572 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="ceilometer-central-agent" Feb 19 21:51:22 crc kubenswrapper[4771]: E0219 21:51:22.993584 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="ceilometer-notification-agent" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993589 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="ceilometer-notification-agent" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993905 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="sg-core" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993927 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="proxy-httpd" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993939 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="ceilometer-central-agent" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.993949 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="76514a37-a198-4afe-b306-46a87ed1f21f" containerName="nova-manage" Feb 19 21:51:22 crc 
kubenswrapper[4771]: I0219 21:51:22.993963 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" containerName="ceilometer-notification-agent" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.995471 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.999506 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:51:22 crc kubenswrapper[4771]: I0219 21:51:22.999682 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.000157 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.013536 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.149870 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-combined-ca-bundle\") pod \"76514a37-a198-4afe-b306-46a87ed1f21f\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.149963 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-config-data\") pod \"76514a37-a198-4afe-b306-46a87ed1f21f\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.150204 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwdxv\" (UniqueName: \"kubernetes.io/projected/76514a37-a198-4afe-b306-46a87ed1f21f-kube-api-access-fwdxv\") pod 
\"76514a37-a198-4afe-b306-46a87ed1f21f\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.150298 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-scripts\") pod \"76514a37-a198-4afe-b306-46a87ed1f21f\" (UID: \"76514a37-a198-4afe-b306-46a87ed1f21f\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.150625 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hnxt\" (UniqueName: \"kubernetes.io/projected/7ea28885-8813-47ef-bd23-3911e4fdeacc-kube-api-access-5hnxt\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.150698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-log-httpd\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.150754 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.150796 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 
21:51:23.150893 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-run-httpd\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.150977 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.151241 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-scripts\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.151445 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-config-data\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.154766 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.154826 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.158398 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76514a37-a198-4afe-b306-46a87ed1f21f-kube-api-access-fwdxv" 
(OuterVolumeSpecName: "kube-api-access-fwdxv") pod "76514a37-a198-4afe-b306-46a87ed1f21f" (UID: "76514a37-a198-4afe-b306-46a87ed1f21f"). InnerVolumeSpecName "kube-api-access-fwdxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.158527 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-scripts" (OuterVolumeSpecName: "scripts") pod "76514a37-a198-4afe-b306-46a87ed1f21f" (UID: "76514a37-a198-4afe-b306-46a87ed1f21f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.182483 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76514a37-a198-4afe-b306-46a87ed1f21f" (UID: "76514a37-a198-4afe-b306-46a87ed1f21f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.192135 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-config-data" (OuterVolumeSpecName: "config-data") pod "76514a37-a198-4afe-b306-46a87ed1f21f" (UID: "76514a37-a198-4afe-b306-46a87ed1f21f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.252748 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-run-httpd\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.252806 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.252886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-scripts\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.252932 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-config-data\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.252956 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hnxt\" (UniqueName: \"kubernetes.io/projected/7ea28885-8813-47ef-bd23-3911e4fdeacc-kube-api-access-5hnxt\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.252981 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-log-httpd\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.253012 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.253080 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.253132 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwdxv\" (UniqueName: \"kubernetes.io/projected/76514a37-a198-4afe-b306-46a87ed1f21f-kube-api-access-fwdxv\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.253143 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.253152 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.253161 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76514a37-a198-4afe-b306-46a87ed1f21f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:23 crc 
kubenswrapper[4771]: I0219 21:51:23.254749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-run-httpd\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.260109 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-log-httpd\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.260258 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.260832 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.261975 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-scripts\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.262394 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-config-data\") pod \"ceilometer-0\" (UID: 
\"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.265072 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.279166 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.286878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hnxt\" (UniqueName: \"kubernetes.io/projected/7ea28885-8813-47ef-bd23-3911e4fdeacc-kube-api-access-5hnxt\") pod \"ceilometer-0\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.298256 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.310944 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.403524 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-khtzg"] Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.403762 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" podUID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" containerName="dnsmasq-dns" containerID="cri-o://66ddd0890bf0446c337501370919c8c56f71b19bf3b7a656b3b554ade625e9bd" gracePeriod=10 Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.415344 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.643389 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-25p7k" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.643373 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-25p7k" event={"ID":"76514a37-a198-4afe-b306-46a87ed1f21f","Type":"ContainerDied","Data":"191d181ee90c69088fb3e7bb731e6dae968ce81340f50aa7fa68d8aa84caacfa"} Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.643676 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="191d181ee90c69088fb3e7bb731e6dae968ce81340f50aa7fa68d8aa84caacfa" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.661863 4771 generic.go:334] "Generic (PLEG): container finished" podID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" containerID="66ddd0890bf0446c337501370919c8c56f71b19bf3b7a656b3b554ade625e9bd" exitCode=0 Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.662363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" 
event={"ID":"c2974e7d-3e17-4fb5-bc64-6e33b4237107","Type":"ContainerDied","Data":"66ddd0890bf0446c337501370919c8c56f71b19bf3b7a656b3b554ade625e9bd"} Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.699582 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.823912 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.824176 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-log" containerID="cri-o://bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c" gracePeriod=30 Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.824604 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-api" containerID="cri-o://920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013" gracePeriod=30 Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.829864 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": EOF" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.832294 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": EOF" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.837683 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.837888 4771 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-metadata-0" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerName="nova-metadata-log" containerID="cri-o://6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e" gracePeriod=30 Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.838359 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerName="nova-metadata-metadata" containerID="cri-o://a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526" gracePeriod=30 Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.895405 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.919569 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.919740 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.965671 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-svc\") pod \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.965793 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-nb\") pod \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.965838 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-swift-storage-0\") pod \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.965857 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-sb\") pod \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.965872 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-config\") pod \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.965944 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l68mb\" (UniqueName: \"kubernetes.io/projected/c2974e7d-3e17-4fb5-bc64-6e33b4237107-kube-api-access-l68mb\") pod \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\" (UID: \"c2974e7d-3e17-4fb5-bc64-6e33b4237107\") " Feb 19 21:51:23 crc kubenswrapper[4771]: I0219 21:51:23.974996 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2974e7d-3e17-4fb5-bc64-6e33b4237107-kube-api-access-l68mb" (OuterVolumeSpecName: "kube-api-access-l68mb") pod "c2974e7d-3e17-4fb5-bc64-6e33b4237107" (UID: "c2974e7d-3e17-4fb5-bc64-6e33b4237107"). InnerVolumeSpecName "kube-api-access-l68mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.062363 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-config" (OuterVolumeSpecName: "config") pod "c2974e7d-3e17-4fb5-bc64-6e33b4237107" (UID: "c2974e7d-3e17-4fb5-bc64-6e33b4237107"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.069180 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l68mb\" (UniqueName: \"kubernetes.io/projected/c2974e7d-3e17-4fb5-bc64-6e33b4237107-kube-api-access-l68mb\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.069210 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.071719 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2974e7d-3e17-4fb5-bc64-6e33b4237107" (UID: "c2974e7d-3e17-4fb5-bc64-6e33b4237107"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.091439 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c2974e7d-3e17-4fb5-bc64-6e33b4237107" (UID: "c2974e7d-3e17-4fb5-bc64-6e33b4237107"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.091585 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2974e7d-3e17-4fb5-bc64-6e33b4237107" (UID: "c2974e7d-3e17-4fb5-bc64-6e33b4237107"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.093638 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jpvrl" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.102886 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c2974e7d-3e17-4fb5-bc64-6e33b4237107" (UID: "c2974e7d-3e17-4fb5-bc64-6e33b4237107"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.170470 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-config-data\") pod \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.170813 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-combined-ca-bundle\") pod \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.170837 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-scripts\") pod \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.170900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rcml\" (UniqueName: \"kubernetes.io/projected/fc115457-bb0d-4eb1-accb-43b6c88c79ef-kube-api-access-6rcml\") pod \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\" (UID: \"fc115457-bb0d-4eb1-accb-43b6c88c79ef\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.171454 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.171471 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 
21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.171480 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.171489 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2974e7d-3e17-4fb5-bc64-6e33b4237107-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.175674 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc115457-bb0d-4eb1-accb-43b6c88c79ef-kube-api-access-6rcml" (OuterVolumeSpecName: "kube-api-access-6rcml") pod "fc115457-bb0d-4eb1-accb-43b6c88c79ef" (UID: "fc115457-bb0d-4eb1-accb-43b6c88c79ef"). InnerVolumeSpecName "kube-api-access-6rcml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.180238 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-scripts" (OuterVolumeSpecName: "scripts") pod "fc115457-bb0d-4eb1-accb-43b6c88c79ef" (UID: "fc115457-bb0d-4eb1-accb-43b6c88c79ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.199621 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-config-data" (OuterVolumeSpecName: "config-data") pod "fc115457-bb0d-4eb1-accb-43b6c88c79ef" (UID: "fc115457-bb0d-4eb1-accb-43b6c88c79ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.238785 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc115457-bb0d-4eb1-accb-43b6c88c79ef" (UID: "fc115457-bb0d-4eb1-accb-43b6c88c79ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.267645 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.272742 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.272762 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.272772 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc115457-bb0d-4eb1-accb-43b6c88c79ef-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.272782 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rcml\" (UniqueName: \"kubernetes.io/projected/fc115457-bb0d-4eb1-accb-43b6c88c79ef-kube-api-access-6rcml\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.304743 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.304804 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.435001 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.449489 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23c80346-4fcf-4c8c-b674-85b242187b94" path="/var/lib/kubelet/pods/23c80346-4fcf-4c8c-b674-85b242187b94/volumes" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.591044 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-config-data\") pod \"c61c43ff-92a4-4297-87b5-67ee379cff2a\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.591503 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61c43ff-92a4-4297-87b5-67ee379cff2a-logs\") pod \"c61c43ff-92a4-4297-87b5-67ee379cff2a\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.591776 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-combined-ca-bundle\") pod \"c61c43ff-92a4-4297-87b5-67ee379cff2a\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.591877 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxgl8\" (UniqueName: \"kubernetes.io/projected/c61c43ff-92a4-4297-87b5-67ee379cff2a-kube-api-access-fxgl8\") pod \"c61c43ff-92a4-4297-87b5-67ee379cff2a\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.591973 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-nova-metadata-tls-certs\") pod \"c61c43ff-92a4-4297-87b5-67ee379cff2a\" (UID: \"c61c43ff-92a4-4297-87b5-67ee379cff2a\") " Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.591869 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61c43ff-92a4-4297-87b5-67ee379cff2a-logs" (OuterVolumeSpecName: "logs") pod "c61c43ff-92a4-4297-87b5-67ee379cff2a" (UID: "c61c43ff-92a4-4297-87b5-67ee379cff2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.597240 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61c43ff-92a4-4297-87b5-67ee379cff2a-kube-api-access-fxgl8" (OuterVolumeSpecName: "kube-api-access-fxgl8") pod "c61c43ff-92a4-4297-87b5-67ee379cff2a" (UID: "c61c43ff-92a4-4297-87b5-67ee379cff2a"). InnerVolumeSpecName "kube-api-access-fxgl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.605864 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61c43ff-92a4-4297-87b5-67ee379cff2a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.605895 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxgl8\" (UniqueName: \"kubernetes.io/projected/c61c43ff-92a4-4297-87b5-67ee379cff2a-kube-api-access-fxgl8\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.616485 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c61c43ff-92a4-4297-87b5-67ee379cff2a" (UID: "c61c43ff-92a4-4297-87b5-67ee379cff2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.625863 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-config-data" (OuterVolumeSpecName: "config-data") pod "c61c43ff-92a4-4297-87b5-67ee379cff2a" (UID: "c61c43ff-92a4-4297-87b5-67ee379cff2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.661941 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c61c43ff-92a4-4297-87b5-67ee379cff2a" (UID: "c61c43ff-92a4-4297-87b5-67ee379cff2a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.681569 4771 generic.go:334] "Generic (PLEG): container finished" podID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerID="bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c" exitCode=143 Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.681633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4eb7e7d0-7664-4858-af7c-04377b1ea3fe","Type":"ContainerDied","Data":"bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c"} Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.692348 4771 generic.go:334] "Generic (PLEG): container finished" podID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerID="a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526" exitCode=0 Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.692383 4771 generic.go:334] "Generic (PLEG): container finished" podID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerID="6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e" exitCode=143 Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.692462 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.692472 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c61c43ff-92a4-4297-87b5-67ee379cff2a","Type":"ContainerDied","Data":"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526"} Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.692844 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c61c43ff-92a4-4297-87b5-67ee379cff2a","Type":"ContainerDied","Data":"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e"} Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.692858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c61c43ff-92a4-4297-87b5-67ee379cff2a","Type":"ContainerDied","Data":"3e486eac98668ce62ec76e34344280b59a880a451c6451997048e8459e49beb9"} Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.692877 4771 scope.go:117] "RemoveContainer" containerID="a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.694387 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jpvrl" event={"ID":"fc115457-bb0d-4eb1-accb-43b6c88c79ef","Type":"ContainerDied","Data":"6d0a265a4baba5eb280e3c2a85602bad47b986432acfb835c4a4bb635f36bc1f"} Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.694414 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d0a265a4baba5eb280e3c2a85602bad47b986432acfb835c4a4bb635f36bc1f" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.694459 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jpvrl" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.708387 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.708429 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.708438 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61c43ff-92a4-4297-87b5-67ee379cff2a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.711186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerStarted","Data":"e1fb9a440e286e4c6981017fc882be0307a33c07dd611dc0449fb5e94378ea9d"} Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.711227 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerStarted","Data":"0acb676fa4c9c70e3acaba062945dc22e6b9f524f9d7c80aba6e4f5dc8acef98"} Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.727285 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.727717 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-khtzg" event={"ID":"c2974e7d-3e17-4fb5-bc64-6e33b4237107","Type":"ContainerDied","Data":"73e080550eb3beb8fdcafd5edec32d8d5ee51d6540a73651d9b2444a29f216cb"} Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.733276 4771 scope.go:117] "RemoveContainer" containerID="6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.739327 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:51:24 crc kubenswrapper[4771]: E0219 21:51:24.739818 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerName="nova-metadata-log" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.739838 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerName="nova-metadata-log" Feb 19 21:51:24 crc kubenswrapper[4771]: E0219 21:51:24.739857 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" containerName="init" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.739867 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" containerName="init" Feb 19 21:51:24 crc kubenswrapper[4771]: E0219 21:51:24.739877 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc115457-bb0d-4eb1-accb-43b6c88c79ef" containerName="nova-cell1-conductor-db-sync" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.739885 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc115457-bb0d-4eb1-accb-43b6c88c79ef" containerName="nova-cell1-conductor-db-sync" Feb 19 21:51:24 crc kubenswrapper[4771]: E0219 21:51:24.739897 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerName="nova-metadata-metadata" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.739903 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerName="nova-metadata-metadata" Feb 19 21:51:24 crc kubenswrapper[4771]: E0219 21:51:24.739924 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" containerName="dnsmasq-dns" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.739930 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" containerName="dnsmasq-dns" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.740120 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc115457-bb0d-4eb1-accb-43b6c88c79ef" containerName="nova-cell1-conductor-db-sync" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.740141 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" containerName="dnsmasq-dns" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.740151 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerName="nova-metadata-metadata" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.740159 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" containerName="nova-metadata-log" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.740723 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.746706 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.762743 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.770779 4771 scope.go:117] "RemoveContainer" containerID="a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526" Feb 19 21:51:24 crc kubenswrapper[4771]: E0219 21:51:24.772456 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526\": container with ID starting with a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526 not found: ID does not exist" containerID="a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.772509 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526"} err="failed to get container status \"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526\": rpc error: code = NotFound desc = could not find container \"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526\": container with ID starting with a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526 not found: ID does not exist" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.772544 4771 scope.go:117] "RemoveContainer" containerID="6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e" Feb 19 21:51:24 crc kubenswrapper[4771]: E0219 21:51:24.773126 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e\": container with ID starting with 6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e not found: ID does not exist" containerID="6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.773145 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e"} err="failed to get container status \"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e\": rpc error: code = NotFound desc = could not find container \"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e\": container with ID starting with 6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e not found: ID does not exist" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.773158 4771 scope.go:117] "RemoveContainer" containerID="a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.773503 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526"} err="failed to get container status \"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526\": rpc error: code = NotFound desc = could not find container \"a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526\": container with ID starting with a7088c931e2c67a7fe8f598acfb7323527897d912699b9b4b10ded16e58da526 not found: ID does not exist" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.773522 4771 scope.go:117] "RemoveContainer" containerID="6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.773795 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e"} err="failed to get container status \"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e\": rpc error: code = NotFound desc = could not find container \"6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e\": container with ID starting with 6608c5fd214aba9d0024d2f53e3e2ffb76a8c7915a2467fccb4482d0e7d4b82e not found: ID does not exist" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.773877 4771 scope.go:117] "RemoveContainer" containerID="66ddd0890bf0446c337501370919c8c56f71b19bf3b7a656b3b554ade625e9bd" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.784959 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.805327 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.816565 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.821142 4771 scope.go:117] "RemoveContainer" containerID="05a5cf0de26dd34b5edca24d533555fe952a16f66da1168202f36726ddafe278" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.825614 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.826305 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-khtzg"] Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.827533 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.828602 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.841255 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-khtzg"] Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.848838 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.912773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhzvm\" (UniqueName: \"kubernetes.io/projected/d56d3128-877b-4d8d-a48f-42a7b83e9347-kube-api-access-jhzvm\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.912833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.913005 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hwk\" (UniqueName: \"kubernetes.io/projected/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-kube-api-access-67hwk\") pod \"nova-metadata-0\" (UID: 
\"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.913118 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-config-data\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.913180 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-logs\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.913284 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.913380 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:24 crc kubenswrapper[4771]: I0219 21:51:24.913459 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 
19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.016068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.016512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhzvm\" (UniqueName: \"kubernetes.io/projected/d56d3128-877b-4d8d-a48f-42a7b83e9347-kube-api-access-jhzvm\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.016569 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.016624 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67hwk\" (UniqueName: \"kubernetes.io/projected/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-kube-api-access-67hwk\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.016663 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-config-data\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.016696 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-logs\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.016739 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.016784 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.017280 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-logs\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.027757 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.027789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " 
pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.028724 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-config-data\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.030554 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.032628 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.039433 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hwk\" (UniqueName: \"kubernetes.io/projected/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-kube-api-access-67hwk\") pod \"nova-metadata-0\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " pod="openstack/nova-metadata-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.040525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhzvm\" (UniqueName: \"kubernetes.io/projected/d56d3128-877b-4d8d-a48f-42a7b83e9347-kube-api-access-jhzvm\") pod \"nova-cell1-conductor-0\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.079835 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.142752 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.601460 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 21:51:25 crc kubenswrapper[4771]: W0219 21:51:25.605895 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd56d3128_877b_4d8d_a48f_42a7b83e9347.slice/crio-6d2fdee37dded2467744f12f3354205ed44f3de4f244de30200cc7739d19d674 WatchSource:0}: Error finding container 6d2fdee37dded2467744f12f3354205ed44f3de4f244de30200cc7739d19d674: Status 404 returned error can't find the container with id 6d2fdee37dded2467744f12f3354205ed44f3de4f244de30200cc7739d19d674
Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.695715 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:51:25 crc kubenswrapper[4771]: W0219 21:51:25.700801 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda738e2ab_8437_4ca3_8beb_a7a74d112ea4.slice/crio-cc7cc23095d312ba90ae35b9151c2252826d2e5ceed5e6cc4738150d746c040c WatchSource:0}: Error finding container cc7cc23095d312ba90ae35b9151c2252826d2e5ceed5e6cc4738150d746c040c: Status 404 returned error can't find the container with id cc7cc23095d312ba90ae35b9151c2252826d2e5ceed5e6cc4738150d746c040c
Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.753121 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a738e2ab-8437-4ca3-8beb-a7a74d112ea4","Type":"ContainerStarted","Data":"cc7cc23095d312ba90ae35b9151c2252826d2e5ceed5e6cc4738150d746c040c"}
Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.754709 4771
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerStarted","Data":"6560d8bdcf78b4d23a84cce7379f0fe8b2ea6bf86b5c5d3682577097354d8e2e"}
Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.756985 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" containerName="nova-scheduler-scheduler" containerID="cri-o://97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9" gracePeriod=30
Feb 19 21:51:25 crc kubenswrapper[4771]: I0219 21:51:25.757153 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d56d3128-877b-4d8d-a48f-42a7b83e9347","Type":"ContainerStarted","Data":"6d2fdee37dded2467744f12f3354205ed44f3de4f244de30200cc7739d19d674"}
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.455836 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2974e7d-3e17-4fb5-bc64-6e33b4237107" path="/var/lib/kubelet/pods/c2974e7d-3e17-4fb5-bc64-6e33b4237107/volumes"
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.456900 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61c43ff-92a4-4297-87b5-67ee379cff2a" path="/var/lib/kubelet/pods/c61c43ff-92a4-4297-87b5-67ee379cff2a/volumes"
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.766880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerStarted","Data":"1f646a14a029e2911cbf4d93cd215e826e06c6bacea6273b1b18d9e72ab68be4"}
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.769778 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0"
event={"ID":"d56d3128-877b-4d8d-a48f-42a7b83e9347","Type":"ContainerStarted","Data":"4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6"}
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.769858 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.772818 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a738e2ab-8437-4ca3-8beb-a7a74d112ea4","Type":"ContainerStarted","Data":"57fb31ab711e5d13d400bec76bdcbd0c4341b1fc0d21f6f6600f2084488fa088"}
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.772853 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a738e2ab-8437-4ca3-8beb-a7a74d112ea4","Type":"ContainerStarted","Data":"7259b0900abc0f8f057c7e0cc3ae36201abb5c977bdf9917bb917adf06c8bd91"}
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.790874 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.790855048 podStartE2EDuration="2.790855048s" podCreationTimestamp="2026-02-19 21:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:26.785804043 +0000 UTC m=+1387.057246513" watchObservedRunningTime="2026-02-19 21:51:26.790855048 +0000 UTC m=+1387.062297518"
Feb 19 21:51:26 crc kubenswrapper[4771]: I0219 21:51:26.811295 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.811280563 podStartE2EDuration="2.811280563s" podCreationTimestamp="2026-02-19 21:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:26.806693551 +0000 UTC m=+1387.078136031" watchObservedRunningTime="2026-02-19
21:51:26.811280563 +0000 UTC m=+1387.082723033"
Feb 19 21:51:27 crc kubenswrapper[4771]: I0219 21:51:27.784745 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerStarted","Data":"05122010f4de560850b26a44315e06592e72f23224d8137e856dac23b041d8f0"}
Feb 19 21:51:27 crc kubenswrapper[4771]: I0219 21:51:27.785141 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 21:51:27 crc kubenswrapper[4771]: I0219 21:51:27.924515 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 19 21:51:27 crc kubenswrapper[4771]: I0219 21:51:27.949608 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.669364551 podStartE2EDuration="5.949583924s" podCreationTimestamp="2026-02-19 21:51:22 +0000 UTC" firstStartedPulling="2026-02-19 21:51:23.919324193 +0000 UTC m=+1384.190766663" lastFinishedPulling="2026-02-19 21:51:27.199543566 +0000 UTC m=+1387.470986036" observedRunningTime="2026-02-19 21:51:27.815160462 +0000 UTC m=+1388.086602942" watchObservedRunningTime="2026-02-19 21:51:27.949583924 +0000 UTC m=+1388.221026424"
Feb 19 21:51:28 crc kubenswrapper[4771]: E0219 21:51:28.279329 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9 is running failed: container process not found" containerID="97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:51:28 crc kubenswrapper[4771]: E0219 21:51:28.279615 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of
97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9 is running failed: container process not found" containerID="97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:51:28 crc kubenswrapper[4771]: E0219 21:51:28.279851 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9 is running failed: container process not found" containerID="97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:51:28 crc kubenswrapper[4771]: E0219 21:51:28.279885 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" containerName="nova-scheduler-scheduler"
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.531073 4771 scope.go:117] "RemoveContainer" containerID="f9e8649e655ee08293c4676833a4f52d85cd6ecfca09d643e70853b6d32816ee"
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.668995 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.796930 4771 generic.go:334] "Generic (PLEG): container finished" podID="07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" containerID="97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9" exitCode=0
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.796995 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e","Type":"ContainerDied","Data":"97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9"}
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.797049 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.797072 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e","Type":"ContainerDied","Data":"454b4aef9e4de165a98937aa7e594688925c802e318ea7967ba8687b6b54c8c2"}
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.797096 4771 scope.go:117] "RemoveContainer" containerID="97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9"
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.798147 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-combined-ca-bundle\") pod \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") "
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.798256 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-config-data\") pod \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") "
Feb 19 21:51:28 crc
kubenswrapper[4771]: I0219 21:51:28.798357 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmf97\" (UniqueName: \"kubernetes.io/projected/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-kube-api-access-wmf97\") pod \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\" (UID: \"07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e\") "
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.808736 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-kube-api-access-wmf97" (OuterVolumeSpecName: "kube-api-access-wmf97") pod "07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" (UID: "07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e"). InnerVolumeSpecName "kube-api-access-wmf97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.842425 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-config-data" (OuterVolumeSpecName: "config-data") pod "07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" (UID: "07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.856233 4771 scope.go:117] "RemoveContainer" containerID="97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9"
Feb 19 21:51:28 crc kubenswrapper[4771]: E0219 21:51:28.856674 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9\": container with ID starting with 97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9 not found: ID does not exist" containerID="97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9"
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.856705 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9"} err="failed to get container status \"97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9\": rpc error: code = NotFound desc = could not find container \"97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9\": container with ID starting with 97144f37ab83388a9b9a2818324929f56602dc2cf81fcaeb5f8c35ea07d40df9 not found: ID does not exist"
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.859518 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" (UID: "07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.901047 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.901612 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:28 crc kubenswrapper[4771]: I0219 21:51:28.901632 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmf97\" (UniqueName: \"kubernetes.io/projected/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e-kube-api-access-wmf97\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.148995 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.157667 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.172111 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:51:29 crc kubenswrapper[4771]: E0219 21:51:29.172568 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" containerName="nova-scheduler-scheduler"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.172596 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" containerName="nova-scheduler-scheduler"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.172822 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" containerName="nova-scheduler-scheduler"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219
21:51:29.173632 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.175615 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.185225 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.311440 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.311500 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-config-data\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.311587 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8l9d\" (UniqueName: \"kubernetes.io/projected/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-kube-api-access-s8l9d\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.413039 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc
kubenswrapper[4771]: I0219 21:51:29.413092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-config-data\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.413164 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8l9d\" (UniqueName: \"kubernetes.io/projected/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-kube-api-access-s8l9d\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.419149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.421520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-config-data\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.448689 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8l9d\" (UniqueName: \"kubernetes.io/projected/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-kube-api-access-s8l9d\") pod \"nova-scheduler-0\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " pod="openstack/nova-scheduler-0"
Feb 19 21:51:29 crc kubenswrapper[4771]: I0219 21:51:29.510445 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.004442 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.123642 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.143570 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.143623 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.452063 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e" path="/var/lib/kubelet/pods/07f26ae3-5e1a-40fa-a9a6-01d0338a0b2e/volumes"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.580470 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.736243 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j9r9\" (UniqueName: \"kubernetes.io/projected/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-kube-api-access-6j9r9\") pod \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") "
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.736565 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-config-data\") pod \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") "
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.736681 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-combined-ca-bundle\") pod \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") "
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.736708 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-logs\") pod \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\" (UID: \"4eb7e7d0-7664-4858-af7c-04377b1ea3fe\") "
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.737234 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-logs" (OuterVolumeSpecName: "logs") pod "4eb7e7d0-7664-4858-af7c-04377b1ea3fe" (UID: "4eb7e7d0-7664-4858-af7c-04377b1ea3fe"). InnerVolumeSpecName "logs".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.737522 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.760200 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-kube-api-access-6j9r9" (OuterVolumeSpecName: "kube-api-access-6j9r9") pod "4eb7e7d0-7664-4858-af7c-04377b1ea3fe" (UID: "4eb7e7d0-7664-4858-af7c-04377b1ea3fe"). InnerVolumeSpecName "kube-api-access-6j9r9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.778197 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-config-data" (OuterVolumeSpecName: "config-data") pod "4eb7e7d0-7664-4858-af7c-04377b1ea3fe" (UID: "4eb7e7d0-7664-4858-af7c-04377b1ea3fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.779253 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4eb7e7d0-7664-4858-af7c-04377b1ea3fe" (UID: "4eb7e7d0-7664-4858-af7c-04377b1ea3fe"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.835284 4771 generic.go:334] "Generic (PLEG): container finished" podID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerID="920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013" exitCode=0
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.835336 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.835379 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4eb7e7d0-7664-4858-af7c-04377b1ea3fe","Type":"ContainerDied","Data":"920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013"}
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.835437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4eb7e7d0-7664-4858-af7c-04377b1ea3fe","Type":"ContainerDied","Data":"5c1558267285c0c2364f0082cd49953062c0399f9359ef04b607fbc45d66444a"}
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.835461 4771 scope.go:117] "RemoveContainer" containerID="920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.839631 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j9r9\" (UniqueName: \"kubernetes.io/projected/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-kube-api-access-6j9r9\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.839670 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.839688 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/4eb7e7d0-7664-4858-af7c-04377b1ea3fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.844146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c","Type":"ContainerStarted","Data":"afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879"}
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.844188 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c","Type":"ContainerStarted","Data":"bc589f3dac6ccad53e93b17f0c58798882b96cab2941547b8ff8ce3a47a4d57a"}
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.860423 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.8604057379999999 podStartE2EDuration="1.860405738s" podCreationTimestamp="2026-02-19 21:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:30.857754687 +0000 UTC m=+1391.129197197" watchObservedRunningTime="2026-02-19 21:51:30.860405738 +0000 UTC m=+1391.131848208"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.868526 4771 scope.go:117] "RemoveContainer" containerID="bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.891941 4771 scope.go:117] "RemoveContainer" containerID="920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013"
Feb 19 21:51:30 crc kubenswrapper[4771]: E0219 21:51:30.894474 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013\": container with ID starting with 920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013
not found: ID does not exist" containerID="920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.894721 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013"} err="failed to get container status \"920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013\": rpc error: code = NotFound desc = could not find container \"920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013\": container with ID starting with 920f4103c34cd80735e47a67fccd96d59c65f64f2c3b4c30eb8fca8988c83013 not found: ID does not exist"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.894906 4771 scope.go:117] "RemoveContainer" containerID="bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c"
Feb 19 21:51:30 crc kubenswrapper[4771]: E0219 21:51:30.897255 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c\": container with ID starting with bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c not found: ID does not exist" containerID="bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.897476 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c"} err="failed to get container status \"bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c\": rpc error: code = NotFound desc = could not find container \"bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c\": container with ID starting with bb0fe2a15392cdfea5c69a620781db321df553d40e9502cfc17c563de5d8145c not found: ID does not exist"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219
21:51:30.901641 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.919142 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.927851 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 21:51:30 crc kubenswrapper[4771]: E0219 21:51:30.928281 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-log"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.928299 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-log"
Feb 19 21:51:30 crc kubenswrapper[4771]: E0219 21:51:30.928313 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-api"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.928319 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-api"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.928543 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-api"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.928558 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" containerName="nova-api-log"
Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.929462 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.932267 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:51:30 crc kubenswrapper[4771]: I0219 21:51:30.937056 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.043861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-config-data\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.043932 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmn5b\" (UniqueName: \"kubernetes.io/projected/7781d709-f9d4-485a-be5d-ae290dc3dabd-kube-api-access-wmn5b\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.044106 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7781d709-f9d4-485a-be5d-ae290dc3dabd-logs\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.044152 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.146668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-config-data\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.146748 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmn5b\" (UniqueName: \"kubernetes.io/projected/7781d709-f9d4-485a-be5d-ae290dc3dabd-kube-api-access-wmn5b\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.146848 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7781d709-f9d4-485a-be5d-ae290dc3dabd-logs\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.146890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.148010 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7781d709-f9d4-485a-be5d-ae290dc3dabd-logs\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.150871 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.151262 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-config-data\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.165385 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmn5b\" (UniqueName: \"kubernetes.io/projected/7781d709-f9d4-485a-be5d-ae290dc3dabd-kube-api-access-wmn5b\") pod \"nova-api-0\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.249711 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.732458 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:31 crc kubenswrapper[4771]: W0219 21:51:31.738606 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7781d709_f9d4_485a_be5d_ae290dc3dabd.slice/crio-88862e7771bf490487a30e62cb8b87509c91ad2892c8546ab844cbde76d29cd3 WatchSource:0}: Error finding container 88862e7771bf490487a30e62cb8b87509c91ad2892c8546ab844cbde76d29cd3: Status 404 returned error can't find the container with id 88862e7771bf490487a30e62cb8b87509c91ad2892c8546ab844cbde76d29cd3 Feb 19 21:51:31 crc kubenswrapper[4771]: I0219 21:51:31.853964 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7781d709-f9d4-485a-be5d-ae290dc3dabd","Type":"ContainerStarted","Data":"88862e7771bf490487a30e62cb8b87509c91ad2892c8546ab844cbde76d29cd3"} Feb 19 21:51:32 crc kubenswrapper[4771]: I0219 21:51:32.449292 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb7e7d0-7664-4858-af7c-04377b1ea3fe" 
path="/var/lib/kubelet/pods/4eb7e7d0-7664-4858-af7c-04377b1ea3fe/volumes" Feb 19 21:51:32 crc kubenswrapper[4771]: I0219 21:51:32.872407 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7781d709-f9d4-485a-be5d-ae290dc3dabd","Type":"ContainerStarted","Data":"bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5"} Feb 19 21:51:32 crc kubenswrapper[4771]: I0219 21:51:32.872462 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7781d709-f9d4-485a-be5d-ae290dc3dabd","Type":"ContainerStarted","Data":"8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be"} Feb 19 21:51:32 crc kubenswrapper[4771]: I0219 21:51:32.896008 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.89598938 podStartE2EDuration="2.89598938s" podCreationTimestamp="2026-02-19 21:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:32.891001216 +0000 UTC m=+1393.162443696" watchObservedRunningTime="2026-02-19 21:51:32.89598938 +0000 UTC m=+1393.167431860" Feb 19 21:51:34 crc kubenswrapper[4771]: I0219 21:51:34.511186 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 21:51:35 crc kubenswrapper[4771]: I0219 21:51:35.150748 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:51:35 crc kubenswrapper[4771]: I0219 21:51:35.150852 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:51:36 crc kubenswrapper[4771]: I0219 21:51:36.170284 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"https://10.217.0.203:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:51:36 crc kubenswrapper[4771]: I0219 21:51:36.170582 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:51:39 crc kubenswrapper[4771]: I0219 21:51:39.512004 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 21:51:39 crc kubenswrapper[4771]: I0219 21:51:39.560815 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 21:51:39 crc kubenswrapper[4771]: I0219 21:51:39.986522 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 21:51:41 crc kubenswrapper[4771]: I0219 21:51:41.250096 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:51:41 crc kubenswrapper[4771]: I0219 21:51:41.251187 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:51:42 crc kubenswrapper[4771]: I0219 21:51:42.291379 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:51:42 crc kubenswrapper[4771]: I0219 21:51:42.291449 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:51:45 crc kubenswrapper[4771]: I0219 21:51:45.152068 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 21:51:45 crc kubenswrapper[4771]: I0219 21:51:45.153908 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 21:51:45 crc kubenswrapper[4771]: I0219 21:51:45.163720 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 21:51:46 crc kubenswrapper[4771]: I0219 21:51:46.010135 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 21:51:48 crc kubenswrapper[4771]: I0219 21:51:48.991982 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.040169 4771 generic.go:334] "Generic (PLEG): container finished" podID="23c35c42-227c-4e14-bf15-0d931a9442ae" containerID="d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245" exitCode=137 Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.040229 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"23c35c42-227c-4e14-bf15-0d931a9442ae","Type":"ContainerDied","Data":"d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245"} Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.040258 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"23c35c42-227c-4e14-bf15-0d931a9442ae","Type":"ContainerDied","Data":"9961ef09a452078269baae0a3fed5c6cb07026b7058fae2a87a16241f5031c31"} Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.040279 4771 scope.go:117] "RemoveContainer" containerID="d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245" Feb 19 21:51:49 crc 
kubenswrapper[4771]: I0219 21:51:49.040408 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.061857 4771 scope.go:117] "RemoveContainer" containerID="d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245" Feb 19 21:51:49 crc kubenswrapper[4771]: E0219 21:51:49.062446 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245\": container with ID starting with d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245 not found: ID does not exist" containerID="d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.062527 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245"} err="failed to get container status \"d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245\": rpc error: code = NotFound desc = could not find container \"d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245\": container with ID starting with d7cfbbed2deca4eba3b064f744dfb923bbcc329807ee6ea10fd45f507c0a6245 not found: ID does not exist" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.112009 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2xsw\" (UniqueName: \"kubernetes.io/projected/23c35c42-227c-4e14-bf15-0d931a9442ae-kube-api-access-b2xsw\") pod \"23c35c42-227c-4e14-bf15-0d931a9442ae\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.112133 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-combined-ca-bundle\") pod \"23c35c42-227c-4e14-bf15-0d931a9442ae\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.112228 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-config-data\") pod \"23c35c42-227c-4e14-bf15-0d931a9442ae\" (UID: \"23c35c42-227c-4e14-bf15-0d931a9442ae\") " Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.120275 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c35c42-227c-4e14-bf15-0d931a9442ae-kube-api-access-b2xsw" (OuterVolumeSpecName: "kube-api-access-b2xsw") pod "23c35c42-227c-4e14-bf15-0d931a9442ae" (UID: "23c35c42-227c-4e14-bf15-0d931a9442ae"). InnerVolumeSpecName "kube-api-access-b2xsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.142619 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23c35c42-227c-4e14-bf15-0d931a9442ae" (UID: "23c35c42-227c-4e14-bf15-0d931a9442ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.150517 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-config-data" (OuterVolumeSpecName: "config-data") pod "23c35c42-227c-4e14-bf15-0d931a9442ae" (UID: "23c35c42-227c-4e14-bf15-0d931a9442ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.214651 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2xsw\" (UniqueName: \"kubernetes.io/projected/23c35c42-227c-4e14-bf15-0d931a9442ae-kube-api-access-b2xsw\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.214687 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.214696 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c35c42-227c-4e14-bf15-0d931a9442ae-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.381094 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.398465 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.411731 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:51:49 crc kubenswrapper[4771]: E0219 21:51:49.412191 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c35c42-227c-4e14-bf15-0d931a9442ae" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.412216 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c35c42-227c-4e14-bf15-0d931a9442ae" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.412492 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c35c42-227c-4e14-bf15-0d931a9442ae" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 
21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.413232 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.419429 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.419580 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.420331 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.424829 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.521091 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jfl4\" (UniqueName: \"kubernetes.io/projected/8c8e505e-8c04-4299-a2d1-89a5cb544b81-kube-api-access-4jfl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.521428 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.521642 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.521784 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.521984 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.624445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.624619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jfl4\" (UniqueName: \"kubernetes.io/projected/8c8e505e-8c04-4299-a2d1-89a5cb544b81-kube-api-access-4jfl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.624723 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.624893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.624943 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.632101 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.632330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.634306 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.635268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.648717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jfl4\" (UniqueName: \"kubernetes.io/projected/8c8e505e-8c04-4299-a2d1-89a5cb544b81-kube-api-access-4jfl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:49 crc kubenswrapper[4771]: I0219 21:51:49.730270 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:50 crc kubenswrapper[4771]: I0219 21:51:50.136410 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:51:50 crc kubenswrapper[4771]: I0219 21:51:50.455259 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23c35c42-227c-4e14-bf15-0d931a9442ae" path="/var/lib/kubelet/pods/23c35c42-227c-4e14-bf15-0d931a9442ae/volumes" Feb 19 21:51:51 crc kubenswrapper[4771]: I0219 21:51:51.071699 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c8e505e-8c04-4299-a2d1-89a5cb544b81","Type":"ContainerStarted","Data":"a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af"} Feb 19 21:51:51 crc kubenswrapper[4771]: I0219 21:51:51.072190 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c8e505e-8c04-4299-a2d1-89a5cb544b81","Type":"ContainerStarted","Data":"d986d42ca294547e762a0387a3d9465d27f94eec9a401ab17d189dd82ae584b7"} Feb 
19 21:51:51 crc kubenswrapper[4771]: I0219 21:51:51.099567 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.099537149 podStartE2EDuration="2.099537149s" podCreationTimestamp="2026-02-19 21:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:51.092520721 +0000 UTC m=+1411.363963231" watchObservedRunningTime="2026-02-19 21:51:51.099537149 +0000 UTC m=+1411.370979649" Feb 19 21:51:51 crc kubenswrapper[4771]: I0219 21:51:51.254727 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 21:51:51 crc kubenswrapper[4771]: I0219 21:51:51.255345 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 21:51:51 crc kubenswrapper[4771]: I0219 21:51:51.256592 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 21:51:51 crc kubenswrapper[4771]: I0219 21:51:51.260130 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.078937 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.084338 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.288563 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-8s8dd"] Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.291879 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.304458 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-8s8dd"] Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.489004 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-config\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.489112 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.489135 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.489159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.489299 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7hp4\" (UniqueName: \"kubernetes.io/projected/74f56b2b-3ff3-4a35-a237-180c42816388-kube-api-access-v7hp4\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.489376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-svc\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.590727 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-config\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.590844 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.590867 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.590894 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.590987 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7hp4\" (UniqueName: \"kubernetes.io/projected/74f56b2b-3ff3-4a35-a237-180c42816388-kube-api-access-v7hp4\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.591747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-svc\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.591679 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-config\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.591851 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.591866 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.592111 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.592494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-svc\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.614436 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7hp4\" (UniqueName: \"kubernetes.io/projected/74f56b2b-3ff3-4a35-a237-180c42816388-kube-api-access-v7hp4\") pod \"dnsmasq-dns-7dcd758995-8s8dd\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:52 crc kubenswrapper[4771]: I0219 21:51:52.673313 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:53 crc kubenswrapper[4771]: W0219 21:51:53.125430 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74f56b2b_3ff3_4a35_a237_180c42816388.slice/crio-65b4d5b7ce8c048cfbcd21e9d2203d18c1e36c93a47b658f2f3f5931c5ee7355 WatchSource:0}: Error finding container 65b4d5b7ce8c048cfbcd21e9d2203d18c1e36c93a47b658f2f3f5931c5ee7355: Status 404 returned error can't find the container with id 65b4d5b7ce8c048cfbcd21e9d2203d18c1e36c93a47b658f2f3f5931c5ee7355 Feb 19 21:51:53 crc kubenswrapper[4771]: I0219 21:51:53.125748 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-8s8dd"] Feb 19 21:51:53 crc kubenswrapper[4771]: I0219 21:51:53.319254 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 21:51:54 crc kubenswrapper[4771]: I0219 21:51:54.111678 4771 generic.go:334] "Generic (PLEG): container finished" podID="74f56b2b-3ff3-4a35-a237-180c42816388" containerID="ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1" exitCode=0 Feb 19 21:51:54 crc kubenswrapper[4771]: I0219 21:51:54.112145 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" event={"ID":"74f56b2b-3ff3-4a35-a237-180c42816388","Type":"ContainerDied","Data":"ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1"} Feb 19 21:51:54 crc kubenswrapper[4771]: I0219 21:51:54.112225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" event={"ID":"74f56b2b-3ff3-4a35-a237-180c42816388","Type":"ContainerStarted","Data":"65b4d5b7ce8c048cfbcd21e9d2203d18c1e36c93a47b658f2f3f5931c5ee7355"} Feb 19 21:51:54 crc kubenswrapper[4771]: I0219 21:51:54.245447 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:54 crc 
kubenswrapper[4771]: I0219 21:51:54.245665 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="ceilometer-central-agent" containerID="cri-o://e1fb9a440e286e4c6981017fc882be0307a33c07dd611dc0449fb5e94378ea9d" gracePeriod=30 Feb 19 21:51:54 crc kubenswrapper[4771]: I0219 21:51:54.246039 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="proxy-httpd" containerID="cri-o://05122010f4de560850b26a44315e06592e72f23224d8137e856dac23b041d8f0" gracePeriod=30 Feb 19 21:51:54 crc kubenswrapper[4771]: I0219 21:51:54.246091 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="sg-core" containerID="cri-o://1f646a14a029e2911cbf4d93cd215e826e06c6bacea6273b1b18d9e72ab68be4" gracePeriod=30 Feb 19 21:51:54 crc kubenswrapper[4771]: I0219 21:51:54.246127 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="ceilometer-notification-agent" containerID="cri-o://6560d8bdcf78b4d23a84cce7379f0fe8b2ea6bf86b5c5d3682577097354d8e2e" gracePeriod=30 Feb 19 21:51:54 crc kubenswrapper[4771]: I0219 21:51:54.762046 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.132132 4771 generic.go:334] "Generic (PLEG): container finished" podID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerID="05122010f4de560850b26a44315e06592e72f23224d8137e856dac23b041d8f0" exitCode=0 Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.132168 4771 generic.go:334] "Generic (PLEG): container finished" podID="7ea28885-8813-47ef-bd23-3911e4fdeacc" 
containerID="1f646a14a029e2911cbf4d93cd215e826e06c6bacea6273b1b18d9e72ab68be4" exitCode=2 Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.132179 4771 generic.go:334] "Generic (PLEG): container finished" podID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerID="e1fb9a440e286e4c6981017fc882be0307a33c07dd611dc0449fb5e94378ea9d" exitCode=0 Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.132245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerDied","Data":"05122010f4de560850b26a44315e06592e72f23224d8137e856dac23b041d8f0"} Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.132425 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerDied","Data":"1f646a14a029e2911cbf4d93cd215e826e06c6bacea6273b1b18d9e72ab68be4"} Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.132442 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerDied","Data":"e1fb9a440e286e4c6981017fc882be0307a33c07dd611dc0449fb5e94378ea9d"} Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.133496 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.135049 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" event={"ID":"74f56b2b-3ff3-4a35-a237-180c42816388","Type":"ContainerStarted","Data":"5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad"} Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.135172 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-log" 
containerID="cri-o://8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be" gracePeriod=30 Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.135249 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-api" containerID="cri-o://bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5" gracePeriod=30 Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.135669 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:51:55 crc kubenswrapper[4771]: I0219 21:51:55.166741 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" podStartSLOduration=3.166723296 podStartE2EDuration="3.166723296s" podCreationTimestamp="2026-02-19 21:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:51:55.157910381 +0000 UTC m=+1415.429352881" watchObservedRunningTime="2026-02-19 21:51:55.166723296 +0000 UTC m=+1415.438165766" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.002935 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pfm9t"] Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.005473 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.031100 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pfm9t"] Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.146830 4771 generic.go:334] "Generic (PLEG): container finished" podID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerID="8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be" exitCode=143 Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.146917 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7781d709-f9d4-485a-be5d-ae290dc3dabd","Type":"ContainerDied","Data":"8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be"} Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.153762 4771 generic.go:334] "Generic (PLEG): container finished" podID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerID="6560d8bdcf78b4d23a84cce7379f0fe8b2ea6bf86b5c5d3682577097354d8e2e" exitCode=0 Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.153831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerDied","Data":"6560d8bdcf78b4d23a84cce7379f0fe8b2ea6bf86b5c5d3682577097354d8e2e"} Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.192742 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-catalog-content\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.192860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-utilities\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.192883 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhjw7\" (UniqueName: \"kubernetes.io/projected/4a5de993-201f-4ac4-a77f-df0a805dc205-kube-api-access-rhjw7\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.294664 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-utilities\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.294724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhjw7\" (UniqueName: \"kubernetes.io/projected/4a5de993-201f-4ac4-a77f-df0a805dc205-kube-api-access-rhjw7\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.295088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-catalog-content\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.295496 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-utilities\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.295537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-catalog-content\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.320733 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhjw7\" (UniqueName: \"kubernetes.io/projected/4a5de993-201f-4ac4-a77f-df0a805dc205-kube-api-access-rhjw7\") pod \"community-operators-pfm9t\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.335844 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.448125 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.600510 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-combined-ca-bundle\") pod \"7ea28885-8813-47ef-bd23-3911e4fdeacc\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.600854 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-run-httpd\") pod \"7ea28885-8813-47ef-bd23-3911e4fdeacc\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.600877 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-scripts\") pod \"7ea28885-8813-47ef-bd23-3911e4fdeacc\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.600929 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-sg-core-conf-yaml\") pod \"7ea28885-8813-47ef-bd23-3911e4fdeacc\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.600973 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-log-httpd\") pod \"7ea28885-8813-47ef-bd23-3911e4fdeacc\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.601067 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-ceilometer-tls-certs\") pod \"7ea28885-8813-47ef-bd23-3911e4fdeacc\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.601101 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hnxt\" (UniqueName: \"kubernetes.io/projected/7ea28885-8813-47ef-bd23-3911e4fdeacc-kube-api-access-5hnxt\") pod \"7ea28885-8813-47ef-bd23-3911e4fdeacc\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.601122 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-config-data\") pod \"7ea28885-8813-47ef-bd23-3911e4fdeacc\" (UID: \"7ea28885-8813-47ef-bd23-3911e4fdeacc\") " Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.602398 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7ea28885-8813-47ef-bd23-3911e4fdeacc" (UID: "7ea28885-8813-47ef-bd23-3911e4fdeacc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.604857 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7ea28885-8813-47ef-bd23-3911e4fdeacc" (UID: "7ea28885-8813-47ef-bd23-3911e4fdeacc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.621743 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea28885-8813-47ef-bd23-3911e4fdeacc-kube-api-access-5hnxt" (OuterVolumeSpecName: "kube-api-access-5hnxt") pod "7ea28885-8813-47ef-bd23-3911e4fdeacc" (UID: "7ea28885-8813-47ef-bd23-3911e4fdeacc"). InnerVolumeSpecName "kube-api-access-5hnxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.636635 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-scripts" (OuterVolumeSpecName: "scripts") pod "7ea28885-8813-47ef-bd23-3911e4fdeacc" (UID: "7ea28885-8813-47ef-bd23-3911e4fdeacc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.644258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7ea28885-8813-47ef-bd23-3911e4fdeacc" (UID: "7ea28885-8813-47ef-bd23-3911e4fdeacc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.667858 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7ea28885-8813-47ef-bd23-3911e4fdeacc" (UID: "7ea28885-8813-47ef-bd23-3911e4fdeacc"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.703639 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.703677 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.703690 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.703699 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea28885-8813-47ef-bd23-3911e4fdeacc-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.703709 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.703720 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hnxt\" (UniqueName: \"kubernetes.io/projected/7ea28885-8813-47ef-bd23-3911e4fdeacc-kube-api-access-5hnxt\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.719878 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ea28885-8813-47ef-bd23-3911e4fdeacc" (UID: 
"7ea28885-8813-47ef-bd23-3911e4fdeacc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.729200 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-config-data" (OuterVolumeSpecName: "config-data") pod "7ea28885-8813-47ef-bd23-3911e4fdeacc" (UID: "7ea28885-8813-47ef-bd23-3911e4fdeacc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.805716 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.805751 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea28885-8813-47ef-bd23-3911e4fdeacc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:56 crc kubenswrapper[4771]: W0219 21:51:56.973307 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a5de993_201f_4ac4_a77f_df0a805dc205.slice/crio-e24d27c730e0b2a4a25d9fe81bc25120b8705f445c7109e79ff4aa8806fb4d6f WatchSource:0}: Error finding container e24d27c730e0b2a4a25d9fe81bc25120b8705f445c7109e79ff4aa8806fb4d6f: Status 404 returned error can't find the container with id e24d27c730e0b2a4a25d9fe81bc25120b8705f445c7109e79ff4aa8806fb4d6f Feb 19 21:51:56 crc kubenswrapper[4771]: I0219 21:51:56.996753 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pfm9t"] Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.166240 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7ea28885-8813-47ef-bd23-3911e4fdeacc","Type":"ContainerDied","Data":"0acb676fa4c9c70e3acaba062945dc22e6b9f524f9d7c80aba6e4f5dc8acef98"} Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.166613 4771 scope.go:117] "RemoveContainer" containerID="05122010f4de560850b26a44315e06592e72f23224d8137e856dac23b041d8f0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.166876 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.169147 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfm9t" event={"ID":"4a5de993-201f-4ac4-a77f-df0a805dc205","Type":"ContainerStarted","Data":"e24d27c730e0b2a4a25d9fe81bc25120b8705f445c7109e79ff4aa8806fb4d6f"} Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.191451 4771 scope.go:117] "RemoveContainer" containerID="1f646a14a029e2911cbf4d93cd215e826e06c6bacea6273b1b18d9e72ab68be4" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.224947 4771 scope.go:117] "RemoveContainer" containerID="6560d8bdcf78b4d23a84cce7379f0fe8b2ea6bf86b5c5d3682577097354d8e2e" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.227929 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.242732 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.263976 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:57 crc kubenswrapper[4771]: E0219 21:51:57.264476 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="sg-core" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.264498 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" 
containerName="sg-core" Feb 19 21:51:57 crc kubenswrapper[4771]: E0219 21:51:57.264518 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="proxy-httpd" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.264527 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="proxy-httpd" Feb 19 21:51:57 crc kubenswrapper[4771]: E0219 21:51:57.264549 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="ceilometer-notification-agent" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.264558 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="ceilometer-notification-agent" Feb 19 21:51:57 crc kubenswrapper[4771]: E0219 21:51:57.264570 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="ceilometer-central-agent" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.264577 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="ceilometer-central-agent" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.264787 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="proxy-httpd" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.264800 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="ceilometer-central-agent" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.264813 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="sg-core" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.264837 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" containerName="ceilometer-notification-agent" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.266787 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.268694 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.268768 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.268850 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.272214 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.296276 4771 scope.go:117] "RemoveContainer" containerID="e1fb9a440e286e4c6981017fc882be0307a33c07dd611dc0449fb5e94378ea9d" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.421426 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-run-httpd\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.421478 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.421502 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-scripts\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.421556 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.421582 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-log-httpd\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.421698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hcmh\" (UniqueName: \"kubernetes.io/projected/6733d997-54a6-4c46-8692-db99d5a20a9e-kube-api-access-4hcmh\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.421937 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.422223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-config-data\") pod 
\"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.523848 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.523930 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-log-httpd\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.523952 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hcmh\" (UniqueName: \"kubernetes.io/projected/6733d997-54a6-4c46-8692-db99d5a20a9e-kube-api-access-4hcmh\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.524568 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-log-httpd\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.524658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.525484 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-config-data\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.525760 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-run-httpd\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.525810 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.525849 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-scripts\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.526535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-run-httpd\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.531661 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc 
kubenswrapper[4771]: I0219 21:51:57.531678 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.531666 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-scripts\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.532749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-config-data\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.534619 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.542710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hcmh\" (UniqueName: \"kubernetes.io/projected/6733d997-54a6-4c46-8692-db99d5a20a9e-kube-api-access-4hcmh\") pod \"ceilometer-0\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " pod="openstack/ceilometer-0" Feb 19 21:51:57 crc kubenswrapper[4771]: I0219 21:51:57.606860 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:51:58 crc kubenswrapper[4771]: I0219 21:51:58.143891 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:51:58 crc kubenswrapper[4771]: I0219 21:51:58.192599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerStarted","Data":"a7189f907c3f6278bb817eb96a4052cc309c4db84001bae0fa037b2f77fbb2c5"} Feb 19 21:51:58 crc kubenswrapper[4771]: I0219 21:51:58.194691 4771 generic.go:334] "Generic (PLEG): container finished" podID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerID="cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f" exitCode=0 Feb 19 21:51:58 crc kubenswrapper[4771]: I0219 21:51:58.194731 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfm9t" event={"ID":"4a5de993-201f-4ac4-a77f-df0a805dc205","Type":"ContainerDied","Data":"cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f"} Feb 19 21:51:58 crc kubenswrapper[4771]: I0219 21:51:58.448134 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea28885-8813-47ef-bd23-3911e4fdeacc" path="/var/lib/kubelet/pods/7ea28885-8813-47ef-bd23-3911e4fdeacc/volumes" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.839157 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.852929 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-config-data\") pod \"7781d709-f9d4-485a-be5d-ae290dc3dabd\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.853045 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-combined-ca-bundle\") pod \"7781d709-f9d4-485a-be5d-ae290dc3dabd\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.853093 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmn5b\" (UniqueName: \"kubernetes.io/projected/7781d709-f9d4-485a-be5d-ae290dc3dabd-kube-api-access-wmn5b\") pod \"7781d709-f9d4-485a-be5d-ae290dc3dabd\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.853122 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7781d709-f9d4-485a-be5d-ae290dc3dabd-logs\") pod \"7781d709-f9d4-485a-be5d-ae290dc3dabd\" (UID: \"7781d709-f9d4-485a-be5d-ae290dc3dabd\") " Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.853831 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7781d709-f9d4-485a-be5d-ae290dc3dabd-logs" (OuterVolumeSpecName: "logs") pod "7781d709-f9d4-485a-be5d-ae290dc3dabd" (UID: "7781d709-f9d4-485a-be5d-ae290dc3dabd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.854385 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7781d709-f9d4-485a-be5d-ae290dc3dabd-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.857187 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7781d709-f9d4-485a-be5d-ae290dc3dabd-kube-api-access-wmn5b" (OuterVolumeSpecName: "kube-api-access-wmn5b") pod "7781d709-f9d4-485a-be5d-ae290dc3dabd" (UID: "7781d709-f9d4-485a-be5d-ae290dc3dabd"). InnerVolumeSpecName "kube-api-access-wmn5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.887553 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7781d709-f9d4-485a-be5d-ae290dc3dabd" (UID: "7781d709-f9d4-485a-be5d-ae290dc3dabd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.900949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-config-data" (OuterVolumeSpecName: "config-data") pod "7781d709-f9d4-485a-be5d-ae290dc3dabd" (UID: "7781d709-f9d4-485a-be5d-ae290dc3dabd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.955081 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmn5b\" (UniqueName: \"kubernetes.io/projected/7781d709-f9d4-485a-be5d-ae290dc3dabd-kube-api-access-wmn5b\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.955112 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:58.955124 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7781d709-f9d4-485a-be5d-ae290dc3dabd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.216888 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerStarted","Data":"88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98"} Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.220334 4771 generic.go:334] "Generic (PLEG): container finished" podID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerID="bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5" exitCode=0 Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.220360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7781d709-f9d4-485a-be5d-ae290dc3dabd","Type":"ContainerDied","Data":"bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5"} Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.220378 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7781d709-f9d4-485a-be5d-ae290dc3dabd","Type":"ContainerDied","Data":"88862e7771bf490487a30e62cb8b87509c91ad2892c8546ab844cbde76d29cd3"} Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.220394 4771 scope.go:117] "RemoveContainer" containerID="bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.220424 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.243666 4771 scope.go:117] "RemoveContainer" containerID="8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.256883 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.267240 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.292553 4771 scope.go:117] "RemoveContainer" containerID="bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.302143 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:59 crc kubenswrapper[4771]: E0219 21:51:59.302498 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-log" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.302510 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-log" Feb 19 21:51:59 crc kubenswrapper[4771]: E0219 21:51:59.302541 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-api" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.302547 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-api" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.302705 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-log" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.302754 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" containerName="nova-api-api" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.303522 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.303591 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.320195 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.320414 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.320611 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:51:59 crc kubenswrapper[4771]: E0219 21:51:59.321809 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5\": container with ID starting with bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5 not found: ID does not exist" containerID="bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.321842 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5"} err="failed to get 
container status \"bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5\": rpc error: code = NotFound desc = could not find container \"bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5\": container with ID starting with bac4e9c6651db64b2b56881749ff4be00c8fbe8fd7ccab42ee0d425bb524a9e5 not found: ID does not exist" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.321865 4771 scope.go:117] "RemoveContainer" containerID="8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be" Feb 19 21:51:59 crc kubenswrapper[4771]: E0219 21:51:59.322269 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be\": container with ID starting with 8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be not found: ID does not exist" containerID="8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.322288 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be"} err="failed to get container status \"8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be\": rpc error: code = NotFound desc = could not find container \"8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be\": container with ID starting with 8104fcc7b714b0fd2aca9854b621906b42c75e2bde55244981d87ec1acbc48be not found: ID does not exist" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.464471 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-internal-tls-certs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: 
I0219 21:51:59.464517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-config-data\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.464547 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.464571 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-public-tls-certs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.464601 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/889fbe6e-7e88-45af-bb42-0ddae2231257-logs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.464659 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc7v2\" (UniqueName: \"kubernetes.io/projected/889fbe6e-7e88-45af-bb42-0ddae2231257-kube-api-access-pc7v2\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.566128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.566188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-public-tls-certs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.566239 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/889fbe6e-7e88-45af-bb42-0ddae2231257-logs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.566297 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc7v2\" (UniqueName: \"kubernetes.io/projected/889fbe6e-7e88-45af-bb42-0ddae2231257-kube-api-access-pc7v2\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.566392 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-internal-tls-certs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.566435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-config-data\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.566934 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/889fbe6e-7e88-45af-bb42-0ddae2231257-logs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.570884 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-public-tls-certs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.571102 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.574457 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-internal-tls-certs\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.574625 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-config-data\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.582882 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc7v2\" (UniqueName: \"kubernetes.io/projected/889fbe6e-7e88-45af-bb42-0ddae2231257-kube-api-access-pc7v2\") pod \"nova-api-0\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " 
pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.649441 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.733264 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:51:59 crc kubenswrapper[4771]: I0219 21:51:59.748339 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.116588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:52:00 crc kubenswrapper[4771]: W0219 21:52:00.120657 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod889fbe6e_7e88_45af_bb42_0ddae2231257.slice/crio-48eac43e1307352ae5f8df93f03d208f54b900bfbd319148d74d6f02386ca24b WatchSource:0}: Error finding container 48eac43e1307352ae5f8df93f03d208f54b900bfbd319148d74d6f02386ca24b: Status 404 returned error can't find the container with id 48eac43e1307352ae5f8df93f03d208f54b900bfbd319148d74d6f02386ca24b Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.237490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerStarted","Data":"30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab"} Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.239929 4771 generic.go:334] "Generic (PLEG): container finished" podID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerID="c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2" exitCode=0 Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.240020 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfm9t" 
event={"ID":"4a5de993-201f-4ac4-a77f-df0a805dc205","Type":"ContainerDied","Data":"c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2"} Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.242580 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"889fbe6e-7e88-45af-bb42-0ddae2231257","Type":"ContainerStarted","Data":"48eac43e1307352ae5f8df93f03d208f54b900bfbd319148d74d6f02386ca24b"} Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.270800 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.454641 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7781d709-f9d4-485a-be5d-ae290dc3dabd" path="/var/lib/kubelet/pods/7781d709-f9d4-485a-be5d-ae290dc3dabd/volumes" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.593344 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gtz9t"] Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.594382 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.596097 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.596684 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.605774 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gtz9t"] Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.691923 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-config-data\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.691999 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.692073 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-scripts\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.692133 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76nx\" (UniqueName: 
\"kubernetes.io/projected/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-kube-api-access-x76nx\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.793239 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76nx\" (UniqueName: \"kubernetes.io/projected/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-kube-api-access-x76nx\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.793813 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-config-data\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.793888 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.793944 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-scripts\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.800241 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-config-data\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.801984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-scripts\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.802317 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.807934 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76nx\" (UniqueName: \"kubernetes.io/projected/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-kube-api-access-x76nx\") pod \"nova-cell1-cell-mapping-gtz9t\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:00 crc kubenswrapper[4771]: I0219 21:52:00.908604 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:01 crc kubenswrapper[4771]: I0219 21:52:01.406770 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gtz9t"] Feb 19 21:52:01 crc kubenswrapper[4771]: I0219 21:52:01.407230 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerStarted","Data":"b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38"} Feb 19 21:52:01 crc kubenswrapper[4771]: I0219 21:52:01.415717 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfm9t" event={"ID":"4a5de993-201f-4ac4-a77f-df0a805dc205","Type":"ContainerStarted","Data":"e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828"} Feb 19 21:52:01 crc kubenswrapper[4771]: I0219 21:52:01.435197 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pfm9t" podStartSLOduration=3.945859178 podStartE2EDuration="6.435180852s" podCreationTimestamp="2026-02-19 21:51:55 +0000 UTC" firstStartedPulling="2026-02-19 21:51:58.197444814 +0000 UTC m=+1418.468887284" lastFinishedPulling="2026-02-19 21:52:00.686766488 +0000 UTC m=+1420.958208958" observedRunningTime="2026-02-19 21:52:01.431520544 +0000 UTC m=+1421.702963024" watchObservedRunningTime="2026-02-19 21:52:01.435180852 +0000 UTC m=+1421.706623322" Feb 19 21:52:01 crc kubenswrapper[4771]: I0219 21:52:01.444725 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"889fbe6e-7e88-45af-bb42-0ddae2231257","Type":"ContainerStarted","Data":"bc7b7848d364235dd9f63d6ab0b8245aa7273f50cee50bf8c43a3e25d3f6c91c"} Feb 19 21:52:01 crc kubenswrapper[4771]: I0219 21:52:01.444756 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"889fbe6e-7e88-45af-bb42-0ddae2231257","Type":"ContainerStarted","Data":"22e4abfa81a3f53f14c5200317a7937d65c732608eda55cab395272540c9b54c"} Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.456325 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerStarted","Data":"1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60"} Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.456800 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.460252 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gtz9t" event={"ID":"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5","Type":"ContainerStarted","Data":"4a5aa26a299aa0d998acf896b457654e74cd80a622f33a8d72e1c3135f3d7b1e"} Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.460336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gtz9t" event={"ID":"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5","Type":"ContainerStarted","Data":"b8727d919cb93629daaf646f62568774ef0e3eae1348f649546523164cc37c8a"} Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.490941 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.490917401 podStartE2EDuration="3.490917401s" podCreationTimestamp="2026-02-19 21:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:52:01.462898043 +0000 UTC m=+1421.734340513" watchObservedRunningTime="2026-02-19 21:52:02.490917401 +0000 UTC m=+1422.762359891" Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.491388 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8997121799999999 
podStartE2EDuration="5.491382193s" podCreationTimestamp="2026-02-19 21:51:57 +0000 UTC" firstStartedPulling="2026-02-19 21:51:58.14488839 +0000 UTC m=+1418.416330870" lastFinishedPulling="2026-02-19 21:52:01.736558413 +0000 UTC m=+1422.008000883" observedRunningTime="2026-02-19 21:52:02.476443089 +0000 UTC m=+1422.747885599" watchObservedRunningTime="2026-02-19 21:52:02.491382193 +0000 UTC m=+1422.762824683" Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.507662 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gtz9t" podStartSLOduration=2.507644344 podStartE2EDuration="2.507644344s" podCreationTimestamp="2026-02-19 21:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:52:02.495220535 +0000 UTC m=+1422.766663015" watchObservedRunningTime="2026-02-19 21:52:02.507644344 +0000 UTC m=+1422.779086804" Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.675254 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.768690 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-qvsck"] Feb 19 21:52:02 crc kubenswrapper[4771]: I0219 21:52:02.768964 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" podUID="49e59cf4-4766-4548-9eb8-cd04f34153ef" containerName="dnsmasq-dns" containerID="cri-o://02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de" gracePeriod=10 Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.279962 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.357315 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6bcd\" (UniqueName: \"kubernetes.io/projected/49e59cf4-4766-4548-9eb8-cd04f34153ef-kube-api-access-s6bcd\") pod \"49e59cf4-4766-4548-9eb8-cd04f34153ef\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.357586 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-config\") pod \"49e59cf4-4766-4548-9eb8-cd04f34153ef\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.357662 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-swift-storage-0\") pod \"49e59cf4-4766-4548-9eb8-cd04f34153ef\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.357768 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-svc\") pod \"49e59cf4-4766-4548-9eb8-cd04f34153ef\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.357854 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-nb\") pod \"49e59cf4-4766-4548-9eb8-cd04f34153ef\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.357991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-sb\") pod \"49e59cf4-4766-4548-9eb8-cd04f34153ef\" (UID: \"49e59cf4-4766-4548-9eb8-cd04f34153ef\") " Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.372719 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e59cf4-4766-4548-9eb8-cd04f34153ef-kube-api-access-s6bcd" (OuterVolumeSpecName: "kube-api-access-s6bcd") pod "49e59cf4-4766-4548-9eb8-cd04f34153ef" (UID: "49e59cf4-4766-4548-9eb8-cd04f34153ef"). InnerVolumeSpecName "kube-api-access-s6bcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.408759 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49e59cf4-4766-4548-9eb8-cd04f34153ef" (UID: "49e59cf4-4766-4548-9eb8-cd04f34153ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.412608 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "49e59cf4-4766-4548-9eb8-cd04f34153ef" (UID: "49e59cf4-4766-4548-9eb8-cd04f34153ef"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.423483 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49e59cf4-4766-4548-9eb8-cd04f34153ef" (UID: "49e59cf4-4766-4548-9eb8-cd04f34153ef"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.434717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-config" (OuterVolumeSpecName: "config") pod "49e59cf4-4766-4548-9eb8-cd04f34153ef" (UID: "49e59cf4-4766-4548-9eb8-cd04f34153ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.459607 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.460499 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.460511 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.460520 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6bcd\" (UniqueName: \"kubernetes.io/projected/49e59cf4-4766-4548-9eb8-cd04f34153ef-kube-api-access-s6bcd\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.460528 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.472608 4771 generic.go:334] "Generic (PLEG): container finished" podID="49e59cf4-4766-4548-9eb8-cd04f34153ef" 
containerID="02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de" exitCode=0 Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.473753 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.474335 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" event={"ID":"49e59cf4-4766-4548-9eb8-cd04f34153ef","Type":"ContainerDied","Data":"02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de"} Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.474391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-qvsck" event={"ID":"49e59cf4-4766-4548-9eb8-cd04f34153ef","Type":"ContainerDied","Data":"a1b747e41cc4c2ec78d1fc260f26a8d2f0eaf6d59caa4bb75df6c73eacc0b00c"} Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.474413 4771 scope.go:117] "RemoveContainer" containerID="02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.478457 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49e59cf4-4766-4548-9eb8-cd04f34153ef" (UID: "49e59cf4-4766-4548-9eb8-cd04f34153ef"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.496562 4771 scope.go:117] "RemoveContainer" containerID="24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.519332 4771 scope.go:117] "RemoveContainer" containerID="02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de" Feb 19 21:52:03 crc kubenswrapper[4771]: E0219 21:52:03.519870 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de\": container with ID starting with 02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de not found: ID does not exist" containerID="02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.519967 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de"} err="failed to get container status \"02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de\": rpc error: code = NotFound desc = could not find container \"02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de\": container with ID starting with 02134086f74c31bcfeead16c5d34bbdaf9d40a81e599230718f3924600e542de not found: ID does not exist" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.520159 4771 scope.go:117] "RemoveContainer" containerID="24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7" Feb 19 21:52:03 crc kubenswrapper[4771]: E0219 21:52:03.520594 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7\": container with ID starting with 
24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7 not found: ID does not exist" containerID="24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.520626 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7"} err="failed to get container status \"24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7\": rpc error: code = NotFound desc = could not find container \"24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7\": container with ID starting with 24bcab22bda2a2b60eb37eda5742165f965a39cff7d44a45aef918b6f53780f7 not found: ID does not exist" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.562201 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e59cf4-4766-4548-9eb8-cd04f34153ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.814231 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-qvsck"] Feb 19 21:52:03 crc kubenswrapper[4771]: I0219 21:52:03.824905 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-qvsck"] Feb 19 21:52:04 crc kubenswrapper[4771]: I0219 21:52:04.449802 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e59cf4-4766-4548-9eb8-cd04f34153ef" path="/var/lib/kubelet/pods/49e59cf4-4766-4548-9eb8-cd04f34153ef/volumes" Feb 19 21:52:06 crc kubenswrapper[4771]: I0219 21:52:06.336562 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:52:06 crc kubenswrapper[4771]: I0219 21:52:06.337250 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pfm9t" Feb 19 
21:52:06 crc kubenswrapper[4771]: I0219 21:52:06.405435 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:52:06 crc kubenswrapper[4771]: I0219 21:52:06.520188 4771 generic.go:334] "Generic (PLEG): container finished" podID="32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" containerID="4a5aa26a299aa0d998acf896b457654e74cd80a622f33a8d72e1c3135f3d7b1e" exitCode=0 Feb 19 21:52:06 crc kubenswrapper[4771]: I0219 21:52:06.520259 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gtz9t" event={"ID":"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5","Type":"ContainerDied","Data":"4a5aa26a299aa0d998acf896b457654e74cd80a622f33a8d72e1c3135f3d7b1e"} Feb 19 21:52:06 crc kubenswrapper[4771]: I0219 21:52:06.586844 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:52:06 crc kubenswrapper[4771]: I0219 21:52:06.658178 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pfm9t"] Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.017249 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.064962 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-config-data\") pod \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.065005 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-combined-ca-bundle\") pod \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.065155 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-scripts\") pod \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.065209 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x76nx\" (UniqueName: \"kubernetes.io/projected/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-kube-api-access-x76nx\") pod \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\" (UID: \"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5\") " Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.070995 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-scripts" (OuterVolumeSpecName: "scripts") pod "32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" (UID: "32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.073261 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-kube-api-access-x76nx" (OuterVolumeSpecName: "kube-api-access-x76nx") pod "32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" (UID: "32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5"). InnerVolumeSpecName "kube-api-access-x76nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.098756 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-config-data" (OuterVolumeSpecName: "config-data") pod "32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" (UID: "32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.131616 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" (UID: "32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.167291 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.167331 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x76nx\" (UniqueName: \"kubernetes.io/projected/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-kube-api-access-x76nx\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.167347 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.167359 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.541095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gtz9t" event={"ID":"32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5","Type":"ContainerDied","Data":"b8727d919cb93629daaf646f62568774ef0e3eae1348f649546523164cc37c8a"} Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.541504 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8727d919cb93629daaf646f62568774ef0e3eae1348f649546523164cc37c8a" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.541272 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gtz9t" Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.541733 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pfm9t" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerName="registry-server" containerID="cri-o://e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828" gracePeriod=2 Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.853459 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.853682 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerName="nova-api-log" containerID="cri-o://22e4abfa81a3f53f14c5200317a7937d65c732608eda55cab395272540c9b54c" gracePeriod=30 Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.854080 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerName="nova-api-api" containerID="cri-o://bc7b7848d364235dd9f63d6ab0b8245aa7273f50cee50bf8c43a3e25d3f6c91c" gracePeriod=30 Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.887609 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.887906 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" containerName="nova-scheduler-scheduler" containerID="cri-o://afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879" gracePeriod=30 Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.895535 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.895748 
4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-log" containerID="cri-o://7259b0900abc0f8f057c7e0cc3ae36201abb5c977bdf9917bb917adf06c8bd91" gracePeriod=30 Feb 19 21:52:08 crc kubenswrapper[4771]: I0219 21:52:08.895882 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-metadata" containerID="cri-o://57fb31ab711e5d13d400bec76bdcbd0c4341b1fc0d21f6f6600f2084488fa088" gracePeriod=30 Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.089755 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.186656 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhjw7\" (UniqueName: \"kubernetes.io/projected/4a5de993-201f-4ac4-a77f-df0a805dc205-kube-api-access-rhjw7\") pod \"4a5de993-201f-4ac4-a77f-df0a805dc205\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.186855 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-catalog-content\") pod \"4a5de993-201f-4ac4-a77f-df0a805dc205\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.186893 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-utilities\") pod \"4a5de993-201f-4ac4-a77f-df0a805dc205\" (UID: \"4a5de993-201f-4ac4-a77f-df0a805dc205\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.188436 4771 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-utilities" (OuterVolumeSpecName: "utilities") pod "4a5de993-201f-4ac4-a77f-df0a805dc205" (UID: "4a5de993-201f-4ac4-a77f-df0a805dc205"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.205202 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a5de993-201f-4ac4-a77f-df0a805dc205-kube-api-access-rhjw7" (OuterVolumeSpecName: "kube-api-access-rhjw7") pod "4a5de993-201f-4ac4-a77f-df0a805dc205" (UID: "4a5de993-201f-4ac4-a77f-df0a805dc205"). InnerVolumeSpecName "kube-api-access-rhjw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.269300 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a5de993-201f-4ac4-a77f-df0a805dc205" (UID: "4a5de993-201f-4ac4-a77f-df0a805dc205"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.289295 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.289332 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5de993-201f-4ac4-a77f-df0a805dc205-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.289365 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhjw7\" (UniqueName: \"kubernetes.io/projected/4a5de993-201f-4ac4-a77f-df0a805dc205-kube-api-access-rhjw7\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:09 crc kubenswrapper[4771]: E0219 21:52:09.513865 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:52:09 crc kubenswrapper[4771]: E0219 21:52:09.515939 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:52:09 crc kubenswrapper[4771]: E0219 21:52:09.518411 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:52:09 crc kubenswrapper[4771]: E0219 21:52:09.518457 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" containerName="nova-scheduler-scheduler" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.553124 4771 generic.go:334] "Generic (PLEG): container finished" podID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerID="e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828" exitCode=0 Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.553187 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfm9t" event={"ID":"4a5de993-201f-4ac4-a77f-df0a805dc205","Type":"ContainerDied","Data":"e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828"} Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.553214 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pfm9t" event={"ID":"4a5de993-201f-4ac4-a77f-df0a805dc205","Type":"ContainerDied","Data":"e24d27c730e0b2a4a25d9fe81bc25120b8705f445c7109e79ff4aa8806fb4d6f"} Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.553230 4771 scope.go:117] "RemoveContainer" containerID="e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.553350 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pfm9t" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.559718 4771 generic.go:334] "Generic (PLEG): container finished" podID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerID="bc7b7848d364235dd9f63d6ab0b8245aa7273f50cee50bf8c43a3e25d3f6c91c" exitCode=0 Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.559747 4771 generic.go:334] "Generic (PLEG): container finished" podID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerID="22e4abfa81a3f53f14c5200317a7937d65c732608eda55cab395272540c9b54c" exitCode=143 Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.559801 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"889fbe6e-7e88-45af-bb42-0ddae2231257","Type":"ContainerDied","Data":"bc7b7848d364235dd9f63d6ab0b8245aa7273f50cee50bf8c43a3e25d3f6c91c"} Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.559828 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"889fbe6e-7e88-45af-bb42-0ddae2231257","Type":"ContainerDied","Data":"22e4abfa81a3f53f14c5200317a7937d65c732608eda55cab395272540c9b54c"} Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.567166 4771 generic.go:334] "Generic (PLEG): container finished" podID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerID="7259b0900abc0f8f057c7e0cc3ae36201abb5c977bdf9917bb917adf06c8bd91" exitCode=143 Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.567201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a738e2ab-8437-4ca3-8beb-a7a74d112ea4","Type":"ContainerDied","Data":"7259b0900abc0f8f057c7e0cc3ae36201abb5c977bdf9917bb917adf06c8bd91"} Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.596512 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pfm9t"] Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.601201 4771 scope.go:117] 
"RemoveContainer" containerID="c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.611282 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pfm9t"] Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.631529 4771 scope.go:117] "RemoveContainer" containerID="cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.653991 4771 scope.go:117] "RemoveContainer" containerID="e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828" Feb 19 21:52:09 crc kubenswrapper[4771]: E0219 21:52:09.654464 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828\": container with ID starting with e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828 not found: ID does not exist" containerID="e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.654489 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828"} err="failed to get container status \"e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828\": rpc error: code = NotFound desc = could not find container \"e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828\": container with ID starting with e528f6f35ffa0b9700b529d1f6d108fd25ec2dd2fe8ae405dffa0b6560098828 not found: ID does not exist" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.654516 4771 scope.go:117] "RemoveContainer" containerID="c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2" Feb 19 21:52:09 crc kubenswrapper[4771]: E0219 21:52:09.655657 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2\": container with ID starting with c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2 not found: ID does not exist" containerID="c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.655702 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2"} err="failed to get container status \"c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2\": rpc error: code = NotFound desc = could not find container \"c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2\": container with ID starting with c49cdcc4fbbb9723c7e32acd227d589236f2b8ec77b33150b98caa48c95f58d2 not found: ID does not exist" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.655719 4771 scope.go:117] "RemoveContainer" containerID="cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f" Feb 19 21:52:09 crc kubenswrapper[4771]: E0219 21:52:09.655969 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f\": container with ID starting with cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f not found: ID does not exist" containerID="cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.655991 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f"} err="failed to get container status \"cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f\": rpc error: code = NotFound desc = could not find container 
\"cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f\": container with ID starting with cd54f824b9bdafa633b3297d1a71f7b23123e3201e90fbd5e18f01c2f96fe46f not found: ID does not exist" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.670065 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.796287 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-config-data\") pod \"889fbe6e-7e88-45af-bb42-0ddae2231257\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.796345 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-public-tls-certs\") pod \"889fbe6e-7e88-45af-bb42-0ddae2231257\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.796460 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-internal-tls-certs\") pod \"889fbe6e-7e88-45af-bb42-0ddae2231257\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.796487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-combined-ca-bundle\") pod \"889fbe6e-7e88-45af-bb42-0ddae2231257\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.796541 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc7v2\" (UniqueName: 
\"kubernetes.io/projected/889fbe6e-7e88-45af-bb42-0ddae2231257-kube-api-access-pc7v2\") pod \"889fbe6e-7e88-45af-bb42-0ddae2231257\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.796615 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/889fbe6e-7e88-45af-bb42-0ddae2231257-logs\") pod \"889fbe6e-7e88-45af-bb42-0ddae2231257\" (UID: \"889fbe6e-7e88-45af-bb42-0ddae2231257\") " Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.797643 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889fbe6e-7e88-45af-bb42-0ddae2231257-logs" (OuterVolumeSpecName: "logs") pod "889fbe6e-7e88-45af-bb42-0ddae2231257" (UID: "889fbe6e-7e88-45af-bb42-0ddae2231257"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.829486 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/889fbe6e-7e88-45af-bb42-0ddae2231257-kube-api-access-pc7v2" (OuterVolumeSpecName: "kube-api-access-pc7v2") pod "889fbe6e-7e88-45af-bb42-0ddae2231257" (UID: "889fbe6e-7e88-45af-bb42-0ddae2231257"). InnerVolumeSpecName "kube-api-access-pc7v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.834985 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-config-data" (OuterVolumeSpecName: "config-data") pod "889fbe6e-7e88-45af-bb42-0ddae2231257" (UID: "889fbe6e-7e88-45af-bb42-0ddae2231257"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.835808 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "889fbe6e-7e88-45af-bb42-0ddae2231257" (UID: "889fbe6e-7e88-45af-bb42-0ddae2231257"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.862925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "889fbe6e-7e88-45af-bb42-0ddae2231257" (UID: "889fbe6e-7e88-45af-bb42-0ddae2231257"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.875720 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "889fbe6e-7e88-45af-bb42-0ddae2231257" (UID: "889fbe6e-7e88-45af-bb42-0ddae2231257"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.898601 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc7v2\" (UniqueName: \"kubernetes.io/projected/889fbe6e-7e88-45af-bb42-0ddae2231257-kube-api-access-pc7v2\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.898639 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/889fbe6e-7e88-45af-bb42-0ddae2231257-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.898649 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.898659 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.898667 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:09 crc kubenswrapper[4771]: I0219 21:52:09.898677 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/889fbe6e-7e88-45af-bb42-0ddae2231257-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.455567 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" path="/var/lib/kubelet/pods/4a5de993-201f-4ac4-a77f-df0a805dc205/volumes" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.580089 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"889fbe6e-7e88-45af-bb42-0ddae2231257","Type":"ContainerDied","Data":"48eac43e1307352ae5f8df93f03d208f54b900bfbd319148d74d6f02386ca24b"} Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.580368 4771 scope.go:117] "RemoveContainer" containerID="bc7b7848d364235dd9f63d6ab0b8245aa7273f50cee50bf8c43a3e25d3f6c91c" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.580160 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.603241 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.610685 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.617580 4771 scope.go:117] "RemoveContainer" containerID="22e4abfa81a3f53f14c5200317a7937d65c732608eda55cab395272540c9b54c" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.629898 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:52:10 crc kubenswrapper[4771]: E0219 21:52:10.630786 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e59cf4-4766-4548-9eb8-cd04f34153ef" containerName="init" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.630828 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e59cf4-4766-4548-9eb8-cd04f34153ef" containerName="init" Feb 19 21:52:10 crc kubenswrapper[4771]: E0219 21:52:10.630846 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerName="nova-api-log" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.630852 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerName="nova-api-log" Feb 19 21:52:10 crc kubenswrapper[4771]: E0219 
21:52:10.630864 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerName="registry-server" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.630895 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerName="registry-server" Feb 19 21:52:10 crc kubenswrapper[4771]: E0219 21:52:10.630908 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e59cf4-4766-4548-9eb8-cd04f34153ef" containerName="dnsmasq-dns" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.630914 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e59cf4-4766-4548-9eb8-cd04f34153ef" containerName="dnsmasq-dns" Feb 19 21:52:10 crc kubenswrapper[4771]: E0219 21:52:10.630925 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" containerName="nova-manage" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.630930 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" containerName="nova-manage" Feb 19 21:52:10 crc kubenswrapper[4771]: E0219 21:52:10.630942 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerName="extract-utilities" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.630948 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerName="extract-utilities" Feb 19 21:52:10 crc kubenswrapper[4771]: E0219 21:52:10.630984 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerName="nova-api-api" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.630991 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerName="nova-api-api" Feb 19 21:52:10 crc kubenswrapper[4771]: E0219 21:52:10.631004 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerName="extract-content" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.631010 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerName="extract-content" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.631577 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerName="nova-api-log" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.631621 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" containerName="nova-manage" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.631633 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5de993-201f-4ac4-a77f-df0a805dc205" containerName="registry-server" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.631647 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" containerName="nova-api-api" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.631662 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e59cf4-4766-4548-9eb8-cd04f34153ef" containerName="dnsmasq-dns" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.633058 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.637676 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.638305 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.638671 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.649813 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.712773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-public-tls-certs\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.713114 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rctnj\" (UniqueName: \"kubernetes.io/projected/63798ee6-b629-437f-9e15-5bd46b79894e-kube-api-access-rctnj\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.713367 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.713444 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-config-data\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.713708 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63798ee6-b629-437f-9e15-5bd46b79894e-logs\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.713738 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.815518 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.815826 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-config-data\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.816089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63798ee6-b629-437f-9e15-5bd46b79894e-logs\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc 
kubenswrapper[4771]: I0219 21:52:10.816215 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.816444 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-public-tls-certs\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.816589 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rctnj\" (UniqueName: \"kubernetes.io/projected/63798ee6-b629-437f-9e15-5bd46b79894e-kube-api-access-rctnj\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.816733 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63798ee6-b629-437f-9e15-5bd46b79894e-logs\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.819742 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-config-data\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.820266 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.820610 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-public-tls-certs\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.820657 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.835339 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rctnj\" (UniqueName: \"kubernetes.io/projected/63798ee6-b629-437f-9e15-5bd46b79894e-kube-api-access-rctnj\") pod \"nova-api-0\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " pod="openstack/nova-api-0" Feb 19 21:52:10 crc kubenswrapper[4771]: I0219 21:52:10.951095 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:52:11 crc kubenswrapper[4771]: I0219 21:52:11.412092 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:52:11 crc kubenswrapper[4771]: I0219 21:52:11.590750 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63798ee6-b629-437f-9e15-5bd46b79894e","Type":"ContainerStarted","Data":"c549507f26ed1faac44d987278eaf60a40ebcbb54aa748de7b8f3de70fdb58c6"} Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.194970 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:46326->10.217.0.203:8775: read: connection reset by peer" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.195079 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:46324->10.217.0.203:8775: read: connection reset by peer" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.480880 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="889fbe6e-7e88-45af-bb42-0ddae2231257" path="/var/lib/kubelet/pods/889fbe6e-7e88-45af-bb42-0ddae2231257/volumes" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.601804 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63798ee6-b629-437f-9e15-5bd46b79894e","Type":"ContainerStarted","Data":"e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b"} Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.601881 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"63798ee6-b629-437f-9e15-5bd46b79894e","Type":"ContainerStarted","Data":"89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7"} Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.603401 4771 generic.go:334] "Generic (PLEG): container finished" podID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerID="57fb31ab711e5d13d400bec76bdcbd0c4341b1fc0d21f6f6600f2084488fa088" exitCode=0 Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.603450 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a738e2ab-8437-4ca3-8beb-a7a74d112ea4","Type":"ContainerDied","Data":"57fb31ab711e5d13d400bec76bdcbd0c4341b1fc0d21f6f6600f2084488fa088"} Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.603478 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a738e2ab-8437-4ca3-8beb-a7a74d112ea4","Type":"ContainerDied","Data":"cc7cc23095d312ba90ae35b9151c2252826d2e5ceed5e6cc4738150d746c040c"} Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.603492 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc7cc23095d312ba90ae35b9151c2252826d2e5ceed5e6cc4738150d746c040c" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.623412 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.623395847 podStartE2EDuration="2.623395847s" podCreationTimestamp="2026-02-19 21:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:52:12.619763811 +0000 UTC m=+1432.891206321" watchObservedRunningTime="2026-02-19 21:52:12.623395847 +0000 UTC m=+1432.894838317" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.645616 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.769585 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67hwk\" (UniqueName: \"kubernetes.io/projected/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-kube-api-access-67hwk\") pod \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.769665 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-combined-ca-bundle\") pod \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.769732 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-nova-metadata-tls-certs\") pod \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.769772 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-config-data\") pod \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.769822 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-logs\") pod \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\" (UID: \"a738e2ab-8437-4ca3-8beb-a7a74d112ea4\") " Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.770686 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-logs" (OuterVolumeSpecName: "logs") pod "a738e2ab-8437-4ca3-8beb-a7a74d112ea4" (UID: "a738e2ab-8437-4ca3-8beb-a7a74d112ea4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.790183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-kube-api-access-67hwk" (OuterVolumeSpecName: "kube-api-access-67hwk") pod "a738e2ab-8437-4ca3-8beb-a7a74d112ea4" (UID: "a738e2ab-8437-4ca3-8beb-a7a74d112ea4"). InnerVolumeSpecName "kube-api-access-67hwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.852194 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a738e2ab-8437-4ca3-8beb-a7a74d112ea4" (UID: "a738e2ab-8437-4ca3-8beb-a7a74d112ea4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.871511 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.871550 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67hwk\" (UniqueName: \"kubernetes.io/projected/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-kube-api-access-67hwk\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.871564 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.892540 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-config-data" (OuterVolumeSpecName: "config-data") pod "a738e2ab-8437-4ca3-8beb-a7a74d112ea4" (UID: "a738e2ab-8437-4ca3-8beb-a7a74d112ea4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.911290 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a738e2ab-8437-4ca3-8beb-a7a74d112ea4" (UID: "a738e2ab-8437-4ca3-8beb-a7a74d112ea4"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.956703 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.956977 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.974251 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:12 crc kubenswrapper[4771]: I0219 21:52:12.974297 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738e2ab-8437-4ca3-8beb-a7a74d112ea4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.548112 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.583746 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8l9d\" (UniqueName: \"kubernetes.io/projected/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-kube-api-access-s8l9d\") pod \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.583824 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-config-data\") pod \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.583856 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-combined-ca-bundle\") pod \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\" (UID: \"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c\") " Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.593740 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-kube-api-access-s8l9d" (OuterVolumeSpecName: "kube-api-access-s8l9d") pod "f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" (UID: "f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c"). InnerVolumeSpecName "kube-api-access-s8l9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.626941 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-config-data" (OuterVolumeSpecName: "config-data") pod "f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" (UID: "f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.627095 4771 generic.go:334] "Generic (PLEG): container finished" podID="f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" containerID="afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879" exitCode=0 Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.627209 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.627244 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c","Type":"ContainerDied","Data":"afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879"} Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.627317 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c","Type":"ContainerDied","Data":"bc589f3dac6ccad53e93b17f0c58798882b96cab2941547b8ff8ce3a47a4d57a"} Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.627335 4771 scope.go:117] "RemoveContainer" containerID="afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.627537 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.631186 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" (UID: "f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.671299 4771 scope.go:117] "RemoveContainer" containerID="afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879" Feb 19 21:52:13 crc kubenswrapper[4771]: E0219 21:52:13.675821 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879\": container with ID starting with afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879 not found: ID does not exist" containerID="afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.675868 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879"} err="failed to get container status \"afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879\": rpc error: code = NotFound desc = could not find container \"afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879\": container with ID starting with afbeb2b677d805f93f1767a27a443a882a1e7bb6c8dfcb9f8f51454345f4a879 not found: ID does not exist" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.685864 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.685895 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8l9d\" (UniqueName: \"kubernetes.io/projected/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-kube-api-access-s8l9d\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.685908 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.686859 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.697580 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.715800 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:52:13 crc kubenswrapper[4771]: E0219 21:52:13.717242 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-log" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.717266 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-log" Feb 19 21:52:13 crc kubenswrapper[4771]: E0219 21:52:13.717280 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-metadata" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.717289 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-metadata" Feb 19 21:52:13 crc kubenswrapper[4771]: E0219 21:52:13.717309 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" containerName="nova-scheduler-scheduler" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.717319 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" containerName="nova-scheduler-scheduler" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.717745 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-metadata" Feb 
19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.717770 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" containerName="nova-scheduler-scheduler" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.717785 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" containerName="nova-metadata-log" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.719292 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.723073 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.723198 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.726321 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.787547 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-config-data\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.787985 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.788031 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dwmnx\" (UniqueName: \"kubernetes.io/projected/fbb34081-bc3a-451a-93c5-c28299467781-kube-api-access-dwmnx\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.788157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.788286 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb34081-bc3a-451a-93c5-c28299467781-logs\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.890590 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.890663 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwmnx\" (UniqueName: \"kubernetes.io/projected/fbb34081-bc3a-451a-93c5-c28299467781-kube-api-access-dwmnx\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.890794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.890879 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb34081-bc3a-451a-93c5-c28299467781-logs\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.890979 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-config-data\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.891431 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb34081-bc3a-451a-93c5-c28299467781-logs\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.897675 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-config-data\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.901862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.904782 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.920934 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwmnx\" (UniqueName: \"kubernetes.io/projected/fbb34081-bc3a-451a-93c5-c28299467781-kube-api-access-dwmnx\") pod \"nova-metadata-0\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " pod="openstack/nova-metadata-0" Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.977861 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:52:13 crc kubenswrapper[4771]: I0219 21:52:13.995434 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.007867 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.009813 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.014388 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.015970 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.043549 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.099199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-config-data\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.099290 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.099364 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvvxk\" (UniqueName: \"kubernetes.io/projected/98bf8c7c-234c-47ea-86aa-4b1313f7b983-kube-api-access-kvvxk\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.201401 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-config-data\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.201473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 
21:52:14.201509 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvvxk\" (UniqueName: \"kubernetes.io/projected/98bf8c7c-234c-47ea-86aa-4b1313f7b983-kube-api-access-kvvxk\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.205669 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-config-data\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.214946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.235679 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvvxk\" (UniqueName: \"kubernetes.io/projected/98bf8c7c-234c-47ea-86aa-4b1313f7b983-kube-api-access-kvvxk\") pod \"nova-scheduler-0\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.348068 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.452870 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a738e2ab-8437-4ca3-8beb-a7a74d112ea4" path="/var/lib/kubelet/pods/a738e2ab-8437-4ca3-8beb-a7a74d112ea4/volumes" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.456395 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c" path="/var/lib/kubelet/pods/f126c0f8-98ff-4bf2-b24f-5e30b2f1b42c/volumes" Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.506607 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:52:14 crc kubenswrapper[4771]: W0219 21:52:14.508586 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbb34081_bc3a_451a_93c5_c28299467781.slice/crio-f2a0610da71872c83c49077ebf3d4acebbb7370e56300a4159bbb53e9cfd6771 WatchSource:0}: Error finding container f2a0610da71872c83c49077ebf3d4acebbb7370e56300a4159bbb53e9cfd6771: Status 404 returned error can't find the container with id f2a0610da71872c83c49077ebf3d4acebbb7370e56300a4159bbb53e9cfd6771 Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.636796 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbb34081-bc3a-451a-93c5-c28299467781","Type":"ContainerStarted","Data":"f2a0610da71872c83c49077ebf3d4acebbb7370e56300a4159bbb53e9cfd6771"} Feb 19 21:52:14 crc kubenswrapper[4771]: I0219 21:52:14.817580 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:52:14 crc kubenswrapper[4771]: W0219 21:52:14.821003 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98bf8c7c_234c_47ea_86aa_4b1313f7b983.slice/crio-81a0cb6ef754df2b0925b0afa093507c7e97ef1c5e12cb0fa5d717bf48b52b4f WatchSource:0}: Error finding container 81a0cb6ef754df2b0925b0afa093507c7e97ef1c5e12cb0fa5d717bf48b52b4f: Status 404 returned error can't find the container with id 81a0cb6ef754df2b0925b0afa093507c7e97ef1c5e12cb0fa5d717bf48b52b4f Feb 19 21:52:15 crc kubenswrapper[4771]: I0219 21:52:15.648864 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"98bf8c7c-234c-47ea-86aa-4b1313f7b983","Type":"ContainerStarted","Data":"6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30"} Feb 19 21:52:15 crc kubenswrapper[4771]: I0219 21:52:15.651218 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"98bf8c7c-234c-47ea-86aa-4b1313f7b983","Type":"ContainerStarted","Data":"81a0cb6ef754df2b0925b0afa093507c7e97ef1c5e12cb0fa5d717bf48b52b4f"} Feb 19 21:52:15 crc kubenswrapper[4771]: I0219 21:52:15.654340 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbb34081-bc3a-451a-93c5-c28299467781","Type":"ContainerStarted","Data":"b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1"} Feb 19 21:52:15 crc kubenswrapper[4771]: I0219 21:52:15.654397 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbb34081-bc3a-451a-93c5-c28299467781","Type":"ContainerStarted","Data":"8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d"} Feb 19 21:52:15 crc kubenswrapper[4771]: I0219 21:52:15.677533 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.677511939 podStartE2EDuration="2.677511939s" podCreationTimestamp="2026-02-19 21:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:52:15.668997074 +0000 UTC m=+1435.940439554" watchObservedRunningTime="2026-02-19 21:52:15.677511939 +0000 UTC m=+1435.948954409" Feb 19 21:52:15 crc kubenswrapper[4771]: I0219 21:52:15.701538 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.701521154 podStartE2EDuration="2.701521154s" podCreationTimestamp="2026-02-19 21:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:52:15.693672477 +0000 UTC m=+1435.965114957" watchObservedRunningTime="2026-02-19 21:52:15.701521154 +0000 UTC m=+1435.972963614" Feb 19 21:52:19 crc kubenswrapper[4771]: I0219 21:52:19.044333 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:52:19 crc kubenswrapper[4771]: I0219 21:52:19.045611 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:52:19 crc kubenswrapper[4771]: I0219 21:52:19.348223 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 21:52:20 crc kubenswrapper[4771]: I0219 21:52:20.951598 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:52:20 crc kubenswrapper[4771]: I0219 21:52:20.951995 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:52:21 crc kubenswrapper[4771]: I0219 21:52:21.971174 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:52:21 crc kubenswrapper[4771]: I0219 21:52:21.971181 4771 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:52:24 crc kubenswrapper[4771]: I0219 21:52:24.043879 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:52:24 crc kubenswrapper[4771]: I0219 21:52:24.044286 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:52:24 crc kubenswrapper[4771]: I0219 21:52:24.348353 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 21:52:24 crc kubenswrapper[4771]: I0219 21:52:24.392375 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 21:52:24 crc kubenswrapper[4771]: I0219 21:52:24.815808 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 21:52:25 crc kubenswrapper[4771]: I0219 21:52:25.060248 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:52:25 crc kubenswrapper[4771]: I0219 21:52:25.060300 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:52:27 crc kubenswrapper[4771]: I0219 21:52:27.639237 4771 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 19 21:52:31 crc kubenswrapper[4771]: I0219 21:52:31.032475 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 21:52:31 crc kubenswrapper[4771]: I0219 21:52:31.033180 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 21:52:31 crc kubenswrapper[4771]: I0219 21:52:31.035586 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 21:52:31 crc kubenswrapper[4771]: I0219 21:52:31.039793 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 21:52:31 crc kubenswrapper[4771]: I0219 21:52:31.835040 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 21:52:31 crc kubenswrapper[4771]: I0219 21:52:31.860697 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 21:52:34 crc kubenswrapper[4771]: I0219 21:52:34.055998 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 21:52:34 crc kubenswrapper[4771]: I0219 21:52:34.056487 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 21:52:34 crc kubenswrapper[4771]: I0219 21:52:34.062692 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 21:52:34 crc kubenswrapper[4771]: I0219 21:52:34.067753 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 21:52:42 crc kubenswrapper[4771]: I0219 21:52:42.956902 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:52:42 crc kubenswrapper[4771]: I0219 21:52:42.958228 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:52:54 crc kubenswrapper[4771]: I0219 21:52:54.883084 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 19 21:52:54 crc kubenswrapper[4771]: I0219 21:52:54.883839 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="8efa5dd0-e91b-456b-b295-e45608c03c36" containerName="openstackclient" containerID="cri-o://da66b35c116d28899b782ad55eca69eeed804e2fa899b61d3bf29d45b676830f" gracePeriod=2
Feb 19 21:52:54 crc kubenswrapper[4771]: I0219 21:52:54.895618 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.008452 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3163-account-create-update-qmkbx"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.032431 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3163-account-create-update-qmkbx"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.052622 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.052884 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerName="cinder-scheduler" containerID="cri-o://f5328ae68d5231323e69942e8f00f3eca7b2e290877a4c8daac72b1fb1c774f5" gracePeriod=30
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.053419 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerName="probe" containerID="cri-o://53dafb13a5c390196d3cb19b500110a7ca153d256650b8c84a399c53e300934e" gracePeriod=30
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.072296 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.113914 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-bf8786687-hjsp2"]
Feb 19 21:52:55 crc kubenswrapper[4771]: E0219 21:52:55.114365 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efa5dd0-e91b-456b-b295-e45608c03c36" containerName="openstackclient"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.114382 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efa5dd0-e91b-456b-b295-e45608c03c36" containerName="openstackclient"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.114554 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efa5dd0-e91b-456b-b295-e45608c03c36" containerName="openstackclient"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.115498 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.160001 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8569f64fdb-ctb42"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.161546 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.178906 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3163-account-create-update-r6n24"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.180103 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3163-account-create-update-r6n24"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.199296 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.200334 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-sv7nf"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.201501 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.221932 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.238463 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-bf8786687-hjsp2"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.261163 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8569f64fdb-ctb42"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.270954 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe42dab7-ecba-4c58-8c1d-e9489760692a-logs\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.273239 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.273369 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld47v\" (UniqueName: \"kubernetes.io/projected/fe42dab7-ecba-4c58-8c1d-e9489760692a-kube-api-access-ld47v\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.273459 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.273604 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data-custom\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.273727 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hws4f\" (UniqueName: \"kubernetes.io/projected/7a560423-c65f-4855-94f2-31e6763e0817-kube-api-access-hws4f\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.273897 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-combined-ca-bundle\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.274036 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a560423-c65f-4855-94f2-31e6763e0817-logs\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.274200 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-combined-ca-bundle\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.274283 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data-custom\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: E0219 21:52:55.274640 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 19 21:52:55 crc kubenswrapper[4771]: E0219 21:52:55.274687 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data podName:c671dcf6-b1eb-4c4e-ba71-ae115ce811da nodeName:}" failed. No retries permitted until 2026-02-19 21:52:55.774675456 +0000 UTC m=+1476.046117926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data") pod "rabbitmq-server-0" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da") : configmap "rabbitmq-config-data" not found
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.305079 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3163-account-create-update-r6n24"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.333300 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.333554 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api-log" containerID="cri-o://4db106a42f266426c6257e14ae4da294c4869692c1f876160b68b9126d307faf" gracePeriod=30
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.333701 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api" containerID="cri-o://b9023b235acac843c6d06970035afa4b4d903cfed719a74a48514a2a10f8f915" gracePeriod=30
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.370153 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sv7nf"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.375489 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts\") pod \"root-account-create-update-sv7nf\" (UID: \"3e4401f3-2019-40a6-8829-5e22598f176a\") " pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.375678 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.375763 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld47v\" (UniqueName: \"kubernetes.io/projected/fe42dab7-ecba-4c58-8c1d-e9489760692a-kube-api-access-ld47v\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.375834 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.375910 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data-custom\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.375983 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hws4f\" (UniqueName: \"kubernetes.io/projected/7a560423-c65f-4855-94f2-31e6763e0817-kube-api-access-hws4f\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.376170 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l668x\" (UniqueName: \"kubernetes.io/projected/95488d16-b990-4ef7-b263-8245bc54d569-kube-api-access-l668x\") pod \"glance-3163-account-create-update-r6n24\" (UID: \"95488d16-b990-4ef7-b263-8245bc54d569\") " pod="openstack/glance-3163-account-create-update-r6n24"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.376301 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-combined-ca-bundle\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.376395 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95488d16-b990-4ef7-b263-8245bc54d569-operator-scripts\") pod \"glance-3163-account-create-update-r6n24\" (UID: \"95488d16-b990-4ef7-b263-8245bc54d569\") " pod="openstack/glance-3163-account-create-update-r6n24"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.376949 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a560423-c65f-4855-94f2-31e6763e0817-logs\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.377044 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-combined-ca-bundle\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.377120 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data-custom\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.377195 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe42dab7-ecba-4c58-8c1d-e9489760692a-logs\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.377259 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqnqg\" (UniqueName: \"kubernetes.io/projected/3e4401f3-2019-40a6-8829-5e22598f176a-kube-api-access-bqnqg\") pod \"root-account-create-update-sv7nf\" (UID: \"3e4401f3-2019-40a6-8829-5e22598f176a\") " pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.388952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-combined-ca-bundle\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.395267 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.397929 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a560423-c65f-4855-94f2-31e6763e0817-logs\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.399441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe42dab7-ecba-4c58-8c1d-e9489760692a-logs\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.403217 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.412673 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data-custom\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.416566 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data-custom\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.419495 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-combined-ca-bundle\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.425693 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-x6lqr"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.478825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95488d16-b990-4ef7-b263-8245bc54d569-operator-scripts\") pod \"glance-3163-account-create-update-r6n24\" (UID: \"95488d16-b990-4ef7-b263-8245bc54d569\") " pod="openstack/glance-3163-account-create-update-r6n24"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.479103 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqnqg\" (UniqueName: \"kubernetes.io/projected/3e4401f3-2019-40a6-8829-5e22598f176a-kube-api-access-bqnqg\") pod \"root-account-create-update-sv7nf\" (UID: \"3e4401f3-2019-40a6-8829-5e22598f176a\") " pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.479135 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts\") pod \"root-account-create-update-sv7nf\" (UID: \"3e4401f3-2019-40a6-8829-5e22598f176a\") " pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.479203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l668x\" (UniqueName: \"kubernetes.io/projected/95488d16-b990-4ef7-b263-8245bc54d569-kube-api-access-l668x\") pod \"glance-3163-account-create-update-r6n24\" (UID: \"95488d16-b990-4ef7-b263-8245bc54d569\") " pod="openstack/glance-3163-account-create-update-r6n24"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.480556 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95488d16-b990-4ef7-b263-8245bc54d569-operator-scripts\") pod \"glance-3163-account-create-update-r6n24\" (UID: \"95488d16-b990-4ef7-b263-8245bc54d569\") " pod="openstack/glance-3163-account-create-update-r6n24"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.481200 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts\") pod \"root-account-create-update-sv7nf\" (UID: \"3e4401f3-2019-40a6-8829-5e22598f176a\") " pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.517219 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-x6lqr"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.539879 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-676bcb667b-l4th4"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.541383 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.542504 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld47v\" (UniqueName: \"kubernetes.io/projected/fe42dab7-ecba-4c58-8c1d-e9489760692a-kube-api-access-ld47v\") pod \"barbican-worker-bf8786687-hjsp2\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.558686 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l668x\" (UniqueName: \"kubernetes.io/projected/95488d16-b990-4ef7-b263-8245bc54d569-kube-api-access-l668x\") pod \"glance-3163-account-create-update-r6n24\" (UID: \"95488d16-b990-4ef7-b263-8245bc54d569\") " pod="openstack/glance-3163-account-create-update-r6n24"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.571717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqnqg\" (UniqueName: \"kubernetes.io/projected/3e4401f3-2019-40a6-8829-5e22598f176a-kube-api-access-bqnqg\") pod \"root-account-create-update-sv7nf\" (UID: \"3e4401f3-2019-40a6-8829-5e22598f176a\") " pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.572587 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hws4f\" (UniqueName: \"kubernetes.io/projected/7a560423-c65f-4855-94f2-31e6763e0817-kube-api-access-hws4f\") pod \"barbican-keystone-listener-8569f64fdb-ctb42\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.586820 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.586860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2cwj\" (UniqueName: \"kubernetes.io/projected/dede6da5-d7c3-45f8-9431-41ee414e21d3-kube-api-access-v2cwj\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.586880 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data-custom\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.586923 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-combined-ca-bundle\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.586945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-internal-tls-certs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.586960 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dede6da5-d7c3-45f8-9431-41ee414e21d3-logs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.587000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-public-tls-certs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.589637 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-j4b5p"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.592411 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.603596 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.607258 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-j4b5p"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.628438 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.641374 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-676bcb667b-l4th4"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rcqw\" (UniqueName: \"kubernetes.io/projected/a4200695-7cfa-41e9-a0ff-948f4a767b81-kube-api-access-8rcqw\") pod \"nova-cell0-c5fb-account-create-update-j4b5p\" (UID: \"a4200695-7cfa-41e9-a0ff-948f4a767b81\") " pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724455 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724486 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2cwj\" (UniqueName: \"kubernetes.io/projected/dede6da5-d7c3-45f8-9431-41ee414e21d3-kube-api-access-v2cwj\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724511 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data-custom\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724551 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-combined-ca-bundle\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724581 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dede6da5-d7c3-45f8-9431-41ee414e21d3-logs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724598 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-internal-tls-certs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724665 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-public-tls-certs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.724830 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4200695-7cfa-41e9-a0ff-948f4a767b81-operator-scripts\") pod \"nova-cell0-c5fb-account-create-update-j4b5p\" (UID: \"a4200695-7cfa-41e9-a0ff-948f4a767b81\") " pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.730968 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dede6da5-d7c3-45f8-9431-41ee414e21d3-logs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.738909 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-combined-ca-bundle\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.739412 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-internal-tls-certs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.776723 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-bf8786687-hjsp2"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.793028 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8fb6-account-create-update-gmz58"]
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.800577 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.811621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data-custom\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.815206 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8fb6-account-create-update-gmz58"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.820341 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.823620 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-public-tls-certs\") pod \"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.829262 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 19 21:52:55 crc kubenswrapper[4771]: E0219 21:52:55.829650 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 19 21:52:55 crc kubenswrapper[4771]: E0219 21:52:55.829714 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data podName:c671dcf6-b1eb-4c4e-ba71-ae115ce811da nodeName:}" failed. No retries permitted until 2026-02-19 21:52:56.829699389 +0000 UTC m=+1477.101141859 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data") pod "rabbitmq-server-0" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da") : configmap "rabbitmq-config-data" not found
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.830063 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4200695-7cfa-41e9-a0ff-948f4a767b81-operator-scripts\") pod \"nova-cell0-c5fb-account-create-update-j4b5p\" (UID: \"a4200695-7cfa-41e9-a0ff-948f4a767b81\") " pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.830108 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rcqw\" (UniqueName: \"kubernetes.io/projected/a4200695-7cfa-41e9-a0ff-948f4a767b81-kube-api-access-8rcqw\") pod \"nova-cell0-c5fb-account-create-update-j4b5p\" (UID: \"a4200695-7cfa-41e9-a0ff-948f4a767b81\") " pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.831191 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4200695-7cfa-41e9-a0ff-948f4a767b81-operator-scripts\") pod \"nova-cell0-c5fb-account-create-update-j4b5p\" (UID: \"a4200695-7cfa-41e9-a0ff-948f4a767b81\") " pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p"
Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.835870 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2cwj\" (UniqueName: \"kubernetes.io/projected/dede6da5-d7c3-45f8-9431-41ee414e21d3-kube-api-access-v2cwj\") pod
\"barbican-api-676bcb667b-l4th4\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " pod="openstack/barbican-api-676bcb667b-l4th4" Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.842307 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3163-account-create-update-r6n24" Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.871535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rcqw\" (UniqueName: \"kubernetes.io/projected/a4200695-7cfa-41e9-a0ff-948f4a767b81-kube-api-access-8rcqw\") pod \"nova-cell0-c5fb-account-create-update-j4b5p\" (UID: \"a4200695-7cfa-41e9-a0ff-948f4a767b81\") " pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p" Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.898822 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8fb6-account-create-update-gmz58"] Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.925998 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.933765 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqxll\" (UniqueName: \"kubernetes.io/projected/0d819b52-ac9a-4713-bac8-2097460f3108-kube-api-access-bqxll\") pod \"placement-8fb6-account-create-update-gmz58\" (UID: \"0d819b52-ac9a-4713-bac8-2097460f3108\") " pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.933860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d819b52-ac9a-4713-bac8-2097460f3108-operator-scripts\") pod \"placement-8fb6-account-create-update-gmz58\" (UID: \"0d819b52-ac9a-4713-bac8-2097460f3108\") " pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:52:55 crc 
kubenswrapper[4771]: I0219 21:52:55.933978 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-qbq2m"] Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.945002 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.952281 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.968196 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8fb6-account-create-update-pqq7x"] Feb 19 21:52:55 crc kubenswrapper[4771]: I0219 21:52:55.987656 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-2zlx2"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.003629 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-676bcb667b-l4th4" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.035899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d819b52-ac9a-4713-bac8-2097460f3108-operator-scripts\") pod \"placement-8fb6-account-create-update-gmz58\" (UID: \"0d819b52-ac9a-4713-bac8-2097460f3108\") " pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.035963 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.036004 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hrzp\" (UniqueName: \"kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.036208 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqxll\" (UniqueName: \"kubernetes.io/projected/0d819b52-ac9a-4713-bac8-2097460f3108-kube-api-access-bqxll\") pod \"placement-8fb6-account-create-update-gmz58\" (UID: \"0d819b52-ac9a-4713-bac8-2097460f3108\") " pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.036373 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.036441 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data podName:17cd62c3-f3af-4144-b6b1-ec2cafb424ad nodeName:}" failed. No retries permitted until 2026-02-19 21:52:56.536425775 +0000 UTC m=+1476.807868245 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data") pod "rabbitmq-cell1-server-0" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.037646 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d819b52-ac9a-4713-bac8-2097460f3108-operator-scripts\") pod \"placement-8fb6-account-create-update-gmz58\" (UID: \"0d819b52-ac9a-4713-bac8-2097460f3108\") " pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.039745 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.040064 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-qbq2m"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.077652 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqxll\" (UniqueName: \"kubernetes.io/projected/0d819b52-ac9a-4713-bac8-2097460f3108-kube-api-access-bqxll\") pod \"placement-8fb6-account-create-update-gmz58\" (UID: \"0d819b52-ac9a-4713-bac8-2097460f3108\") " pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.100344 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8fb6-account-create-update-pqq7x"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.124727 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-2zlx2"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.138160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.138205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrzp\" (UniqueName: \"kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.138655 4771 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.138698 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts podName:8e57507e-3fcb-4193-9810-727d1f3e4405 nodeName:}" failed. No retries permitted until 2026-02-19 21:52:56.638685859 +0000 UTC m=+1476.910128329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts") pod "nova-cell1-70bf-account-create-update-qbq2m" (UID: "8e57507e-3fcb-4193-9810-727d1f3e4405") : configmap "openstack-cell1-scripts" not found Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.138807 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f977-account-create-update-d6pbn"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.140358 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.146309 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.155579 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.164413 4771 projected.go:194] Error preparing data for projected volume kube-api-access-8hrzp for pod openstack/nova-cell1-70bf-account-create-update-qbq2m: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.164470 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp podName:8e57507e-3fcb-4193-9810-727d1f3e4405 nodeName:}" failed. No retries permitted until 2026-02-19 21:52:56.664453819 +0000 UTC m=+1476.935896289 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8hrzp" (UniqueName: "kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp") pod "nova-cell1-70bf-account-create-update-qbq2m" (UID: "8e57507e-3fcb-4193-9810-727d1f3e4405") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.198195 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f977-account-create-update-d6pbn"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.248002 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.249650 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="openstack-network-exporter" containerID="cri-o://a0ca6423b8cc74730d8f188d3660eb639df435604c4bb4ca1789f192ebeca7cd" gracePeriod=300 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.261106 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerID="4db106a42f266426c6257e14ae4da294c4869692c1f876160b68b9126d307faf" exitCode=143 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.261143 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e","Type":"ContainerDied","Data":"4db106a42f266426c6257e14ae4da294c4869692c1f876160b68b9126d307faf"} Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.273048 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-jdsz9"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.289779 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-jdsz9"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 
21:52:56.308302 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8302-account-create-update-hcgps"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.311512 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.317228 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.343102 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6938-account-create-update-jvz85"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.344834 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.347849 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.369522 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhbp\" (UniqueName: \"kubernetes.io/projected/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-kube-api-access-7vhbp\") pod \"neutron-f977-account-create-update-d6pbn\" (UID: \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\") " pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.372858 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-operator-scripts\") pod \"neutron-f977-account-create-update-d6pbn\" (UID: \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\") " pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.380337 4771 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-6938-account-create-update-jvz85"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.386504 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="ovsdbserver-sb" containerID="cri-o://44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070" gracePeriod=300 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.436055 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8302-account-create-update-hcgps"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.448617 4771 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement-b5d9cf798-b8q9j" secret="" err="secret \"placement-placement-dockercfg-h2sgl\" not found" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.463777 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138981ec-af45-4888-a757-b666d60513d3" path="/var/lib/kubelet/pods/138981ec-af45-4888-a757-b666d60513d3/volumes" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.468082 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61c8262a-4240-4885-9d99-cdd8e0ed4c44" path="/var/lib/kubelet/pods/61c8262a-4240-4885-9d99-cdd8e0ed4c44/volumes" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.468686 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7135669-613c-45c3-90fa-f8eee17faa38" path="/var/lib/kubelet/pods/d7135669-613c-45c3-90fa-f8eee17faa38/volumes" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.469191 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63630cc-8a7e-4c82-a69b-3babe4a33e43" path="/var/lib/kubelet/pods/e63630cc-8a7e-4c82-a69b-3babe4a33e43/volumes" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.469650 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="f68b419d-929c-4ea3-a95c-dd6a436248a3" path="/var/lib/kubelet/pods/f68b419d-929c-4ea3-a95c-dd6a436248a3/volumes" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.476496 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-operator-scripts\") pod \"neutron-f977-account-create-update-d6pbn\" (UID: \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\") " pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.476605 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbhkr\" (UniqueName: \"kubernetes.io/projected/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-kube-api-access-vbhkr\") pod \"barbican-6938-account-create-update-jvz85\" (UID: \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\") " pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.476631 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vhbp\" (UniqueName: \"kubernetes.io/projected/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-kube-api-access-7vhbp\") pod \"neutron-f977-account-create-update-d6pbn\" (UID: \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\") " pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.476650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-operator-scripts\") pod \"barbican-6938-account-create-update-jvz85\" (UID: \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\") " pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.476668 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gb2j6\" (UniqueName: \"kubernetes.io/projected/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-kube-api-access-gb2j6\") pod \"cinder-8302-account-create-update-hcgps\" (UID: \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\") " pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.476691 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-operator-scripts\") pod \"cinder-8302-account-create-update-hcgps\" (UID: \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\") " pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.477507 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-operator-scripts\") pod \"neutron-f977-account-create-update-d6pbn\" (UID: \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\") " pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.498299 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f977-account-create-update-n8dq9"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.527479 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-n8v7z"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.537983 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vhbp\" (UniqueName: \"kubernetes.io/projected/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-kube-api-access-7vhbp\") pod \"neutron-f977-account-create-update-d6pbn\" (UID: \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\") " pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.538066 4771 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-f977-account-create-update-n8dq9"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.545200 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-n8v7z"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.561327 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8302-account-create-update-gjj56"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.576130 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8302-account-create-update-gjj56"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.578323 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-operator-scripts\") pod \"cinder-8302-account-create-update-hcgps\" (UID: \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\") " pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.578588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbhkr\" (UniqueName: \"kubernetes.io/projected/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-kube-api-access-vbhkr\") pod \"barbican-6938-account-create-update-jvz85\" (UID: \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\") " pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.578629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-operator-scripts\") pod \"barbican-6938-account-create-update-jvz85\" (UID: \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\") " pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.578649 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gb2j6\" (UniqueName: \"kubernetes.io/projected/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-kube-api-access-gb2j6\") pod \"cinder-8302-account-create-update-hcgps\" (UID: \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\") " pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.579103 4771 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.579147 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts podName:680c2961-e33c-44b1-aadd-37556bf4839c nodeName:}" failed. No retries permitted until 2026-02-19 21:52:57.079133143 +0000 UTC m=+1477.350575613 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts") pod "placement-b5d9cf798-b8q9j" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c") : secret "placement-scripts" not found Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.579659 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-operator-scripts\") pod \"cinder-8302-account-create-update-hcgps\" (UID: \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\") " pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.579697 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-operator-scripts\") pod \"barbican-6938-account-create-update-jvz85\" (UID: \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\") " pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.579752 4771 secret.go:188] Couldn't get secret 
openstack/placement-config-data: secret "placement-config-data" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.579781 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data podName:680c2961-e33c-44b1-aadd-37556bf4839c nodeName:}" failed. No retries permitted until 2026-02-19 21:52:57.0797738 +0000 UTC m=+1477.351216270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data") pod "placement-b5d9cf798-b8q9j" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c") : secret "placement-config-data" not found Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.583927 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4q67w"] Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.585079 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.585147 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data podName:17cd62c3-f3af-4144-b6b1-ec2cafb424ad nodeName:}" failed. No retries permitted until 2026-02-19 21:52:57.585130161 +0000 UTC m=+1477.856572631 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data") pod "rabbitmq-cell1-server-0" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.599004 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6938-account-create-update-jnb4q"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.619572 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6938-account-create-update-jnb4q"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.621103 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4q67w"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.643243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb2j6\" (UniqueName: \"kubernetes.io/projected/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-kube-api-access-gb2j6\") pod \"cinder-8302-account-create-update-hcgps\" (UID: \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\") " pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.646890 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-cbm2n"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.648449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbhkr\" (UniqueName: \"kubernetes.io/projected/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-kube-api-access-vbhkr\") pod \"barbican-6938-account-create-update-jvz85\" (UID: \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\") " pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.648511 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-cbm2n" 
podUID="5e98e55e-96cb-4936-8220-18db4047873e" containerName="openstack-network-exporter" containerID="cri-o://f081b25aaea300f6e060b204c908ec660e0941d0cadd8d054f4c1cb78c13b8e8" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.662344 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-94qm4"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.676015 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.691052 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.691222 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrzp\" (UniqueName: \"kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.691995 4771 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.692137 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts podName:8e57507e-3fcb-4193-9810-727d1f3e4405 nodeName:}" failed. No retries permitted until 2026-02-19 21:52:57.692111439 +0000 UTC m=+1477.963553909 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts") pod "nova-cell1-70bf-account-create-update-qbq2m" (UID: "8e57507e-3fcb-4193-9810-727d1f3e4405") : configmap "openstack-cell1-scripts" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.696329 4771 projected.go:194] Error preparing data for projected volume kube-api-access-8hrzp for pod openstack/nova-cell1-70bf-account-create-update-qbq2m: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.696396 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp podName:8e57507e-3fcb-4193-9810-727d1f3e4405 nodeName:}" failed. No retries permitted until 2026-02-19 21:52:57.696378402 +0000 UTC m=+1477.967820872 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hrzp" (UniqueName: "kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp") pod "nova-cell1-70bf-account-create-update-qbq2m" (UID: "8e57507e-3fcb-4193-9810-727d1f3e4405") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.707998 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-js7lh"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.749241 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.749797 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="ovn-northd" containerID="cri-o://afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: 
I0219 21:52:56.749997 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="openstack-network-exporter" containerID="cri-o://8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.790590 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gtz9t"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.797834 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gtz9t"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.806527 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-25p7k"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.814413 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-25p7k"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.814925 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.826465 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.859283 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.859733 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-server" containerID="cri-o://25638182c5db50cf9a8ae3158a1f97e75098b4e1dd04c4bf5390d848f2cf8116" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860133 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="swift-recon-cron" containerID="cri-o://50d7f0fe603dcd477e42bf6d5cd23293366abcc57115b1583128c070088d06e9" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860184 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="rsync" containerID="cri-o://f2531aed0329bf883f8ab6b773f55605090d3db1851e36c0cc40d24deefd3f69" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860217 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-expirer" containerID="cri-o://e632fa06b132d46b46064debfe1c40044370bb1e249759e09ae099922fe08194" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860248 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-updater" containerID="cri-o://37b58896182f7320a318dedbc51a682c252143cf1f5f7dc1cc5a4d0d41fd6a9e" gracePeriod=30 Feb 19 21:52:56 crc 
kubenswrapper[4771]: I0219 21:52:56.860278 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-auditor" containerID="cri-o://d59b28bcc6dd2c89348367ae07b2b6652ecabd2e3eb7e906cb4ce9dfbdfd12d9" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860307 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-replicator" containerID="cri-o://89041a4e9d9ebc9c0d58f88cef337d9f987ef35bcc4c225ef45ad3a183a71e6a" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860335 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-server" containerID="cri-o://56daf9ee26ebf4c3d4c6900b71838aba932c5f271a2526fe5d21871ad34fdb76" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860367 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-updater" containerID="cri-o://53da6e944f47c7e6208414ec28ec1aa3e842f3fe992ab24148a4550284b94cd8" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860398 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-auditor" containerID="cri-o://793312b4189c234b06ff32b07e8e0d5dfc74266fc0a1de9300ef68cf6768b6c2" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860427 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-replicator" 
containerID="cri-o://3e4e004099ba936950340e5e238183d77b40a39f4a97474d8ddb30dfaefb94a9" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860456 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-server" containerID="cri-o://eb415e751672ce67a387f56913bb8398561b2f551eec4c660857fbfeb0059682" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860487 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-reaper" containerID="cri-o://f711f4133d60e80e95d60ebf53d43892d134a7d9eb5c4126f1c2d737d2fa0c70" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860530 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-auditor" containerID="cri-o://0d8112d0d11bb87f970617cd7d38234c852cb777a0ea50e9330bd9eec7a4c5b5" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.860572 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-replicator" containerID="cri-o://b55b4a1bb4d3ccc6512f177b1e2310a9648853f7b9b1181b71f4d9e34527c221" gracePeriod=30 Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.879378 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bvxqv"] Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.895854 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:52:56 crc kubenswrapper[4771]: E0219 21:52:56.895917 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data podName:c671dcf6-b1eb-4c4e-ba71-ae115ce811da nodeName:}" failed. No retries permitted until 2026-02-19 21:52:58.895901957 +0000 UTC m=+1479.167344427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data") pod "rabbitmq-server-0" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da") : configmap "rabbitmq-config-data" not found Feb 19 21:52:56 crc kubenswrapper[4771]: I0219 21:52:56.956667 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bvxqv"] Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.104433 4771 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.104713 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data podName:680c2961-e33c-44b1-aadd-37556bf4839c nodeName:}" failed. No retries permitted until 2026-02-19 21:52:58.104699177 +0000 UTC m=+1478.376141647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data") pod "placement-b5d9cf798-b8q9j" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c") : secret "placement-config-data" not found Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.105111 4771 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.105148 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts podName:680c2961-e33c-44b1-aadd-37556bf4839c nodeName:}" failed. 
No retries permitted until 2026-02-19 21:52:58.105137329 +0000 UTC m=+1478.376579799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts") pod "placement-b5d9cf798-b8q9j" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c") : secret "placement-scripts" not found Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.168420 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-hs22q"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.256077 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-hs22q"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.301001 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.301517 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerName="openstack-network-exporter" containerID="cri-o://a6a3ea0a05c4cc921bbd48b56d16a1efbe4f22bdc52aaa4b06df43fbc2c61851" gracePeriod=300 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.323153 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-8s8dd"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.323509 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" podUID="74f56b2b-3ff3-4a35-a237-180c42816388" containerName="dnsmasq-dns" containerID="cri-o://5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad" gracePeriod=10 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.329329 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.329569 4771 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/glance-default-external-api-0" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerName="glance-log" containerID="cri-o://286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.329958 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerName="glance-httpd" containerID="cri-o://3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379177 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="e632fa06b132d46b46064debfe1c40044370bb1e249759e09ae099922fe08194" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379355 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="37b58896182f7320a318dedbc51a682c252143cf1f5f7dc1cc5a4d0d41fd6a9e" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379407 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="d59b28bcc6dd2c89348367ae07b2b6652ecabd2e3eb7e906cb4ce9dfbdfd12d9" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379458 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="89041a4e9d9ebc9c0d58f88cef337d9f987ef35bcc4c225ef45ad3a183a71e6a" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379532 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="53da6e944f47c7e6208414ec28ec1aa3e842f3fe992ab24148a4550284b94cd8" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379589 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="793312b4189c234b06ff32b07e8e0d5dfc74266fc0a1de9300ef68cf6768b6c2" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379657 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="3e4e004099ba936950340e5e238183d77b40a39f4a97474d8ddb30dfaefb94a9" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379710 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="f711f4133d60e80e95d60ebf53d43892d134a7d9eb5c4126f1c2d737d2fa0c70" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379757 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="0d8112d0d11bb87f970617cd7d38234c852cb777a0ea50e9330bd9eec7a4c5b5" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379814 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="b55b4a1bb4d3ccc6512f177b1e2310a9648853f7b9b1181b71f4d9e34527c221" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.379957 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"e632fa06b132d46b46064debfe1c40044370bb1e249759e09ae099922fe08194"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.380057 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"37b58896182f7320a318dedbc51a682c252143cf1f5f7dc1cc5a4d0d41fd6a9e"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.380178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"d59b28bcc6dd2c89348367ae07b2b6652ecabd2e3eb7e906cb4ce9dfbdfd12d9"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.380244 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"89041a4e9d9ebc9c0d58f88cef337d9f987ef35bcc4c225ef45ad3a183a71e6a"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.380310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"53da6e944f47c7e6208414ec28ec1aa3e842f3fe992ab24148a4550284b94cd8"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.380384 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"793312b4189c234b06ff32b07e8e0d5dfc74266fc0a1de9300ef68cf6768b6c2"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.380446 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"3e4e004099ba936950340e5e238183d77b40a39f4a97474d8ddb30dfaefb94a9"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.380514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"f711f4133d60e80e95d60ebf53d43892d134a7d9eb5c4126f1c2d737d2fa0c70"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.380576 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"0d8112d0d11bb87f970617cd7d38234c852cb777a0ea50e9330bd9eec7a4c5b5"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 
21:52:57.380631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"b55b4a1bb4d3ccc6512f177b1e2310a9648853f7b9b1181b71f4d9e34527c221"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.397998 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-j7wbb"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.405546 4771 generic.go:334] "Generic (PLEG): container finished" podID="8efa5dd0-e91b-456b-b295-e45608c03c36" containerID="da66b35c116d28899b782ad55eca69eeed804e2fa899b61d3bf29d45b676830f" exitCode=137 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.434719 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d59b620-a9f9-4539-98e2-a4ad4d97d442/ovsdbserver-sb/0.log" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.434907 4771 generic.go:334] "Generic (PLEG): container finished" podID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerID="a0ca6423b8cc74730d8f188d3660eb639df435604c4bb4ca1789f192ebeca7cd" exitCode=2 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.434998 4771 generic.go:334] "Generic (PLEG): container finished" podID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerID="44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070" exitCode=143 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.434976 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d59b620-a9f9-4539-98e2-a4ad4d97d442","Type":"ContainerDied","Data":"a0ca6423b8cc74730d8f188d3660eb639df435604c4bb4ca1789f192ebeca7cd"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.435194 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d59b620-a9f9-4539-98e2-a4ad4d97d442","Type":"ContainerDied","Data":"44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070"} 
Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.446885 4771 generic.go:334] "Generic (PLEG): container finished" podID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerID="53dafb13a5c390196d3cb19b500110a7ca153d256650b8c84a399c53e300934e" exitCode=0 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.451278 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e1653c4-41e5-403a-ae0d-9c1a5762f287","Type":"ContainerDied","Data":"53dafb13a5c390196d3cb19b500110a7ca153d256650b8c84a399c53e300934e"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.451322 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-j7wbb"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.473316 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cbm2n_5e98e55e-96cb-4936-8220-18db4047873e/openstack-network-exporter/0.log" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.473349 4771 generic.go:334] "Generic (PLEG): container finished" podID="5e98e55e-96cb-4936-8220-18db4047873e" containerID="f081b25aaea300f6e060b204c908ec660e0941d0cadd8d054f4c1cb78c13b8e8" exitCode=2 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.473392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cbm2n" event={"ID":"5e98e55e-96cb-4936-8220-18db4047873e","Type":"ContainerDied","Data":"f081b25aaea300f6e060b204c908ec660e0941d0cadd8d054f4c1cb78c13b8e8"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.473448 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerName="ovsdbserver-nb" containerID="cri-o://714fd91309191f6caf2b8f7b6db8af29e9a164f11470713344c8212bb188178d" gracePeriod=300 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.508516 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerID="8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292" exitCode=2 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.508558 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa","Type":"ContainerDied","Data":"8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292"} Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.533304 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.533517 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" containerName="glance-log" containerID="cri-o://4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.533650 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" containerName="glance-httpd" containerID="cri-o://8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.539872 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b5d9cf798-b8q9j"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.540085 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-b5d9cf798-b8q9j" podUID="680c2961-e33c-44b1-aadd-37556bf4839c" containerName="placement-log" containerID="cri-o://eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.540194 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-b5d9cf798-b8q9j" podUID="680c2961-e33c-44b1-aadd-37556bf4839c" containerName="placement-api" containerID="cri-o://0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.541524 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-37b58896182f7320a318dedbc51a682c252143cf1f5f7dc1cc5a4d0d41fd6a9e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e1653c4_41e5_403a_ae0d_9c1a5762f287.slice/crio-conmon-53dafb13a5c390196d3cb19b500110a7ca153d256650b8c84a399c53e300934e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d59b620_a9f9_4539_98e2_a4ad4d97d442.slice/crio-conmon-44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-0d8112d0d11bb87f970617cd7d38234c852cb777a0ea50e9330bd9eec7a4c5b5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56b8bc52_11f9_4adf_8ba3_fbe39197b5aa.slice/crio-conmon-8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-3e4e004099ba936950340e5e238183d77b40a39f4a97474d8ddb30dfaefb94a9.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-conmon-f2531aed0329bf883f8ab6b773f55605090d3db1851e36c0cc40d24deefd3f69.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-conmon-89041a4e9d9ebc9c0d58f88cef337d9f987ef35bcc4c225ef45ad3a183a71e6a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8efa5dd0_e91b_456b_b295_e45608c03c36.slice/crio-conmon-da66b35c116d28899b782ad55eca69eeed804e2fa899b61d3bf29d45b676830f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e98e55e_96cb_4936_8220_18db4047873e.slice/crio-f081b25aaea300f6e060b204c908ec660e0941d0cadd8d054f4c1cb78c13b8e8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74f56b2b_3ff3_4a35_a237_180c42816388.slice/crio-5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8efa5dd0_e91b_456b_b295_e45608c03c36.slice/crio-da66b35c116d28899b782ad55eca69eeed804e2fa899b61d3bf29d45b676830f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-conmon-3e4e004099ba936950340e5e238183d77b40a39f4a97474d8ddb30dfaefb94a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-conmon-f711f4133d60e80e95d60ebf53d43892d134a7d9eb5c4126f1c2d737d2fa0c70.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-conmon-d59b28bcc6dd2c89348367ae07b2b6652ecabd2e3eb7e906cb4ce9dfbdfd12d9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-conmon-53da6e944f47c7e6208414ec28ec1aa3e842f3fe992ab24148a4550284b94cd8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-conmon-0d8112d0d11bb87f970617cd7d38234c852cb777a0ea50e9330bd9eec7a4c5b5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-b55b4a1bb4d3ccc6512f177b1e2310a9648853f7b9b1181b71f4d9e34527c221.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d59b620_a9f9_4539_98e2_a4ad4d97d442.slice/crio-44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-d59b28bcc6dd2c89348367ae07b2b6652ecabd2e3eb7e906cb4ce9dfbdfd12d9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-conmon-793312b4189c234b06ff32b07e8e0d5dfc74266fc0a1de9300ef68cf6768b6c2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-793312b4189c234b06ff32b07e8e0d5dfc74266fc0a1de9300ef68cf6768b6c2.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56b8bc52_11f9_4adf_8ba3_fbe39197b5aa.slice/crio-8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-f711f4133d60e80e95d60ebf53d43892d134a7d9eb5c4126f1c2d737d2fa0c70.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5580e95c_81dc_4c90_bb0c_9b27a4a8c971.slice/crio-89041a4e9d9ebc9c0d58f88cef337d9f987ef35bcc4c225ef45ad3a183a71e6a.scope\": RecentStats: unable to find data in memory cache]" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.548142 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.559294 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lbm9h"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.582650 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lbm9h"] Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.589235 4771 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 19 21:52:57 crc kubenswrapper[4771]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:52:57 crc kubenswrapper[4771]: + source /usr/local/bin/container-scripts/functions Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNBridge=br-int Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNEncapType=geneve Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNAvailabilityZones= Feb 19 21:52:57 crc kubenswrapper[4771]: ++ EnableChassisAsGateway=true Feb 19 21:52:57 crc kubenswrapper[4771]: ++ 
PhysicalNetworks= Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNHostName= Feb 19 21:52:57 crc kubenswrapper[4771]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 19 21:52:57 crc kubenswrapper[4771]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:52:57 crc kubenswrapper[4771]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:52:57 crc kubenswrapper[4771]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:52:57 crc kubenswrapper[4771]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:52:57 crc kubenswrapper[4771]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:52:57 crc kubenswrapper[4771]: + sleep 0.5 Feb 19 21:52:57 crc kubenswrapper[4771]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:52:57 crc kubenswrapper[4771]: + cleanup_ovsdb_server_semaphore Feb 19 21:52:57 crc kubenswrapper[4771]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:52:57 crc kubenswrapper[4771]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:52:57 crc kubenswrapper[4771]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-94qm4" message=< Feb 19 21:52:57 crc kubenswrapper[4771]: Exiting ovsdb-server (5) [ OK ] Feb 19 21:52:57 crc kubenswrapper[4771]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:52:57 crc kubenswrapper[4771]: + source /usr/local/bin/container-scripts/functions Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNBridge=br-int Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNEncapType=geneve Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNAvailabilityZones= Feb 19 21:52:57 crc kubenswrapper[4771]: ++ EnableChassisAsGateway=true Feb 19 21:52:57 crc kubenswrapper[4771]: ++ PhysicalNetworks= Feb 19 21:52:57 crc kubenswrapper[4771]: 
++ OVNHostName= Feb 19 21:52:57 crc kubenswrapper[4771]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 19 21:52:57 crc kubenswrapper[4771]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:52:57 crc kubenswrapper[4771]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:52:57 crc kubenswrapper[4771]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:52:57 crc kubenswrapper[4771]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:52:57 crc kubenswrapper[4771]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:52:57 crc kubenswrapper[4771]: + sleep 0.5 Feb 19 21:52:57 crc kubenswrapper[4771]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:52:57 crc kubenswrapper[4771]: + cleanup_ovsdb_server_semaphore Feb 19 21:52:57 crc kubenswrapper[4771]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:52:57 crc kubenswrapper[4771]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:52:57 crc kubenswrapper[4771]: > Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.589270 4771 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 19 21:52:57 crc kubenswrapper[4771]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:52:57 crc kubenswrapper[4771]: + source /usr/local/bin/container-scripts/functions Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNBridge=br-int Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNEncapType=geneve Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNAvailabilityZones= Feb 19 21:52:57 crc kubenswrapper[4771]: ++ EnableChassisAsGateway=true Feb 19 21:52:57 crc kubenswrapper[4771]: ++ PhysicalNetworks= Feb 19 21:52:57 crc kubenswrapper[4771]: ++ OVNHostName= Feb 19 21:52:57 crc kubenswrapper[4771]: ++ 
DB_FILE=/etc/openvswitch/conf.db Feb 19 21:52:57 crc kubenswrapper[4771]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:52:57 crc kubenswrapper[4771]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:52:57 crc kubenswrapper[4771]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:52:57 crc kubenswrapper[4771]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:52:57 crc kubenswrapper[4771]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:52:57 crc kubenswrapper[4771]: + sleep 0.5 Feb 19 21:52:57 crc kubenswrapper[4771]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:52:57 crc kubenswrapper[4771]: + cleanup_ovsdb_server_semaphore Feb 19 21:52:57 crc kubenswrapper[4771]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:52:57 crc kubenswrapper[4771]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:52:57 crc kubenswrapper[4771]: > pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server" containerID="cri-o://e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.589299 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server" containerID="cri-o://e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.602263 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3163-account-create-update-r6n24"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.612858 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.639301 4771 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/neutron-6c4d6b9785-889jt"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.639563 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c4d6b9785-889jt" podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-api" containerID="cri-o://28873d69596e9435ccc225be2172f05b9359e76cc1602b3c9c36e2902b461021" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.639901 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c4d6b9785-889jt" podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-httpd" containerID="cri-o://89ed7fd8b6567c5530e1c7e743cf565b8674ecca14b4cf2d6f7229ab2c36478e" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.665096 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.665159 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data podName:17cd62c3-f3af-4144-b6b1-ec2cafb424ad nodeName:}" failed. No retries permitted until 2026-02-19 21:52:59.665145214 +0000 UTC m=+1479.936587684 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data") pod "rabbitmq-cell1-server-0" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.683944 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" podUID="74f56b2b-3ff3-4a35-a237-180c42816388" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.207:5353: connect: connection refused" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.743337 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.743700 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-log" containerID="cri-o://8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.743898 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-metadata" containerID="cri-o://b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.759586 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-pckmk"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.763478 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cbm2n_5e98e55e-96cb-4936-8220-18db4047873e/openstack-network-exporter/0.log" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.763561 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.768795 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.768843 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrzp\" (UniqueName: \"kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.770278 4771 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.770340 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts podName:8e57507e-3fcb-4193-9810-727d1f3e4405 nodeName:}" failed. No retries permitted until 2026-02-19 21:52:59.770323394 +0000 UTC m=+1480.041765864 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts") pod "nova-cell1-70bf-account-create-update-qbq2m" (UID: "8e57507e-3fcb-4193-9810-727d1f3e4405") : configmap "openstack-cell1-scripts" not found Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.785130 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-pckmk"] Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.791868 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070 is running failed: container process not found" containerID="44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.791900 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerName="rabbitmq" containerID="cri-o://2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0" gracePeriod=604800 Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.802623 4771 projected.go:194] Error preparing data for projected volume kube-api-access-8hrzp for pod openstack/nova-cell1-70bf-account-create-update-qbq2m: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.802698 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp podName:8e57507e-3fcb-4193-9810-727d1f3e4405 nodeName:}" failed. No retries permitted until 2026-02-19 21:52:59.80268046 +0000 UTC m=+1480.074122930 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8hrzp" (UniqueName: "kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp") pod "nova-cell1-70bf-account-create-update-qbq2m" (UID: "8e57507e-3fcb-4193-9810-727d1f3e4405") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.800222 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070 is running failed: container process not found" containerID="44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.817418 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070 is running failed: container process not found" containerID="44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 19 21:52:57 crc kubenswrapper[4771]: E0219 21:52:57.817493 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="ovsdbserver-sb" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.851005 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd" 
containerID="cri-o://daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" gracePeriod=29 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.870129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-metrics-certs-tls-certs\") pod \"5e98e55e-96cb-4936-8220-18db4047873e\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.870312 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9qft\" (UniqueName: \"kubernetes.io/projected/5e98e55e-96cb-4936-8220-18db4047873e-kube-api-access-x9qft\") pod \"5e98e55e-96cb-4936-8220-18db4047873e\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.870519 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovs-rundir\") pod \"5e98e55e-96cb-4936-8220-18db4047873e\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.870727 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e98e55e-96cb-4936-8220-18db4047873e-config\") pod \"5e98e55e-96cb-4936-8220-18db4047873e\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.870761 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-combined-ca-bundle\") pod \"5e98e55e-96cb-4936-8220-18db4047873e\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.871263 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovn-rundir\") pod \"5e98e55e-96cb-4936-8220-18db4047873e\" (UID: \"5e98e55e-96cb-4936-8220-18db4047873e\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.872214 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "5e98e55e-96cb-4936-8220-18db4047873e" (UID: "5e98e55e-96cb-4936-8220-18db4047873e"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.873133 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "5e98e55e-96cb-4936-8220-18db4047873e" (UID: "5e98e55e-96cb-4936-8220-18db4047873e"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.874117 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e98e55e-96cb-4936-8220-18db4047873e-config" (OuterVolumeSpecName: "config") pod "5e98e55e-96cb-4936-8220-18db4047873e" (UID: "5e98e55e-96cb-4936-8220-18db4047873e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.874542 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.875845 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8fb6-account-create-update-gmz58"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.882642 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5s24g"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.884480 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e98e55e-96cb-4936-8220-18db4047873e-kube-api-access-x9qft" (OuterVolumeSpecName: "kube-api-access-x9qft") pod "5e98e55e-96cb-4936-8220-18db4047873e" (UID: "5e98e55e-96cb-4936-8220-18db4047873e"). InnerVolumeSpecName "kube-api-access-x9qft". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.905940 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5s24g"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.912008 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.922667 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c7696df7-bt9vd"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.922884 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c7696df7-bt9vd" podUID="13181273-c19b-46fa-a087-da5695148b1f" containerName="proxy-httpd" containerID="cri-o://4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.923373 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6c7696df7-bt9vd" podUID="13181273-c19b-46fa-a087-da5695148b1f" containerName="proxy-server" containerID="cri-o://4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.940437 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.940633 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-log" containerID="cri-o://89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.941156 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-api" containerID="cri-o://e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b" gracePeriod=30 Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.951532 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d59b620-a9f9-4539-98e2-a4ad4d97d442/ovsdbserver-sb/0.log" Feb 19 21:52:57 crc 
kubenswrapper[4771]: I0219 21:52:57.951594 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.968109 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xgm28"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.980076 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5v8b\" (UniqueName: \"kubernetes.io/projected/8efa5dd0-e91b-456b-b295-e45608c03c36-kube-api-access-k5v8b\") pod \"8efa5dd0-e91b-456b-b295-e45608c03c36\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.980133 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config\") pod \"8efa5dd0-e91b-456b-b295-e45608c03c36\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.980197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config-secret\") pod \"8efa5dd0-e91b-456b-b295-e45608c03c36\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.980297 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-combined-ca-bundle\") pod \"8efa5dd0-e91b-456b-b295-e45608c03c36\" (UID: \"8efa5dd0-e91b-456b-b295-e45608c03c36\") " Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.980816 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9qft\" (UniqueName: 
\"kubernetes.io/projected/5e98e55e-96cb-4936-8220-18db4047873e-kube-api-access-x9qft\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.980835 4771 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5e98e55e-96cb-4936-8220-18db4047873e-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.980845 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e98e55e-96cb-4936-8220-18db4047873e-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.981424 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xgm28"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.992399 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-71de-account-create-update-2djdv"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.996955 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-71de-account-create-update-2djdv"] Feb 19 21:52:57 crc kubenswrapper[4771]: I0219 21:52:57.999880 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8efa5dd0-e91b-456b-b295-e45608c03c36-kube-api-access-k5v8b" (OuterVolumeSpecName: "kube-api-access-k5v8b") pod "8efa5dd0-e91b-456b-b295-e45608c03c36" (UID: "8efa5dd0-e91b-456b-b295-e45608c03c36"). InnerVolumeSpecName "kube-api-access-k5v8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.005463 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-qwjfh"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.013807 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qwjfh"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.045691 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-qbq2m"] Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.046392 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8hrzp operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" podUID="8e57507e-3fcb-4193-9810-727d1f3e4405" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.059215 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-56s9g"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.068119 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-56s9g"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.071105 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e98e55e-96cb-4936-8220-18db4047873e" (UID: "5e98e55e-96cb-4936-8220-18db4047873e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.079254 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.079390 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f977-account-create-update-d6pbn"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.081822 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-metrics-certs-tls-certs\") pod \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.081933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.081978 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdbserver-sb-tls-certs\") pod \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.082078 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-combined-ca-bundle\") pod \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.082103 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tmwv\" (UniqueName: \"kubernetes.io/projected/9d59b620-a9f9-4539-98e2-a4ad4d97d442-kube-api-access-9tmwv\") pod \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\" 
(UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.082125 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-config\") pod \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.082166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-scripts\") pod \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.083104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-config" (OuterVolumeSpecName: "config") pod "9d59b620-a9f9-4539-98e2-a4ad4d97d442" (UID: "9d59b620-a9f9-4539-98e2-a4ad4d97d442"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.084517 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9d59b620-a9f9-4539-98e2-a4ad4d97d442" (UID: "9d59b620-a9f9-4539-98e2-a4ad4d97d442"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.086006 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-scripts" (OuterVolumeSpecName: "scripts") pod "9d59b620-a9f9-4539-98e2-a4ad4d97d442" (UID: "9d59b620-a9f9-4539-98e2-a4ad4d97d442"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.082217 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdb-rundir\") pod \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\" (UID: \"9d59b620-a9f9-4539-98e2-a4ad4d97d442\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.092746 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.092778 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.092788 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d59b620-a9f9-4539-98e2-a4ad4d97d442-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.092799 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.092811 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5v8b\" (UniqueName: \"kubernetes.io/projected/8efa5dd0-e91b-456b-b295-e45608c03c36-kube-api-access-k5v8b\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.092888 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod 
"9d59b620-a9f9-4539-98e2-a4ad4d97d442" (UID: "9d59b620-a9f9-4539-98e2-a4ad4d97d442"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.094315 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d59b620-a9f9-4539-98e2-a4ad4d97d442-kube-api-access-9tmwv" (OuterVolumeSpecName: "kube-api-access-9tmwv") pod "9d59b620-a9f9-4539-98e2-a4ad4d97d442" (UID: "9d59b620-a9f9-4539-98e2-a4ad4d97d442"). InnerVolumeSpecName "kube-api-access-9tmwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.112532 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" containerName="galera" containerID="cri-o://8bb34e090626c241dae7a24235616f7cf7f7c5d4a8bd8d21f038334a18fdd9f1" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.112642 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.112878 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8c8e505e-8c04-4299-a2d1-89a5cb544b81" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.123448 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5p7sb"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.137927 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8302-account-create-update-hcgps"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.144242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8efa5dd0-e91b-456b-b295-e45608c03c36" (UID: "8efa5dd0-e91b-456b-b295-e45608c03c36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.144338 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d59b620-a9f9-4539-98e2-a4ad4d97d442" (UID: "9d59b620-a9f9-4539-98e2-a4ad4d97d442"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.144860 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8efa5dd0-e91b-456b-b295-e45608c03c36" (UID: "8efa5dd0-e91b-456b-b295-e45608c03c36"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.150130 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5p7sb"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.164797 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zpprp"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.172833 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zpprp"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.182589 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-j4b5p"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.193960 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dgk76"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.194264 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-nb\") pod \"74f56b2b-3ff3-4a35-a237-180c42816388\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.194334 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-config\") pod \"74f56b2b-3ff3-4a35-a237-180c42816388\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.194427 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-sb\") pod \"74f56b2b-3ff3-4a35-a237-180c42816388\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.194508 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-svc\") pod \"74f56b2b-3ff3-4a35-a237-180c42816388\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.194582 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-swift-storage-0\") pod \"74f56b2b-3ff3-4a35-a237-180c42816388\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.194647 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7hp4\" (UniqueName: \"kubernetes.io/projected/74f56b2b-3ff3-4a35-a237-180c42816388-kube-api-access-v7hp4\") pod \"74f56b2b-3ff3-4a35-a237-180c42816388\" (UID: \"74f56b2b-3ff3-4a35-a237-180c42816388\") " Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.195127 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tmwv\" (UniqueName: \"kubernetes.io/projected/9d59b620-a9f9-4539-98e2-a4ad4d97d442-kube-api-access-9tmwv\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.195152 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.195162 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.195170 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.195189 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.195470 4771 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.195527 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data podName:680c2961-e33c-44b1-aadd-37556bf4839c nodeName:}" failed. No retries permitted until 2026-02-19 21:53:00.195511525 +0000 UTC m=+1480.466953995 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data") pod "placement-b5d9cf798-b8q9j" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c") : secret "placement-config-data" not found Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.195755 4771 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.195933 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts podName:680c2961-e33c-44b1-aadd-37556bf4839c nodeName:}" failed. No retries permitted until 2026-02-19 21:53:00.195912956 +0000 UTC m=+1480.467355436 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts") pod "placement-b5d9cf798-b8q9j" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c") : secret "placement-scripts" not found Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.202905 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dgk76"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.204622 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f56b2b-3ff3-4a35-a237-180c42816388-kube-api-access-v7hp4" (OuterVolumeSpecName: "kube-api-access-v7hp4") pod "74f56b2b-3ff3-4a35-a237-180c42816388" (UID: "74f56b2b-3ff3-4a35-a237-180c42816388"). InnerVolumeSpecName "kube-api-access-v7hp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.213055 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.223181 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6938-account-create-update-jvz85"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.230985 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-bf8786687-hjsp2"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.249288 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7965bd8f87-ttph6"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.249516 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7965bd8f87-ttph6" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerName="barbican-worker-log" containerID="cri-o://4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.249875 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7965bd8f87-ttph6" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerName="barbican-worker" containerID="cri-o://0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.255793 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.255967 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="98bf8c7c-234c-47ea-86aa-4b1313f7b983" containerName="nova-scheduler-scheduler" containerID="cri-o://6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.264285 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8c658556-bs9wt"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.264690 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" containerName="barbican-keystone-listener-log" containerID="cri-o://8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.264714 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" containerName="barbican-keystone-listener" containerID="cri-o://28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.270608 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8569f64fdb-ctb42"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 
21:52:58.279857 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-676bcb667b-l4th4"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.297005 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7hp4\" (UniqueName: \"kubernetes.io/projected/74f56b2b-3ff3-4a35-a237-180c42816388-kube-api-access-v7hp4\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.303116 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-dbcb5799b-x6nw7"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.303341 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-dbcb5799b-x6nw7" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api-log" containerID="cri-o://4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.303385 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-dbcb5799b-x6nw7" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api" containerID="cri-o://8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.340477 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.340874 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="d56d3128-877b-4d8d-a48f-42a7b83e9347" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.348719 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jpvrl"] Feb 19 
21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.352091 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jpvrl"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.359143 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rl7qm"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.364806 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.365111 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="4938b4a1-32f8-4e7a-b334-8ce3b649fb46" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc" gracePeriod=30 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.373117 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sv7nf"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.379112 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rl7qm"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.433394 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-bf8786687-hjsp2"] Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.442303 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "5e98e55e-96cb-4936-8220-18db4047873e" (UID: "5e98e55e-96cb-4936-8220-18db4047873e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.502715 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a50dde-d805-4a5a-8340-1d5b5ede00eb" path="/var/lib/kubelet/pods/11a50dde-d805-4a5a-8340-1d5b5ede00eb/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.504377 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2edbc87a-44f8-47ab-a23f-c9acd74e7e5a" path="/var/lib/kubelet/pods/2edbc87a-44f8-47ab-a23f-c9acd74e7e5a/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.505229 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5" path="/var/lib/kubelet/pods/32464c1b-bbca-48ed-a9c4-f7e0b41cdbc5/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.506480 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33435064-417d-458a-ba36-b343066cfbed" path="/var/lib/kubelet/pods/33435064-417d-458a-ba36-b343066cfbed/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.506655 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e98e55e-96cb-4936-8220-18db4047873e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.507272 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480beb32-325d-4cdd-bda8-428a55bcf4d4" path="/var/lib/kubelet/pods/480beb32-325d-4cdd-bda8-428a55bcf4d4/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.508424 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f7e5bf4-6d90-4348-bba6-0d062714aace" path="/var/lib/kubelet/pods/4f7e5bf4-6d90-4348-bba6-0d062714aace/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.509344 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="679ad423-9540-41da-8fc6-02004a615642" path="/var/lib/kubelet/pods/679ad423-9540-41da-8fc6-02004a615642/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.510623 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76514a37-a198-4afe-b306-46a87ed1f21f" path="/var/lib/kubelet/pods/76514a37-a198-4afe-b306-46a87ed1f21f/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.518858 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80696dba-d7cc-41e6-ba67-4299cdd675ff" path="/var/lib/kubelet/pods/80696dba-d7cc-41e6-ba67-4299cdd675ff/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.520103 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3d5744-034c-43b6-851e-ffc1fb4eca48" path="/var/lib/kubelet/pods/8a3d5744-034c-43b6-851e-ffc1fb4eca48/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.520930 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989cc18d-2aca-4d05-a256-8908d0c5ac60" path="/var/lib/kubelet/pods/989cc18d-2aca-4d05-a256-8908d0c5ac60/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.521569 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af945941-8fd6-4601-a954-899b8fb66625" path="/var/lib/kubelet/pods/af945941-8fd6-4601-a954-899b8fb66625/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.523325 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd70791-ecce-405a-ba45-968a1e967ef3" path="/var/lib/kubelet/pods/afd70791-ecce-405a-ba45-968a1e967ef3/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.524063 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6263f86-8184-4c2a-b2b0-80cfecba212d" path="/var/lib/kubelet/pods/b6263f86-8184-4c2a-b2b0-80cfecba212d/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.524969 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bc00d259-80b7-4598-93f5-00f2ac227c87" path="/var/lib/kubelet/pods/bc00d259-80b7-4598-93f5-00f2ac227c87/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.528089 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb551a27-bdd4-433e-9d91-48ce01e7592c" path="/var/lib/kubelet/pods/cb551a27-bdd4-433e-9d91-48ce01e7592c/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.531409 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02" path="/var/lib/kubelet/pods/d2cfedeb-5cb5-4fc7-9bcf-eedd63c5bd02/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.535366 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2fcae0a-de14-4853-83b4-7862f9fd5d53" path="/var/lib/kubelet/pods/d2fcae0a-de14-4853-83b4-7862f9fd5d53/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.536924 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d938b633-a2fc-4b3a-a2d1-6e080af4a3aa" path="/var/lib/kubelet/pods/d938b633-a2fc-4b3a-a2d1-6e080af4a3aa/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.538219 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc226cf-f7e9-47c7-b553-b6645f504d4d" path="/var/lib/kubelet/pods/dcc226cf-f7e9-47c7-b553-b6645f504d4d/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.541900 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b" path="/var/lib/kubelet/pods/ecc8f9b0-edc8-4048-91e7-1d0bfcdc006b/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: W0219 21:52:58.543231 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d819b52_ac9a_4713_bac8_2097460f3108.slice/crio-32195ebea052eeaf3dea950a1101743708d94b441072e4fb64fca2cba5386c8f WatchSource:0}: Error finding container 
32195ebea052eeaf3dea950a1101743708d94b441072e4fb64fca2cba5386c8f: Status 404 returned error can't find the container with id 32195ebea052eeaf3dea950a1101743708d94b441072e4fb64fca2cba5386c8f Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.544957 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc115457-bb0d-4eb1-accb-43b6c88c79ef" path="/var/lib/kubelet/pods/fc115457-bb0d-4eb1-accb-43b6c88c79ef/volumes" Feb 19 21:52:58 crc kubenswrapper[4771]: W0219 21:52:58.545150 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4200695_7cfa_41e9_a0ff_948f4a767b81.slice/crio-a526fae19ac19109c2d7f2282715cd2f8cf2169cbbd16b1879dd131ea82434db WatchSource:0}: Error finding container a526fae19ac19109c2d7f2282715cd2f8cf2169cbbd16b1879dd131ea82434db: Status 404 returned error can't find the container with id a526fae19ac19109c2d7f2282715cd2f8cf2169cbbd16b1879dd131ea82434db Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.549913 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8efa5dd0-e91b-456b-b295-e45608c03c36" (UID: "8efa5dd0-e91b-456b-b295-e45608c03c36"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.565503 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.577435 4771 generic.go:334] "Generic (PLEG): container finished" podID="b5526752-6549-40aa-8443-9ad3572799d2" containerID="89ed7fd8b6567c5530e1c7e743cf565b8674ecca14b4cf2d6f7229ab2c36478e" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.593940 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74f56b2b-3ff3-4a35-a237-180c42816388" (UID: "74f56b2b-3ff3-4a35-a237-180c42816388"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.606521 4771 generic.go:334] "Generic (PLEG): container finished" podID="b64cc52a-4f20-4e05-b444-c46f97727527" containerID="4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.608110 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8efa5dd0-e91b-456b-b295-e45608c03c36-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.608209 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.608264 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.624420 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:52:58 crc kubenswrapper[4771]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: if [ -n "nova_cell0" ]; then Feb 19 21:52:58 crc kubenswrapper[4771]: GRANT_DATABASE="nova_cell0" Feb 19 21:52:58 crc kubenswrapper[4771]: else Feb 19 21:52:58 crc kubenswrapper[4771]: GRANT_DATABASE="*" Feb 19 21:52:58 crc kubenswrapper[4771]: fi Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: # going for maximum compatibility here: Feb 19 21:52:58 crc kubenswrapper[4771]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:52:58 crc kubenswrapper[4771]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:52:58 crc kubenswrapper[4771]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:52:58 crc kubenswrapper[4771]: # support updates Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.627442 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p" podUID="a4200695-7cfa-41e9-a0ff-948f4a767b81" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.628162 4771 generic.go:334] "Generic (PLEG): container finished" podID="fbb34081-bc3a-451a-93c5-c28299467781" containerID="8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.629633 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:52:58 crc kubenswrapper[4771]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: if [ -n "placement" ]; then Feb 19 21:52:58 crc kubenswrapper[4771]: GRANT_DATABASE="placement" Feb 19 21:52:58 crc kubenswrapper[4771]: else Feb 19 21:52:58 crc kubenswrapper[4771]: GRANT_DATABASE="*" 
Feb 19 21:52:58 crc kubenswrapper[4771]: fi Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: # going for maximum compatibility here: Feb 19 21:52:58 crc kubenswrapper[4771]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:52:58 crc kubenswrapper[4771]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:52:58 crc kubenswrapper[4771]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:52:58 crc kubenswrapper[4771]: # support updates Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.630780 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-8fb6-account-create-update-gmz58" podUID="0d819b52-ac9a-4713-bac8-2097460f3108" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.631533 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-config" (OuterVolumeSpecName: "config") pod "74f56b2b-3ff3-4a35-a237-180c42816388" (UID: "74f56b2b-3ff3-4a35-a237-180c42816388"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.635487 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74f56b2b-3ff3-4a35-a237-180c42816388" (UID: "74f56b2b-3ff3-4a35-a237-180c42816388"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.643710 4771 generic.go:334] "Generic (PLEG): container finished" podID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerID="4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.654195 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "9d59b620-a9f9-4539-98e2-a4ad4d97d442" (UID: "9d59b620-a9f9-4539-98e2-a4ad4d97d442"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.655240 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:52:58 crc kubenswrapper[4771]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: if [ -n "glance" ]; then Feb 19 21:52:58 crc kubenswrapper[4771]: GRANT_DATABASE="glance" Feb 19 21:52:58 crc kubenswrapper[4771]: else Feb 19 21:52:58 crc kubenswrapper[4771]: GRANT_DATABASE="*" Feb 19 21:52:58 crc kubenswrapper[4771]: fi Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc 
kubenswrapper[4771]: # going for maximum compatibility here: Feb 19 21:52:58 crc kubenswrapper[4771]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:52:58 crc kubenswrapper[4771]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:52:58 crc kubenswrapper[4771]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:52:58 crc kubenswrapper[4771]: # support updates Feb 19 21:52:58 crc kubenswrapper[4771]: Feb 19 21:52:58 crc kubenswrapper[4771]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.659371 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d59b620-a9f9-4539-98e2-a4ad4d97d442/ovsdbserver-sb/0.log" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.659598 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.660500 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-3163-account-create-update-r6n24" podUID="95488d16-b990-4ef7-b263-8245bc54d569" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.688708 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerName="rabbitmq" containerID="cri-o://9649db40b00ac2459bb1317904d312d05a0f8a16cff8030457aea979837d6541" gracePeriod=604800 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.692270 4771 generic.go:334] "Generic (PLEG): container finished" podID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 
21:52:58.700503 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cbm2n_5e98e55e-96cb-4936-8220-18db4047873e/openstack-network-exporter/0.log" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.700636 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cbm2n" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.703272 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9d59b620-a9f9-4539-98e2-a4ad4d97d442" (UID: "9d59b620-a9f9-4539-98e2-a4ad4d97d442"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.715928 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.715966 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.715980 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d59b620-a9f9-4539-98e2-a4ad4d97d442-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.715992 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 
21:52:58.730978 4771 scope.go:117] "RemoveContainer" containerID="801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.744164 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74f56b2b-3ff3-4a35-a237-180c42816388" (UID: "74f56b2b-3ff3-4a35-a237-180c42816388"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.751189 4771 generic.go:334] "Generic (PLEG): container finished" podID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerID="f5328ae68d5231323e69942e8f00f3eca7b2e290877a4c8daac72b1fb1c774f5" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.773725 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74f56b2b-3ff3-4a35-a237-180c42816388" (UID: "74f56b2b-3ff3-4a35-a237-180c42816388"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.779219 4771 generic.go:334] "Generic (PLEG): container finished" podID="74f56b2b-3ff3-4a35-a237-180c42816388" containerID="5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.779360 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.785183 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerID="4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.788772 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d/ovsdbserver-nb/0.log" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.788805 4771 generic.go:334] "Generic (PLEG): container finished" podID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerID="a6a3ea0a05c4cc921bbd48b56d16a1efbe4f22bdc52aaa4b06df43fbc2c61851" exitCode=2 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.788820 4771 generic.go:334] "Generic (PLEG): container finished" podID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerID="714fd91309191f6caf2b8f7b6db8af29e9a164f11470713344c8212bb188178d" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.790616 4771 generic.go:334] "Generic (PLEG): container finished" podID="13181273-c19b-46fa-a087-da5695148b1f" containerID="4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.792450 4771 generic.go:334] "Generic (PLEG): container finished" podID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerID="286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.795075 4771 generic.go:334] "Generic (PLEG): container finished" podID="63798ee6-b629-437f-9e15-5bd46b79894e" containerID="89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.797238 4771 generic.go:334] "Generic (PLEG): container finished" podID="a201b832-5d71-4947-98ba-02adf91bccd5" 
containerID="8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.800381 4771 generic.go:334] "Generic (PLEG): container finished" podID="680c2961-e33c-44b1-aadd-37556bf4839c" containerID="eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2" exitCode=143 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.810241 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="f2531aed0329bf883f8ab6b773f55605090d3db1851e36c0cc40d24deefd3f69" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.810270 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="56daf9ee26ebf4c3d4c6900b71838aba932c5f271a2526fe5d21871ad34fdb76" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.810297 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="eb415e751672ce67a387f56913bb8398561b2f551eec4c660857fbfeb0059682" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.810308 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="25638182c5db50cf9a8ae3158a1f97e75098b4e1dd04c4bf5390d848f2cf8116" exitCode=0 Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.813188 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.814153 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.817668 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: I0219 21:52:58.817694 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74f56b2b-3ff3-4a35-a237-180c42816388-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.919420 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:52:58 crc kubenswrapper[4771]: E0219 21:52:58.919681 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data podName:c671dcf6-b1eb-4c4e-ba71-ae115ce811da nodeName:}" failed. No retries permitted until 2026-02-19 21:53:02.91966634 +0000 UTC m=+1483.191108810 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data") pod "rabbitmq-server-0" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da") : configmap "rabbitmq-config-data" not found Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032294 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4d6b9785-889jt" event={"ID":"b5526752-6549-40aa-8443-9ad3572799d2","Type":"ContainerDied","Data":"89ed7fd8b6567c5530e1c7e743cf565b8674ecca14b4cf2d6f7229ab2c36478e"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032327 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b64cc52a-4f20-4e05-b444-c46f97727527","Type":"ContainerDied","Data":"4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032340 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbb34081-bc3a-451a-93c5-c28299467781","Type":"ContainerDied","Data":"8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7965bd8f87-ttph6" event={"ID":"1b36b1b4-e091-4032-90aa-7f983b5c4b4f","Type":"ContainerDied","Data":"4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032369 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d59b620-a9f9-4539-98e2-a4ad4d97d442","Type":"ContainerDied","Data":"7addba05b8e742f3985c3a196710e220c7dbb9b61feaf9e7ae1906d6b18daf76"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032383 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-94qm4" 
event={"ID":"a53e8d42-c35f-4ea0-9314-f4d849908089","Type":"ContainerDied","Data":"e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" event={"ID":"7a560423-c65f-4855-94f2-31e6763e0817","Type":"ContainerStarted","Data":"58226702a8394cad17c25b445adb5fce46c4251260fa56f89f284fd6f1d546fa"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032407 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cbm2n" event={"ID":"5e98e55e-96cb-4936-8220-18db4047873e","Type":"ContainerDied","Data":"f864a1f2e7e627f7cad94478d02f1b0dfde0bba863d56d5cc3ee767858da1331"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032418 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8569f64fdb-ctb42"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sv7nf" event={"ID":"3e4401f3-2019-40a6-8829-5e22598f176a","Type":"ContainerStarted","Data":"801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032446 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8fb6-account-create-update-gmz58"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032457 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sv7nf" event={"ID":"3e4401f3-2019-40a6-8829-5e22598f176a","Type":"ContainerStarted","Data":"7291a0c26bbe5ef8bc71ff230695a118a89ae97ecec6d1667f035f97b5c5534f"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032465 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-676bcb667b-l4th4"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 
21:52:59.032477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e1653c4-41e5-403a-ae0d-9c1a5762f287","Type":"ContainerDied","Data":"f5328ae68d5231323e69942e8f00f3eca7b2e290877a4c8daac72b1fb1c774f5"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" event={"ID":"74f56b2b-3ff3-4a35-a237-180c42816388","Type":"ContainerDied","Data":"5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032499 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-8s8dd" event={"ID":"74f56b2b-3ff3-4a35-a237-180c42816388","Type":"ContainerDied","Data":"65b4d5b7ce8c048cfbcd21e9d2203d18c1e36c93a47b658f2f3f5931c5ee7355"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032507 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dbcb5799b-x6nw7" event={"ID":"1e5f8191-842e-4a39-ac28-b0042c51f813","Type":"ContainerDied","Data":"4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032517 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-j4b5p"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bf8786687-hjsp2" event={"ID":"fe42dab7-ecba-4c58-8c1d-e9489760692a","Type":"ContainerStarted","Data":"b125e97d3aa387a74d59769b99c8a8b7c2519de1b84fed4d8cf928e890c90ec4"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032538 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3163-account-create-update-r6n24"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032550 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-bf8786687-hjsp2" event={"ID":"fe42dab7-ecba-4c58-8c1d-e9489760692a","Type":"ContainerStarted","Data":"677e88010bf1996d567e664fa70b5a198a44a50288ec87470089047e4ac8f1ba"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032560 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d","Type":"ContainerDied","Data":"a6a3ea0a05c4cc921bbd48b56d16a1efbe4f22bdc52aaa4b06df43fbc2c61851"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032571 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d","Type":"ContainerDied","Data":"714fd91309191f6caf2b8f7b6db8af29e9a164f11470713344c8212bb188178d"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032580 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d","Type":"ContainerDied","Data":"4f68625c29b2b8b745321c052292050f23abb03d1b28fba389eca7332250cc3c"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032588 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f68625c29b2b8b745321c052292050f23abb03d1b28fba389eca7332250cc3c" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032596 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c7696df7-bt9vd" event={"ID":"13181273-c19b-46fa-a087-da5695148b1f","Type":"ContainerDied","Data":"4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032606 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"306067f7-88de-4cdb-8ca2-3540ada9b006","Type":"ContainerDied","Data":"286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 
21:52:59.032617 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63798ee6-b629-437f-9e15-5bd46b79894e","Type":"ContainerDied","Data":"89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" event={"ID":"a201b832-5d71-4947-98ba-02adf91bccd5","Type":"ContainerDied","Data":"8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032643 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b5d9cf798-b8q9j" event={"ID":"680c2961-e33c-44b1-aadd-37556bf4839c","Type":"ContainerDied","Data":"eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"f2531aed0329bf883f8ab6b773f55605090d3db1851e36c0cc40d24deefd3f69"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032664 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"56daf9ee26ebf4c3d4c6900b71838aba932c5f271a2526fe5d21871ad34fdb76"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032673 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"eb415e751672ce67a387f56913bb8398561b2f551eec4c660857fbfeb0059682"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032682 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"25638182c5db50cf9a8ae3158a1f97e75098b4e1dd04c4bf5390d848f2cf8116"} Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.032700 4771 scope.go:117] "RemoveContainer" containerID="a0ca6423b8cc74730d8f188d3660eb639df435604c4bb4ca1789f192ebeca7cd" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.118513 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d/ovsdbserver-nb/0.log" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.118579 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.185336 4771 scope.go:117] "RemoveContainer" containerID="44a17110699d7788904c20ac06cbba2638c9992c0ae1681e7fba006c02f61070" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.222994 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-metrics-certs-tls-certs\") pod \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.223090 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8xr7\" (UniqueName: \"kubernetes.io/projected/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-kube-api-access-k8xr7\") pod \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.223136 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-combined-ca-bundle\") pod \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " Feb 
19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.223224 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.223290 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-config\") pod \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.223308 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-scripts\") pod \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.223422 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdbserver-nb-tls-certs\") pod \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.223443 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdb-rundir\") pod \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\" (UID: \"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.226432 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod 
"9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" (UID: "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.227012 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-scripts" (OuterVolumeSpecName: "scripts") pod "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" (UID: "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.227447 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-config" (OuterVolumeSpecName: "config") pod "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" (UID: "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.239229 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-kube-api-access-k8xr7" (OuterVolumeSpecName: "kube-api-access-k8xr7") pod "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" (UID: "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d"). InnerVolumeSpecName "kube-api-access-k8xr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.239733 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" (UID: "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.265794 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" (UID: "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.278163 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.282214 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.284887 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-cbm2n"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.293772 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-cbm2n"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.299548 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.303133 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-8s8dd"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.322508 4771 scope.go:117] "RemoveContainer" containerID="f081b25aaea300f6e060b204c908ec660e0941d0cadd8d054f4c1cb78c13b8e8" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.325409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlvtj\" (UniqueName: \"kubernetes.io/projected/4e1653c4-41e5-403a-ae0d-9c1a5762f287-kube-api-access-wlvtj\") pod \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.325661 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-8s8dd"] Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.325784 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-config-data\") pod \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.325863 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data-custom\") pod \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") " Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.325922 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-combined-ca-bundle\") pod \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\" (UID: 
\"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.325951 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data\") pod \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.325999 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e1653c4-41e5-403a-ae0d-9c1a5762f287-etc-machine-id\") pod \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.326043 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-vencrypt-tls-certs\") pod \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.326084 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-scripts\") pod \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.327003 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e1653c4-41e5-403a-ae0d-9c1a5762f287-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4e1653c4-41e5-403a-ae0d-9c1a5762f287" (UID: "4e1653c4-41e5-403a-ae0d-9c1a5762f287"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.328833 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jfl4\" (UniqueName: \"kubernetes.io/projected/8c8e505e-8c04-4299-a2d1-89a5cb544b81-kube-api-access-4jfl4\") pod \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.328866 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-combined-ca-bundle\") pod \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\" (UID: \"4e1653c4-41e5-403a-ae0d-9c1a5762f287\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.328949 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-nova-novncproxy-tls-certs\") pod \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\" (UID: \"8c8e505e-8c04-4299-a2d1-89a5cb544b81\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.329484 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.329500 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8xr7\" (UniqueName: \"kubernetes.io/projected/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-kube-api-access-k8xr7\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.329510 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.329527 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.329535 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e1653c4-41e5-403a-ae0d-9c1a5762f287-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.329543 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.329553 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.337705 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-scripts" (OuterVolumeSpecName: "scripts") pod "4e1653c4-41e5-403a-ae0d-9c1a5762f287" (UID: "4e1653c4-41e5-403a-ae0d-9c1a5762f287"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.339998 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4e1653c4-41e5-403a-ae0d-9c1a5762f287" (UID: "4e1653c4-41e5-403a-ae0d-9c1a5762f287"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.342670 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1653c4-41e5-403a-ae0d-9c1a5762f287-kube-api-access-wlvtj" (OuterVolumeSpecName: "kube-api-access-wlvtj") pod "4e1653c4-41e5-403a-ae0d-9c1a5762f287" (UID: "4e1653c4-41e5-403a-ae0d-9c1a5762f287"). InnerVolumeSpecName "kube-api-access-wlvtj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.349196 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8e505e-8c04-4299-a2d1-89a5cb544b81-kube-api-access-4jfl4" (OuterVolumeSpecName: "kube-api-access-4jfl4") pod "8c8e505e-8c04-4299-a2d1-89a5cb544b81" (UID: "8c8e505e-8c04-4299-a2d1-89a5cb544b81"). InnerVolumeSpecName "kube-api-access-4jfl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.355195 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.356507 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.357509 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.357575 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="98bf8c7c-234c-47ea-86aa-4b1313f7b983" containerName="nova-scheduler-scheduler"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.367163 4771 scope.go:117] "RemoveContainer" containerID="5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.379293 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c7696df7-bt9vd"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.396472 4771 scope.go:117] "RemoveContainer" containerID="ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.434409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-internal-tls-certs\") pod \"13181273-c19b-46fa-a087-da5695148b1f\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.434458 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-combined-ca-bundle\") pod \"13181273-c19b-46fa-a087-da5695148b1f\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.434482 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-etc-swift\") pod \"13181273-c19b-46fa-a087-da5695148b1f\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.434504 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-config-data\") pod \"13181273-c19b-46fa-a087-da5695148b1f\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.434534 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-run-httpd\") pod \"13181273-c19b-46fa-a087-da5695148b1f\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.434651 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjhlb\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-kube-api-access-sjhlb\") pod \"13181273-c19b-46fa-a087-da5695148b1f\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.434675 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-public-tls-certs\") pod \"13181273-c19b-46fa-a087-da5695148b1f\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.434753 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-log-httpd\") pod \"13181273-c19b-46fa-a087-da5695148b1f\" (UID: \"13181273-c19b-46fa-a087-da5695148b1f\") "
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.435531 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlvtj\" (UniqueName: \"kubernetes.io/projected/4e1653c4-41e5-403a-ae0d-9c1a5762f287-kube-api-access-wlvtj\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.435549 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.435559 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.435569 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jfl4\" (UniqueName: \"kubernetes.io/projected/8c8e505e-8c04-4299-a2d1-89a5cb544b81-kube-api-access-4jfl4\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.442877 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "13181273-c19b-46fa-a087-da5695148b1f" (UID: "13181273-c19b-46fa-a087-da5695148b1f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.451611 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "13181273-c19b-46fa-a087-da5695148b1f" (UID: "13181273-c19b-46fa-a087-da5695148b1f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.472708 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-kube-api-access-sjhlb" (OuterVolumeSpecName: "kube-api-access-sjhlb") pod "13181273-c19b-46fa-a087-da5695148b1f" (UID: "13181273-c19b-46fa-a087-da5695148b1f"). InnerVolumeSpecName "kube-api-access-sjhlb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.491494 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f977-account-create-update-d6pbn"]
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.492757 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "13181273-c19b-46fa-a087-da5695148b1f" (UID: "13181273-c19b-46fa-a087-da5695148b1f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:52:59 crc kubenswrapper[4771]: W0219 21:52:59.513823 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd791a5dc_fbdc_43be_9442_ff3f7c4cbb17.slice/crio-aa9fb4350ff771f51fc5724a03058047295937d2ecd3ad21d7baceb41c037126 WatchSource:0}: Error finding container aa9fb4350ff771f51fc5724a03058047295937d2ecd3ad21d7baceb41c037126: Status 404 returned error can't find the container with id aa9fb4350ff771f51fc5724a03058047295937d2ecd3ad21d7baceb41c037126
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.516076 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 19 21:52:59 crc kubenswrapper[4771]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: MYSQL_CMD="mysql -h -u root -P 3306"
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: if [ -n "neutron" ]; then
Feb 19 21:52:59 crc kubenswrapper[4771]: GRANT_DATABASE="neutron"
Feb 19 21:52:59 crc kubenswrapper[4771]: else
Feb 19 21:52:59 crc kubenswrapper[4771]: GRANT_DATABASE="*"
Feb 19 21:52:59 crc kubenswrapper[4771]: fi
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: # going for maximum compatibility here:
Feb 19 21:52:59 crc kubenswrapper[4771]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Feb 19 21:52:59 crc kubenswrapper[4771]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Feb 19 21:52:59 crc kubenswrapper[4771]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Feb 19 21:52:59 crc kubenswrapper[4771]: # support updates
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: $MYSQL_CMD < logger="UnhandledError"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.518155 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-f977-account-create-update-d6pbn" podUID="d791a5dc-fbdc-43be-9442-ff3f7c4cbb17"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.538948 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjhlb\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-kube-api-access-sjhlb\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.538993 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.539003 4771 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13181273-c19b-46fa-a087-da5695148b1f-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.539012 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13181273-c19b-46fa-a087-da5695148b1f-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.604417 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.604723 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="ceilometer-central-agent" containerID="cri-o://88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98" gracePeriod=30
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.605263 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="proxy-httpd" containerID="cri-o://1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60" gracePeriod=30
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.609296 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="ceilometer-notification-agent" containerID="cri-o://30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab" gracePeriod=30
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.609454 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="sg-core" containerID="cri-o://b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38" gracePeriod=30
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.632030 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.632220 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a9c230c6-1af8-440e-806d-b3b1e98544c0" containerName="kube-state-metrics" containerID="cri-o://e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92" gracePeriod=30
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.640453 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.673427 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.677325 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.677388 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="ovn-northd"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.787113 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.790389 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.790506 4771 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.790552 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts podName:8e57507e-3fcb-4193-9810-727d1f3e4405 nodeName:}" failed. No retries permitted until 2026-02-19 21:53:03.790540403 +0000 UTC m=+1484.061982873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts") pod "nova-cell1-70bf-account-create-update-qbq2m" (UID: "8e57507e-3fcb-4193-9810-727d1f3e4405") : configmap "openstack-cell1-scripts" not found
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.790597 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.790615 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data podName:17cd62c3-f3af-4144-b6b1-ec2cafb424ad nodeName:}" failed. No retries permitted until 2026-02-19 21:53:03.790609945 +0000 UTC m=+1484.062052415 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data") pod "rabbitmq-cell1-server-0" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad") : configmap "rabbitmq-cell1-config-data" not found
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.824801 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.831200 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8302-account-create-update-hcgps"]
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.832955 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6938-account-create-update-jvz85"]
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.857343 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.857765 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="1c812f8b-6ad1-4873-8999-e649acd07d91" containerName="memcached" containerID="cri-o://afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484" gracePeriod=30
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.892253 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3163-account-create-update-r6n24" event={"ID":"95488d16-b990-4ef7-b263-8245bc54d569","Type":"ContainerStarted","Data":"0a9df7f2571cce70912a882b90c559fbbcf0dbe0bda014e96f74e0213d1d3291"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.900699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrzp\" (UniqueName: \"kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp\") pod \"nova-cell1-70bf-account-create-update-qbq2m\" (UID: \"8e57507e-3fcb-4193-9810-727d1f3e4405\") " pod="openstack/nova-cell1-70bf-account-create-update-qbq2m"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.900871 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.904041 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 19 21:52:59 crc kubenswrapper[4771]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: MYSQL_CMD="mysql -h -u root -P 3306"
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: if [ -n "cinder" ]; then
Feb 19 21:52:59 crc kubenswrapper[4771]: GRANT_DATABASE="cinder"
Feb 19 21:52:59 crc kubenswrapper[4771]: else
Feb 19 21:52:59 crc kubenswrapper[4771]: GRANT_DATABASE="*"
Feb 19 21:52:59 crc kubenswrapper[4771]: fi
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: # going for maximum compatibility here:
Feb 19 21:52:59 crc kubenswrapper[4771]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Feb 19 21:52:59 crc kubenswrapper[4771]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Feb 19 21:52:59 crc kubenswrapper[4771]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Feb 19 21:52:59 crc kubenswrapper[4771]: # support updates
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: $MYSQL_CMD < logger="UnhandledError"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.904268 4771 projected.go:194] Error preparing data for projected volume kube-api-access-8hrzp for pod openstack/nova-cell1-70bf-account-create-update-qbq2m: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.904308 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp podName:8e57507e-3fcb-4193-9810-727d1f3e4405 nodeName:}" failed. No retries permitted until 2026-02-19 21:53:03.904290531 +0000 UTC m=+1484.175733001 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hrzp" (UniqueName: "kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp") pod "nova-cell1-70bf-account-create-update-qbq2m" (UID: "8e57507e-3fcb-4193-9810-727d1f3e4405") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.905145 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-8302-account-create-update-hcgps" podUID="0f06a8fe-d882-40f7-ac43-4eb2753d0ae4"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.918929 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 19 21:52:59 crc kubenswrapper[4771]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: MYSQL_CMD="mysql -h -u root -P 3306"
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: if [ -n "barbican" ]; then
Feb 19 21:52:59 crc kubenswrapper[4771]: GRANT_DATABASE="barbican"
Feb 19 21:52:59 crc kubenswrapper[4771]: else
Feb 19 21:52:59 crc kubenswrapper[4771]: GRANT_DATABASE="*"
Feb 19 21:52:59 crc kubenswrapper[4771]: fi
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: # going for maximum compatibility here:
Feb 19 21:52:59 crc kubenswrapper[4771]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Feb 19 21:52:59 crc kubenswrapper[4771]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Feb 19 21:52:59 crc kubenswrapper[4771]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Feb 19 21:52:59 crc kubenswrapper[4771]: # support updates
Feb 19 21:52:59 crc kubenswrapper[4771]:
Feb 19 21:52:59 crc kubenswrapper[4771]: $MYSQL_CMD < logger="UnhandledError"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.920938 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9a3b-account-create-update-w4lvm"]
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.921081 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-6938-account-create-update-jvz85" podUID="af3b6c92-f6c8-4ea1-9bbb-3a5737468982"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.924096 4771 generic.go:334] "Generic (PLEG): container finished" podID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" containerID="8bb34e090626c241dae7a24235616f7cf7f7c5d4a8bd8d21f038334a18fdd9f1" exitCode=0
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.924231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2aa7dd4a-4295-4120-bff7-fc2c82f76aed","Type":"ContainerDied","Data":"8bb34e090626c241dae7a24235616f7cf7f7c5d4a8bd8d21f038334a18fdd9f1"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.924315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2aa7dd4a-4295-4120-bff7-fc2c82f76aed","Type":"ContainerDied","Data":"f5593b65aa4e9e23a17ff553a791eebc4181c912f7a11d4093a4c8d42397715e"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.924378 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5593b65aa4e9e23a17ff553a791eebc4181c912f7a11d4093a4c8d42397715e"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.928892 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" event={"ID":"7a560423-c65f-4855-94f2-31e6763e0817","Type":"ContainerStarted","Data":"b0c885cf9d242a07ffb1ee39664e6caefa677fa8a5dc413188745bc487f46952"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.936101 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676bcb667b-l4th4" event={"ID":"dede6da5-d7c3-45f8-9431-41ee414e21d3","Type":"ContainerStarted","Data":"34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.936146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676bcb667b-l4th4" event={"ID":"dede6da5-d7c3-45f8-9431-41ee414e21d3","Type":"ContainerStarted","Data":"ed027fc6dde39b02857ed6027b796e33b283ca89ef88b0f9f8ce1177cd3e7b4a"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.938543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e1653c4-41e5-403a-ae0d-9c1a5762f287","Type":"ContainerDied","Data":"76c714da43fed73cdc5e041d1d5609867a3027b67b042963530d08b46a3c974f"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.938732 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.940107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8fb6-account-create-update-gmz58" event={"ID":"0d819b52-ac9a-4713-bac8-2097460f3108","Type":"ContainerStarted","Data":"32195ebea052eeaf3dea950a1101743708d94b441072e4fb64fca2cba5386c8f"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.967413 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9a3b-account-create-update-w4lvm"]
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.974681 4771 generic.go:334] "Generic (PLEG): container finished" podID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerID="b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38" exitCode=2
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.974786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerDied","Data":"b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.977055 4771 generic.go:334] "Generic (PLEG): container finished" podID="8c8e505e-8c04-4299-a2d1-89a5cb544b81" containerID="a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af" exitCode=0
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.977255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c8e505e-8c04-4299-a2d1-89a5cb544b81","Type":"ContainerDied","Data":"a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.977361 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c8e505e-8c04-4299-a2d1-89a5cb544b81","Type":"ContainerDied","Data":"d986d42ca294547e762a0387a3d9465d27f94eec9a401ab17d189dd82ae584b7"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.978088 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.979961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f977-account-create-update-d6pbn" event={"ID":"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17","Type":"ContainerStarted","Data":"aa9fb4350ff771f51fc5724a03058047295937d2ecd3ad21d7baceb41c037126"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.982874 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p" event={"ID":"a4200695-7cfa-41e9-a0ff-948f4a767b81","Type":"ContainerStarted","Data":"a526fae19ac19109c2d7f2282715cd2f8cf2169cbbd16b1879dd131ea82434db"}
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994117 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9a3b-account-create-update-pstqn"]
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994636 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e98e55e-96cb-4936-8220-18db4047873e" containerName="openstack-network-exporter"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994657 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e98e55e-96cb-4936-8220-18db4047873e" containerName="openstack-network-exporter"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994667 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13181273-c19b-46fa-a087-da5695148b1f" containerName="proxy-httpd"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994674 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="13181273-c19b-46fa-a087-da5695148b1f" containerName="proxy-httpd"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994691 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerName="ovsdbserver-nb"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994698 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerName="ovsdbserver-nb"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994711 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerName="cinder-scheduler"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994716 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerName="cinder-scheduler"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994727 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerName="probe"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994733 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerName="probe"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994747 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerName="openstack-network-exporter"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994752 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerName="openstack-network-exporter"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994766 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13181273-c19b-46fa-a087-da5695148b1f" containerName="proxy-server"
Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994772 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="13181273-c19b-46fa-a087-da5695148b1f" containerName="proxy-server"
Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994782 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8e505e-8c04-4299-a2d1-89a5cb544b81"
containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994788 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8e505e-8c04-4299-a2d1-89a5cb544b81" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994801 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f56b2b-3ff3-4a35-a237-180c42816388" containerName="init" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994807 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f56b2b-3ff3-4a35-a237-180c42816388" containerName="init" Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994822 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f56b2b-3ff3-4a35-a237-180c42816388" containerName="dnsmasq-dns" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994827 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f56b2b-3ff3-4a35-a237-180c42816388" containerName="dnsmasq-dns" Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994840 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="openstack-network-exporter" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994846 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="openstack-network-exporter" Feb 19 21:52:59 crc kubenswrapper[4771]: E0219 21:52:59.994856 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="ovsdbserver-sb" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.994861 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="ovsdbserver-sb" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995099 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerName="cinder-scheduler" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995113 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" containerName="probe" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995121 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerName="ovsdbserver-nb" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995396 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8e505e-8c04-4299-a2d1-89a5cb544b81" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995411 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="13181273-c19b-46fa-a087-da5695148b1f" containerName="proxy-httpd" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995427 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e98e55e-96cb-4936-8220-18db4047873e" containerName="openstack-network-exporter" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995439 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f56b2b-3ff3-4a35-a237-180c42816388" containerName="dnsmasq-dns" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995448 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="openstack-network-exporter" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995454 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" containerName="ovsdbserver-sb" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995461 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" containerName="openstack-network-exporter" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.995474 
4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="13181273-c19b-46fa-a087-da5695148b1f" containerName="proxy-server" Feb 19 21:52:59 crc kubenswrapper[4771]: I0219 21:52:59.996088 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.006100 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.011638 4771 generic.go:334] "Generic (PLEG): container finished" podID="13181273-c19b-46fa-a087-da5695148b1f" containerID="4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df" exitCode=0 Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.011730 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c7696df7-bt9vd" event={"ID":"13181273-c19b-46fa-a087-da5695148b1f","Type":"ContainerDied","Data":"4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df"} Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.011757 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6c7696df7-bt9vd" event={"ID":"13181273-c19b-46fa-a087-da5695148b1f","Type":"ContainerDied","Data":"615ad1804139378182b1a5e5ee653bd158cf7f61b79685e05b5fcf774e8f84e5"} Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.011841 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6c7696df7-bt9vd" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.012191 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "8c8e505e-8c04-4299-a2d1-89a5cb544b81" (UID: "8c8e505e-8c04-4299-a2d1-89a5cb544b81"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.014773 4771 generic.go:334] "Generic (PLEG): container finished" podID="d56d3128-877b-4d8d-a48f-42a7b83e9347" containerID="4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6" exitCode=0 Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.014891 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d56d3128-877b-4d8d-a48f-42a7b83e9347","Type":"ContainerDied","Data":"4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6"} Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.016316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-config-data" (OuterVolumeSpecName: "config-data") pod "8c8e505e-8c04-4299-a2d1-89a5cb544b81" (UID: "8c8e505e-8c04-4299-a2d1-89a5cb544b81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.017459 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bf8786687-hjsp2" event={"ID":"fe42dab7-ecba-4c58-8c1d-e9489760692a","Type":"ContainerStarted","Data":"cdce0b0c182a30ee60311ed46bbd115e0505327a1cb4095a7212d2553f1b9638"} Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.017639 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-bf8786687-hjsp2" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerName="barbican-worker-log" containerID="cri-o://b125e97d3aa387a74d59769b99c8a8b7c2519de1b84fed4d8cf928e890c90ec4" gracePeriod=30 Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.017930 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-bf8786687-hjsp2" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerName="barbican-worker" 
containerID="cri-o://cdce0b0c182a30ee60311ed46bbd115e0505327a1cb4095a7212d2553f1b9638" gracePeriod=30 Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.026086 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-c4gbj"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.030561 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" (UID: "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.032688 4771 generic.go:334] "Generic (PLEG): container finished" podID="3e4401f3-2019-40a6-8829-5e22598f176a" containerID="801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139" exitCode=1 Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.032826 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.035511 4771 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-sv7nf" secret="" err="secret \"galera-openstack-dockercfg-ljqtn\" not found" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.035541 4771 scope.go:117] "RemoveContainer" containerID="e3a28c6e19caec26422bd2d82f956078d8831cbf0eb0eb927d8feeec18b08e83" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.035729 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-sv7nf_openstack(3e4401f3-2019-40a6-8829-5e22598f176a)\"" pod="openstack/root-account-create-update-sv7nf" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.035941 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sv7nf" event={"ID":"3e4401f3-2019-40a6-8829-5e22598f176a","Type":"ContainerDied","Data":"801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139"} Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.035966 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sv7nf" event={"ID":"3e4401f3-2019-40a6-8829-5e22598f176a","Type":"ContainerStarted","Data":"e3a28c6e19caec26422bd2d82f956078d8831cbf0eb0eb927d8feeec18b08e83"} Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.036007 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-70bf-account-create-update-qbq2m" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.043994 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-c4gbj"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.050681 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9a3b-account-create-update-pstqn"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.054184 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-97gqw"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.060907 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-97gqw"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.066306 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7b9474445c-zjs86"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.066522 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7b9474445c-zjs86" podUID="4b605459-c654-463d-b66c-ec804185ea7d" containerName="keystone-api" containerID="cri-o://2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e" gracePeriod=30 Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.085262 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6 is running failed: container process not found" containerID="4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.095179 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.104269 4771 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6 is running failed: container process not found" containerID="4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.104497 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c8e505e-8c04-4299-a2d1-89a5cb544b81" (UID: "8c8e505e-8c04-4299-a2d1-89a5cb544b81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.106571 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts\") pod \"keystone-9a3b-account-create-update-pstqn\" (UID: \"c7b86c07-52cb-4911-8b7b-205eaa247226\") " pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.106718 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7xt2\" (UniqueName: \"kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2\") pod \"keystone-9a3b-account-create-update-pstqn\" (UID: \"c7b86c07-52cb-4911-8b7b-205eaa247226\") " pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.106886 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: 
I0219 21:53:00.106898 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.106909 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.106917 4771 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.106975 4771 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.107029 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts podName:3e4401f3-2019-40a6-8829-5e22598f176a nodeName:}" failed. No retries permitted until 2026-02-19 21:53:00.60700229 +0000 UTC m=+1480.878444760 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts") pod "root-account-create-update-sv7nf" (UID: "3e4401f3-2019-40a6-8829-5e22598f176a") : configmap "openstack-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.107469 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" (UID: "9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d"). 
InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.108145 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6 is running failed: container process not found" containerID="4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.108213 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="d56d3128-877b-4d8d-a48f-42a7b83e9347" containerName="nova-cell1-conductor-conductor" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.108282 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e1653c4-41e5-403a-ae0d-9c1a5762f287" (UID: "4e1653c4-41e5-403a-ae0d-9c1a5762f287"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.131474 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.170:8776/healthcheck\": read tcp 10.217.0.2:49376->10.217.0.170:8776: read: connection reset by peer" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.135718 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.153450 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.165575 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9a3b-account-create-update-pstqn"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.173477 4771 scope.go:117] "RemoveContainer" containerID="5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.174340 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad\": container with ID starting with 5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad not found: ID does not exist" containerID="5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.174424 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad"} err="failed to get container status \"5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad\": rpc error: code = NotFound desc = could not find container \"5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad\": container with ID starting with 5436bc1f2e8f396c416435878245901759800628bfa8cf66afcfed9191fdc3ad not found: ID does not exist" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.174504 4771 scope.go:117] "RemoveContainer" containerID="ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.175902 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1\": container with ID starting with ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1 not found: ID does not exist" containerID="ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.175942 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1"} err="failed to get container status \"ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1\": rpc error: code = NotFound desc = could not find container \"ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1\": container with ID starting with ded96d4dc557e48c78ef35060fe48b8b96aa60024c64c6dfc90fd8381489d9b1 not found: ID does not exist" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.175966 4771 scope.go:117] "RemoveContainer" containerID="da66b35c116d28899b782ad55eca69eeed804e2fa899b61d3bf29d45b676830f" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.180752 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7jgvs"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.181552 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "8c8e505e-8c04-4299-a2d1-89a5cb544b81" (UID: "8c8e505e-8c04-4299-a2d1-89a5cb544b81"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.190958 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7jgvs"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.202501 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-sv7nf"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.208683 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-operator-scripts\") pod \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.208722 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-generated\") pod \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.208749 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kolla-config\") pod \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.208808 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.208826 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7d4v\" (UniqueName: 
\"kubernetes.io/projected/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kube-api-access-h7d4v\") pod \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.208931 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-default\") pod \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.209005 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-galera-tls-certs\") pod \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.209036 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-combined-ca-bundle\") pod \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\" (UID: \"2aa7dd4a-4295-4120-bff7-fc2c82f76aed\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.209260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xt2\" (UniqueName: \"kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2\") pod \"keystone-9a3b-account-create-update-pstqn\" (UID: \"c7b86c07-52cb-4911-8b7b-205eaa247226\") " pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.209381 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts\") pod 
\"keystone-9a3b-account-create-update-pstqn\" (UID: \"c7b86c07-52cb-4911-8b7b-205eaa247226\") " pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.209443 4771 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e505e-8c04-4299-a2d1-89a5cb544b81-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.209454 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.209463 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.209537 4771 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.209579 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data podName:680c2961-e33c-44b1-aadd-37556bf4839c nodeName:}" failed. No retries permitted until 2026-02-19 21:53:04.209566421 +0000 UTC m=+1484.481008881 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data") pod "placement-b5d9cf798-b8q9j" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c") : secret "placement-config-data" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.222397 4771 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.222569 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts podName:680c2961-e33c-44b1-aadd-37556bf4839c nodeName:}" failed. No retries permitted until 2026-02-19 21:53:04.222552875 +0000 UTC m=+1484.493995345 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts") pod "placement-b5d9cf798-b8q9j" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c") : secret "placement-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.225582 4771 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.231449 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts podName:c7b86c07-52cb-4911-8b7b-205eaa247226 nodeName:}" failed. No retries permitted until 2026-02-19 21:53:00.7314317 +0000 UTC m=+1481.002874170 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts") pod "keystone-9a3b-account-create-update-pstqn" (UID: "c7b86c07-52cb-4911-8b7b-205eaa247226") : configmap "openstack-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.225763 4771 projected.go:194] Error preparing data for projected volume kube-api-access-b7xt2 for pod openstack/keystone-9a3b-account-create-update-pstqn: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.232094 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2 podName:c7b86c07-52cb-4911-8b7b-205eaa247226 nodeName:}" failed. No retries permitted until 2026-02-19 21:53:00.732084877 +0000 UTC m=+1481.003527347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-b7xt2" (UniqueName: "kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2") pod "keystone-9a3b-account-create-update-pstqn" (UID: "c7b86c07-52cb-4911-8b7b-205eaa247226") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.235842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2aa7dd4a-4295-4120-bff7-fc2c82f76aed" (UID: "2aa7dd4a-4295-4120-bff7-fc2c82f76aed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.238211 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2aa7dd4a-4295-4120-bff7-fc2c82f76aed" (UID: "2aa7dd4a-4295-4120-bff7-fc2c82f76aed"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.241285 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2aa7dd4a-4295-4120-bff7-fc2c82f76aed" (UID: "2aa7dd4a-4295-4120-bff7-fc2c82f76aed"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.242717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kube-api-access-h7d4v" (OuterVolumeSpecName: "kube-api-access-h7d4v") pod "2aa7dd4a-4295-4120-bff7-fc2c82f76aed" (UID: "2aa7dd4a-4295-4120-bff7-fc2c82f76aed"). InnerVolumeSpecName "kube-api-access-h7d4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.243233 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "13181273-c19b-46fa-a087-da5695148b1f" (UID: "13181273-c19b-46fa-a087-da5695148b1f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.244507 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "13181273-c19b-46fa-a087-da5695148b1f" (UID: "13181273-c19b-46fa-a087-da5695148b1f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.250808 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2aa7dd4a-4295-4120-bff7-fc2c82f76aed" (UID: "2aa7dd4a-4295-4120-bff7-fc2c82f76aed"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.263999 4771 scope.go:117] "RemoveContainer" containerID="53dafb13a5c390196d3cb19b500110a7ca153d256650b8c84a399c53e300934e" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.268281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-config-data" (OuterVolumeSpecName: "config-data") pod "13181273-c19b-46fa-a087-da5695148b1f" (UID: "13181273-c19b-46fa-a087-da5695148b1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.270440 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2aa7dd4a-4295-4120-bff7-fc2c82f76aed" (UID: "2aa7dd4a-4295-4120-bff7-fc2c82f76aed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.279831 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-b7xt2 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-9a3b-account-create-update-pstqn" podUID="c7b86c07-52cb-4911-8b7b-205eaa247226" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.281846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "2aa7dd4a-4295-4120-bff7-fc2c82f76aed" (UID: "2aa7dd4a-4295-4120-bff7-fc2c82f76aed"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.281958 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13181273-c19b-46fa-a087-da5695148b1f" (UID: "13181273-c19b-46fa-a087-da5695148b1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.295422 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-bf8786687-hjsp2" podStartSLOduration=6.295404091 podStartE2EDuration="6.295404091s" podCreationTimestamp="2026-02-19 21:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:53:00.044106037 +0000 UTC m=+1480.315548497" watchObservedRunningTime="2026-02-19 21:53:00.295404091 +0000 UTC m=+1480.566846551" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.312831 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhzvm\" (UniqueName: \"kubernetes.io/projected/d56d3128-877b-4d8d-a48f-42a7b83e9347-kube-api-access-jhzvm\") pod \"d56d3128-877b-4d8d-a48f-42a7b83e9347\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.312895 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-config-data\") pod \"d56d3128-877b-4d8d-a48f-42a7b83e9347\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313011 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-combined-ca-bundle\") pod \"d56d3128-877b-4d8d-a48f-42a7b83e9347\" (UID: \"d56d3128-877b-4d8d-a48f-42a7b83e9347\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313432 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 
21:53:00.313446 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7d4v\" (UniqueName: \"kubernetes.io/projected/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kube-api-access-h7d4v\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313458 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313467 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313476 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313485 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313493 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313501 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313509 4771 reconciler_common.go:293] "Volume detached for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313518 4771 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.313525 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13181273-c19b-46fa-a087-da5695148b1f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.328274 4771 scope.go:117] "RemoveContainer" containerID="f5328ae68d5231323e69942e8f00f3eca7b2e290877a4c8daac72b1fb1c774f5" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.338239 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56d3128-877b-4d8d-a48f-42a7b83e9347-kube-api-access-jhzvm" (OuterVolumeSpecName: "kube-api-access-jhzvm") pod "d56d3128-877b-4d8d-a48f-42a7b83e9347" (UID: "d56d3128-877b-4d8d-a48f-42a7b83e9347"). InnerVolumeSpecName "kube-api-access-jhzvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.344054 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-qbq2m"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.350307 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-70bf-account-create-update-qbq2m"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.379311 4771 scope.go:117] "RemoveContainer" containerID="a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.383049 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.391566 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.406203 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data" (OuterVolumeSpecName: "config-data") pod "4e1653c4-41e5-403a-ae0d-9c1a5762f287" (UID: "4e1653c4-41e5-403a-ae0d-9c1a5762f287"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.408221 4771 scope.go:117] "RemoveContainer" containerID="a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.409583 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af\": container with ID starting with a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af not found: ID does not exist" containerID="a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.409611 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af"} err="failed to get container status \"a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af\": rpc error: code = NotFound desc = could not find container \"a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af\": container with ID starting with a7f2c2b24db3f0a53b8e771b7820e66eeb2147c00e4926ee662d80a05fcdb0af not found: ID does not exist" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.409630 4771 scope.go:117] "RemoveContainer" containerID="4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.414454 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6c7696df7-bt9vd"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.415581 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhzvm\" (UniqueName: \"kubernetes.io/projected/d56d3128-877b-4d8d-a48f-42a7b83e9347-kube-api-access-jhzvm\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.415601 4771 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1653c4-41e5-403a-ae0d-9c1a5762f287-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.420650 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6c7696df7-bt9vd"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.427470 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.428958 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.436303 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.517180 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerName="galera" containerID="cri-o://c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9" gracePeriod=30 Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.523232 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.523264 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e57507e-3fcb-4193-9810-727d1f3e4405-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.523275 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hrzp\" (UniqueName: \"kubernetes.io/projected/8e57507e-3fcb-4193-9810-727d1f3e4405-kube-api-access-8hrzp\") on 
node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.538987 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d56d3128-877b-4d8d-a48f-42a7b83e9347" (UID: "d56d3128-877b-4d8d-a48f-42a7b83e9347"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.558267 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f997b82-0501-49eb-820a-26d053e81b02" path="/var/lib/kubelet/pods/0f997b82-0501-49eb-820a-26d053e81b02/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.559831 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13181273-c19b-46fa-a087-da5695148b1f" path="/var/lib/kubelet/pods/13181273-c19b-46fa-a087-da5695148b1f/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.564254 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40643c92-54ba-4c6b-bad6-c8d0f7f8529b" path="/var/lib/kubelet/pods/40643c92-54ba-4c6b-bad6-c8d0f7f8529b/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.564898 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e98e55e-96cb-4936-8220-18db4047873e" path="/var/lib/kubelet/pods/5e98e55e-96cb-4936-8220-18db4047873e/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.565671 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fea7660-6ec1-4456-b730-44b1e3ad5941" path="/var/lib/kubelet/pods/5fea7660-6ec1-4456-b730-44b1e3ad5941/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.571750 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.572182 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.577146 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.577995 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f56b2b-3ff3-4a35-a237-180c42816388" path="/var/lib/kubelet/pods/74f56b2b-3ff3-4a35-a237-180c42816388/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.580472 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8e505e-8c04-4299-a2d1-89a5cb544b81" path="/var/lib/kubelet/pods/8c8e505e-8c04-4299-a2d1-89a5cb544b81/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.580936 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e57507e-3fcb-4193-9810-727d1f3e4405" path="/var/lib/kubelet/pods/8e57507e-3fcb-4193-9810-727d1f3e4405/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.581389 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8efa5dd0-e91b-456b-b295-e45608c03c36" path="/var/lib/kubelet/pods/8efa5dd0-e91b-456b-b295-e45608c03c36/volumes" Feb 19 21:53:00 crc 
kubenswrapper[4771]: E0219 21:53:00.582131 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.582509 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.582579 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.592718 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.597082 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d" 
path="/var/lib/kubelet/pods/9fbc55cb-d769-4fa4-a18e-a8ba6234fc0d/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.605858 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.605907 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.618570 4771 scope.go:117] "RemoveContainer" containerID="4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.620334 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2aa7dd4a-4295-4120-bff7-fc2c82f76aed" (UID: "2aa7dd4a-4295-4120-bff7-fc2c82f76aed"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.626129 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-js7lh" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:53:00 crc kubenswrapper[4771]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Feb 19 21:53:00 crc kubenswrapper[4771]: > Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.632030 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cc086f-16e1-41b6-b21b-a18c032ea3f3" path="/var/lib/kubelet/pods/e2cc086f-16e1-41b6-b21b-a18c032ea3f3/volumes" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.668452 4771 scope.go:117] "RemoveContainer" containerID="4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.678074 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df\": container with ID starting with 4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df not found: ID does not exist" containerID="4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.678134 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df"} err="failed to get container status \"4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df\": rpc error: code = NotFound desc = could not find container \"4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df\": container with ID starting with 4089b69d37ba7a9d25647229be098fa32e608365472091eecc659495e2b785df not found: ID does not exist" Feb 19 21:53:00 crc 
kubenswrapper[4771]: I0219 21:53:00.678159 4771 scope.go:117] "RemoveContainer" containerID="4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.678401 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-config-data" (OuterVolumeSpecName: "config-data") pod "d56d3128-877b-4d8d-a48f-42a7b83e9347" (UID: "d56d3128-877b-4d8d-a48f-42a7b83e9347"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.680129 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51\": container with ID starting with 4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51 not found: ID does not exist" containerID="4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.680171 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51"} err="failed to get container status \"4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51\": rpc error: code = NotFound desc = could not find container \"4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51\": container with ID starting with 4fb7efc2a99cc8ce93a0cfea58d11bdeffd869210f8d9844a7277b1516028b51 not found: ID does not exist" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.680194 4771 scope.go:117] "RemoveContainer" containerID="801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.681228 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3163-account-create-update-r6n24" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.681748 4771 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa7dd4a-4295-4120-bff7-fc2c82f76aed-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.681796 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.681806 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d56d3128-877b-4d8d-a48f-42a7b83e9347-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.681768 4771 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.681860 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts podName:3e4401f3-2019-40a6-8829-5e22598f176a nodeName:}" failed. No retries permitted until 2026-02-19 21:53:01.681844868 +0000 UTC m=+1481.953287338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts") pod "root-account-create-update-sv7nf" (UID: "3e4401f3-2019-40a6-8829-5e22598f176a") : configmap "openstack-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.685997 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.694953 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.700864 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.706994 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.710344 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.735146 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783029 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxmcw\" (UniqueName: \"kubernetes.io/projected/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-api-access-jxmcw\") pod \"a9c230c6-1af8-440e-806d-b3b1e98544c0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-config\") pod \"a9c230c6-1af8-440e-806d-b3b1e98544c0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783163 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95488d16-b990-4ef7-b263-8245bc54d569-operator-scripts\") pod 
\"95488d16-b990-4ef7-b263-8245bc54d569\" (UID: \"95488d16-b990-4ef7-b263-8245bc54d569\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783199 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-operator-scripts\") pod \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\" (UID: \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783276 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vhbp\" (UniqueName: \"kubernetes.io/projected/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-kube-api-access-7vhbp\") pod \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\" (UID: \"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783299 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-certs\") pod \"a9c230c6-1af8-440e-806d-b3b1e98544c0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783375 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4200695-7cfa-41e9-a0ff-948f4a767b81-operator-scripts\") pod \"a4200695-7cfa-41e9-a0ff-948f4a767b81\" (UID: \"a4200695-7cfa-41e9-a0ff-948f4a767b81\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783422 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d819b52-ac9a-4713-bac8-2097460f3108-operator-scripts\") pod \"0d819b52-ac9a-4713-bac8-2097460f3108\" (UID: \"0d819b52-ac9a-4713-bac8-2097460f3108\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783442 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l668x\" (UniqueName: \"kubernetes.io/projected/95488d16-b990-4ef7-b263-8245bc54d569-kube-api-access-l668x\") pod \"95488d16-b990-4ef7-b263-8245bc54d569\" (UID: \"95488d16-b990-4ef7-b263-8245bc54d569\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783459 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-combined-ca-bundle\") pod \"a9c230c6-1af8-440e-806d-b3b1e98544c0\" (UID: \"a9c230c6-1af8-440e-806d-b3b1e98544c0\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783492 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqxll\" (UniqueName: \"kubernetes.io/projected/0d819b52-ac9a-4713-bac8-2097460f3108-kube-api-access-bqxll\") pod \"0d819b52-ac9a-4713-bac8-2097460f3108\" (UID: \"0d819b52-ac9a-4713-bac8-2097460f3108\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783514 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rcqw\" (UniqueName: \"kubernetes.io/projected/a4200695-7cfa-41e9-a0ff-948f4a767b81-kube-api-access-8rcqw\") pod \"a4200695-7cfa-41e9-a0ff-948f4a767b81\" (UID: \"a4200695-7cfa-41e9-a0ff-948f4a767b81\") " Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xt2\" (UniqueName: \"kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2\") pod \"keystone-9a3b-account-create-update-pstqn\" (UID: \"c7b86c07-52cb-4911-8b7b-205eaa247226\") " pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.783885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts\") pod \"keystone-9a3b-account-create-update-pstqn\" (UID: \"c7b86c07-52cb-4911-8b7b-205eaa247226\") " pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.783986 4771 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: E0219 21:53:00.784042 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts podName:c7b86c07-52cb-4911-8b7b-205eaa247226 nodeName:}" failed. No retries permitted until 2026-02-19 21:53:01.784028779 +0000 UTC m=+1482.055471249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts") pod "keystone-9a3b-account-create-update-pstqn" (UID: "c7b86c07-52cb-4911-8b7b-205eaa247226") : configmap "openstack-scripts" not found Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.787598 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d819b52-ac9a-4713-bac8-2097460f3108-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d819b52-ac9a-4713-bac8-2097460f3108" (UID: "0d819b52-ac9a-4713-bac8-2097460f3108"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.787846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d791a5dc-fbdc-43be-9442-ff3f7c4cbb17" (UID: "d791a5dc-fbdc-43be-9442-ff3f7c4cbb17"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.803259 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4200695-7cfa-41e9-a0ff-948f4a767b81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4200695-7cfa-41e9-a0ff-948f4a767b81" (UID: "a4200695-7cfa-41e9-a0ff-948f4a767b81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:00 crc kubenswrapper[4771]: I0219 21:53:00.804478 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95488d16-b990-4ef7-b263-8245bc54d569-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95488d16-b990-4ef7-b263-8245bc54d569" (UID: "95488d16-b990-4ef7-b263-8245bc54d569"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.214281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4200695-7cfa-41e9-a0ff-948f4a767b81-kube-api-access-8rcqw" (OuterVolumeSpecName: "kube-api-access-8rcqw") pod "a4200695-7cfa-41e9-a0ff-948f4a767b81" (UID: "a4200695-7cfa-41e9-a0ff-948f4a767b81"). InnerVolumeSpecName "kube-api-access-8rcqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.214588 4771 projected.go:194] Error preparing data for projected volume kube-api-access-b7xt2 for pod openstack/keystone-9a3b-account-create-update-pstqn: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.214635 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2 podName:c7b86c07-52cb-4911-8b7b-205eaa247226 nodeName:}" failed. 
No retries permitted until 2026-02-19 21:53:02.214620502 +0000 UTC m=+1482.486062972 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-b7xt2" (UniqueName: "kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2") pod "keystone-9a3b-account-create-update-pstqn" (UID: "c7b86c07-52cb-4911-8b7b-205eaa247226") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.215427 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-kube-api-access-7vhbp" (OuterVolumeSpecName: "kube-api-access-7vhbp") pod "d791a5dc-fbdc-43be-9442-ff3f7c4cbb17" (UID: "d791a5dc-fbdc-43be-9442-ff3f7c4cbb17"). InnerVolumeSpecName "kube-api-access-7vhbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.215646 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95488d16-b990-4ef7-b263-8245bc54d569-kube-api-access-l668x" (OuterVolumeSpecName: "kube-api-access-l668x") pod "95488d16-b990-4ef7-b263-8245bc54d569" (UID: "95488d16-b990-4ef7-b263-8245bc54d569"). InnerVolumeSpecName "kube-api-access-l668x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.215749 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-api-access-jxmcw" (OuterVolumeSpecName: "kube-api-access-jxmcw") pod "a9c230c6-1af8-440e-806d-b3b1e98544c0" (UID: "a9c230c6-1af8-440e-806d-b3b1e98544c0"). InnerVolumeSpecName "kube-api-access-jxmcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216158 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d819b52-ac9a-4713-bac8-2097460f3108-kube-api-access-bqxll" (OuterVolumeSpecName: "kube-api-access-bqxll") pod "0d819b52-ac9a-4713-bac8-2097460f3108" (UID: "0d819b52-ac9a-4713-bac8-2097460f3108"). InnerVolumeSpecName "kube-api-access-bqxll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216906 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqxll\" (UniqueName: \"kubernetes.io/projected/0d819b52-ac9a-4713-bac8-2097460f3108-kube-api-access-bqxll\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216923 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rcqw\" (UniqueName: \"kubernetes.io/projected/a4200695-7cfa-41e9-a0ff-948f4a767b81-kube-api-access-8rcqw\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216933 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxmcw\" (UniqueName: \"kubernetes.io/projected/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-api-access-jxmcw\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216942 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95488d16-b990-4ef7-b263-8245bc54d569-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216951 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216960 4771 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vhbp\" (UniqueName: \"kubernetes.io/projected/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17-kube-api-access-7vhbp\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216969 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4200695-7cfa-41e9-a0ff-948f4a767b81-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216977 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d819b52-ac9a-4713-bac8-2097460f3108-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.216986 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l668x\" (UniqueName: \"kubernetes.io/projected/95488d16-b990-4ef7-b263-8245bc54d569-kube-api-access-l668x\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.235191 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9c230c6-1af8-440e-806d-b3b1e98544c0" (UID: "a9c230c6-1af8-440e-806d-b3b1e98544c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.253824 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f977-account-create-update-d6pbn" event={"ID":"d791a5dc-fbdc-43be-9442-ff3f7c4cbb17","Type":"ContainerDied","Data":"aa9fb4350ff771f51fc5724a03058047295937d2ecd3ad21d7baceb41c037126"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.253909 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f977-account-create-update-d6pbn" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.269551 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8302-account-create-update-hcgps" event={"ID":"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4","Type":"ContainerStarted","Data":"87799ad628c2f094b7e11dc2020dce84ae951545a698d9791ac50e63a97c5c59"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.283247 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "a9c230c6-1af8-440e-806d-b3b1e98544c0" (UID: "a9c230c6-1af8-440e-806d-b3b1e98544c0"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.292413 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8fb6-account-create-update-gmz58" event={"ID":"0d819b52-ac9a-4713-bac8-2097460f3108","Type":"ContainerDied","Data":"32195ebea052eeaf3dea950a1101743708d94b441072e4fb64fca2cba5386c8f"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.292538 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8fb6-account-create-update-gmz58" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.319240 4771 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.319295 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.331283 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": read tcp 10.217.0.2:39868->10.217.0.213:8775: read: connection reset by peer" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.331318 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": read tcp 10.217.0.2:39866->10.217.0.213:8775: read: connection reset by peer" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.334392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" event={"ID":"7a560423-c65f-4855-94f2-31e6763e0817","Type":"ContainerStarted","Data":"0fba45e8f2db0d7874562de5ae6e82ab65e3497ea143bc8a1322500183785f07"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.334502 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" podUID="7a560423-c65f-4855-94f2-31e6763e0817" 
containerName="barbican-keystone-listener-log" containerID="cri-o://b0c885cf9d242a07ffb1ee39664e6caefa677fa8a5dc413188745bc487f46952" gracePeriod=30 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.334803 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" podUID="7a560423-c65f-4855-94f2-31e6763e0817" containerName="barbican-keystone-listener" containerID="cri-o://0fba45e8f2db0d7874562de5ae6e82ab65e3497ea143bc8a1322500183785f07" gracePeriod=30 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.334906 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "a9c230c6-1af8-440e-806d-b3b1e98544c0" (UID: "a9c230c6-1af8-440e-806d-b3b1e98544c0"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.357865 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f977-account-create-update-d6pbn"] Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.372061 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6938-account-create-update-jvz85" event={"ID":"af3b6c92-f6c8-4ea1-9bbb-3a5737468982","Type":"ContainerStarted","Data":"895040f1766d1fa5597837e4eba962840f265b555a65a21226c737e078ce4e14"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.373829 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" podStartSLOduration=7.373812161 podStartE2EDuration="7.373812161s" podCreationTimestamp="2026-02-19 21:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:53:01.361223648 +0000 
UTC m=+1481.632666118" watchObservedRunningTime="2026-02-19 21:53:01.373812161 +0000 UTC m=+1481.645254631" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.375774 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f977-account-create-update-d6pbn"] Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.424203 4771 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c230c6-1af8-440e-806d-b3b1e98544c0-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.439346 4771 generic.go:334] "Generic (PLEG): container finished" podID="a9c230c6-1af8-440e-806d-b3b1e98544c0" containerID="e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92" exitCode=2 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.439557 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a9c230c6-1af8-440e-806d-b3b1e98544c0","Type":"ContainerDied","Data":"e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.439647 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a9c230c6-1af8-440e-806d-b3b1e98544c0","Type":"ContainerDied","Data":"b73186e1941ca07d3ecc1fb16194d53e4a327da4e24c68bbbb6283d86aef3fa3"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.439805 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.476135 4771 generic.go:334] "Generic (PLEG): container finished" podID="3e4401f3-2019-40a6-8829-5e22598f176a" containerID="e3a28c6e19caec26422bd2d82f956078d8831cbf0eb0eb927d8feeec18b08e83" exitCode=1 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.476342 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sv7nf" event={"ID":"3e4401f3-2019-40a6-8829-5e22598f176a","Type":"ContainerDied","Data":"e3a28c6e19caec26422bd2d82f956078d8831cbf0eb0eb927d8feeec18b08e83"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.476977 4771 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-sv7nf" secret="" err="secret \"galera-openstack-dockercfg-ljqtn\" not found" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.477099 4771 scope.go:117] "RemoveContainer" containerID="e3a28c6e19caec26422bd2d82f956078d8831cbf0eb0eb927d8feeec18b08e83" Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.477432 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-sv7nf_openstack(3e4401f3-2019-40a6-8829-5e22598f176a)\"" pod="openstack/root-account-create-update-sv7nf" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.500403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p" event={"ID":"a4200695-7cfa-41e9-a0ff-948f4a767b81","Type":"ContainerDied","Data":"a526fae19ac19109c2d7f2282715cd2f8cf2169cbbd16b1879dd131ea82434db"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.500480 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-c5fb-account-create-update-j4b5p" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.518883 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerID="b9023b235acac843c6d06970035afa4b4d903cfed719a74a48514a2a10f8f915" exitCode=0 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.518939 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e","Type":"ContainerDied","Data":"b9023b235acac843c6d06970035afa4b4d903cfed719a74a48514a2a10f8f915"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.529821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676bcb667b-l4th4" event={"ID":"dede6da5-d7c3-45f8-9431-41ee414e21d3","Type":"ContainerStarted","Data":"ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.530050 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log" containerID="cri-o://34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5" gracePeriod=30 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.530272 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-676bcb667b-l4th4" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.530291 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-676bcb667b-l4th4" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.530539 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api" 
containerID="cri-o://ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d" gracePeriod=30 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.553667 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3163-account-create-update-r6n24" event={"ID":"95488d16-b990-4ef7-b263-8245bc54d569","Type":"ContainerDied","Data":"0a9df7f2571cce70912a882b90c559fbbcf0dbe0bda014e96f74e0213d1d3291"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.554034 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3163-account-create-update-r6n24" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.565546 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-676bcb667b-l4th4" podStartSLOduration=6.56552204 podStartE2EDuration="6.56552204s" podCreationTimestamp="2026-02-19 21:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:53:01.560752003 +0000 UTC m=+1481.832194473" watchObservedRunningTime="2026-02-19 21:53:01.56552204 +0000 UTC m=+1481.836964510" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.619061 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-dbcb5799b-x6nw7" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:49732->10.217.0.165:9311: read: connection reset by peer" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.619214 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-dbcb5799b-x6nw7" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:49716->10.217.0.165:9311: read: connection reset by peer" Feb 19 
21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.632116 4771 scope.go:117] "RemoveContainer" containerID="e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.662878 4771 generic.go:334] "Generic (PLEG): container finished" podID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerID="b125e97d3aa387a74d59769b99c8a8b7c2519de1b84fed4d8cf928e890c90ec4" exitCode=143 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.662945 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bf8786687-hjsp2" event={"ID":"fe42dab7-ecba-4c58-8c1d-e9489760692a","Type":"ContainerDied","Data":"b125e97d3aa387a74d59769b99c8a8b7c2519de1b84fed4d8cf928e890c90ec4"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.696232 4771 generic.go:334] "Generic (PLEG): container finished" podID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerID="1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60" exitCode=0 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.696262 4771 generic.go:334] "Generic (PLEG): container finished" podID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerID="88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98" exitCode=0 Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.696317 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerDied","Data":"1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.696343 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerDied","Data":"88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98"} Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.763642 4771 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap 
"openstack-scripts" not found Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.765398 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts podName:3e4401f3-2019-40a6-8829-5e22598f176a nodeName:}" failed. No retries permitted until 2026-02-19 21:53:03.765356163 +0000 UTC m=+1484.036798623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts") pod "root-account-create-update-sv7nf" (UID: "3e4401f3-2019-40a6-8829-5e22598f176a") : configmap "openstack-scripts" not found Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.779635 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.781366 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.781545 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.783360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d56d3128-877b-4d8d-a48f-42a7b83e9347","Type":"ContainerDied","Data":"6d2fdee37dded2467744f12f3354205ed44f3de4f244de30200cc7739d19d674"} Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.783446 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.796708 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.818675 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.818973 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4938b4a1-32f8-4e7a-b334-8ce3b649fb46" containerName="nova-cell0-conductor-conductor" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.846189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts\") pod \"keystone-9a3b-account-create-update-pstqn\" (UID: \"c7b86c07-52cb-4911-8b7b-205eaa247226\") " pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.846352 4771 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.846393 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts podName:c7b86c07-52cb-4911-8b7b-205eaa247226 nodeName:}" failed. No retries permitted until 2026-02-19 21:53:03.846382354 +0000 UTC m=+1484.117824824 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts") pod "keystone-9a3b-account-create-update-pstqn" (UID: "c7b86c07-52cb-4911-8b7b-205eaa247226") : configmap "openstack-scripts" not found Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.924184 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.924588 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.924783 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.929109 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.945252 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 
21:53:01.950152 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.956298 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.956625 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 19 21:53:01 crc kubenswrapper[4771]: E0219 21:53:01.956718 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerName="galera" Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.976751 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:53:01 crc kubenswrapper[4771]: I0219 21:53:01.983083 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.001452 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-j4b5p"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.013099 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c5fb-account-create-update-j4b5p"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.024549 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8fb6-account-create-update-gmz58"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.030265 4771 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8fb6-account-create-update-gmz58"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.053969 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data-custom\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054011 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8bp4\" (UniqueName: \"kubernetes.io/projected/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-kube-api-access-q8bp4\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054068 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-internal-tls-certs\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054107 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-public-tls-certs\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054131 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-logs\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054154 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-operator-scripts\") pod \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\" (UID: \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054195 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-etc-machine-id\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054216 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb2j6\" (UniqueName: \"kubernetes.io/projected/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-kube-api-access-gb2j6\") pod \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\" (UID: \"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054273 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054320 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-combined-ca-bundle\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.054395 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-scripts\") pod \"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\" (UID: 
\"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.063836 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.064331 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-logs" (OuterVolumeSpecName: "logs") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.065172 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f06a8fe-d882-40f7-ac43-4eb2753d0ae4" (UID: "0f06a8fe-d882-40f7-ac43-4eb2753d0ae4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.075051 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.083096 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-kube-api-access-gb2j6" (OuterVolumeSpecName: "kube-api-access-gb2j6") pod "0f06a8fe-d882-40f7-ac43-4eb2753d0ae4" (UID: "0f06a8fe-d882-40f7-ac43-4eb2753d0ae4"). InnerVolumeSpecName "kube-api-access-gb2j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.086226 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-scripts" (OuterVolumeSpecName: "scripts") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.097171 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3163-account-create-update-r6n24"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.104158 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3163-account-create-update-r6n24"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.110977 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.111518 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-kube-api-access-q8bp4" (OuterVolumeSpecName: "kube-api-access-q8bp4") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "kube-api-access-q8bp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.117618 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.129707 4771 scope.go:117] "RemoveContainer" containerID="e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92" Feb 19 21:53:02 crc kubenswrapper[4771]: E0219 21:53:02.142641 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92\": container with ID starting with e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92 not found: ID does not exist" containerID="e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.142783 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92"} err="failed to get container status \"e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92\": rpc error: code = NotFound desc = could not find container \"e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92\": container with ID starting with e2ae3a9d4b9f08b7e252e462ec38248ab2c97d77e6d8c365867b6a5bacaf4a92 not found: ID does not exist" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.142867 4771 scope.go:117] "RemoveContainer" containerID="801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139" Feb 19 21:53:02 crc kubenswrapper[4771]: E0219 21:53:02.143628 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139\": container with ID starting with 801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139 not found: ID 
does not exist" containerID="801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.143720 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139"} err="failed to get container status \"801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139\": rpc error: code = NotFound desc = could not find container \"801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139\": container with ID starting with 801c4c42d1302381d548e93067fd8c446586e9190ea7638281496d5dfe531139 not found: ID does not exist" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.143783 4771 scope.go:117] "RemoveContainer" containerID="4ada860d49b0d5b3925f45481ef51dd9420a46af24cb6992437b8bb7c7ce56a6" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.157212 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.157247 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.157256 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8bp4\" (UniqueName: \"kubernetes.io/projected/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-kube-api-access-q8bp4\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.157265 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.157273 4771 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.157284 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.157293 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb2j6\" (UniqueName: \"kubernetes.io/projected/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4-kube-api-access-gb2j6\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.175504 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.212231 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.230202 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.263893 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbhkr\" (UniqueName: \"kubernetes.io/projected/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-kube-api-access-vbhkr\") pod \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\" (UID: \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.264053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-operator-scripts\") pod \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\" (UID: \"af3b6c92-f6c8-4ea1-9bbb-3a5737468982\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.264444 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xt2\" (UniqueName: \"kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2\") pod \"keystone-9a3b-account-create-update-pstqn\" (UID: \"c7b86c07-52cb-4911-8b7b-205eaa247226\") " pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.264578 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.265872 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af3b6c92-f6c8-4ea1-9bbb-3a5737468982" (UID: "af3b6c92-f6c8-4ea1-9bbb-3a5737468982"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: E0219 21:53:02.266994 4771 projected.go:194] Error preparing data for projected volume kube-api-access-b7xt2 for pod openstack/keystone-9a3b-account-create-update-pstqn: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:53:02 crc kubenswrapper[4771]: E0219 21:53:02.267061 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2 podName:c7b86c07-52cb-4911-8b7b-205eaa247226 nodeName:}" failed. No retries permitted until 2026-02-19 21:53:04.267043556 +0000 UTC m=+1484.538486026 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-b7xt2" (UniqueName: "kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2") pod "keystone-9a3b-account-create-update-pstqn" (UID: "c7b86c07-52cb-4911-8b7b-205eaa247226") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.269983 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-kube-api-access-vbhkr" (OuterVolumeSpecName: "kube-api-access-vbhkr") pod "af3b6c92-f6c8-4ea1-9bbb-3a5737468982" (UID: "af3b6c92-f6c8-4ea1-9bbb-3a5737468982"). InnerVolumeSpecName "kube-api-access-vbhkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.271135 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.283129 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data" (OuterVolumeSpecName: "config-data") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.306410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" (UID: "6d6a922b-a8fd-4f68-95e9-398e5a38bc6e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.335613 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.367605 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/680c2961-e33c-44b1-aadd-37556bf4839c-logs\") pod \"680c2961-e33c-44b1-aadd-37556bf4839c\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.367656 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts\") pod \"680c2961-e33c-44b1-aadd-37556bf4839c\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.367678 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkh2p\" (UniqueName: \"kubernetes.io/projected/680c2961-e33c-44b1-aadd-37556bf4839c-kube-api-access-rkh2p\") pod \"680c2961-e33c-44b1-aadd-37556bf4839c\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.367696 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-public-tls-certs\") pod \"680c2961-e33c-44b1-aadd-37556bf4839c\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.367719 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data\") pod \"680c2961-e33c-44b1-aadd-37556bf4839c\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.367760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-internal-tls-certs\") pod \"680c2961-e33c-44b1-aadd-37556bf4839c\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.367867 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-combined-ca-bundle\") pod \"680c2961-e33c-44b1-aadd-37556bf4839c\" (UID: \"680c2961-e33c-44b1-aadd-37556bf4839c\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.368284 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.368296 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.368305 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.368313 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbhkr\" (UniqueName: \"kubernetes.io/projected/af3b6c92-f6c8-4ea1-9bbb-3a5737468982-kube-api-access-vbhkr\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.368324 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.368775 4771 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/680c2961-e33c-44b1-aadd-37556bf4839c-logs" (OuterVolumeSpecName: "logs") pod "680c2961-e33c-44b1-aadd-37556bf4839c" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.408248 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680c2961-e33c-44b1-aadd-37556bf4839c-kube-api-access-rkh2p" (OuterVolumeSpecName: "kube-api-access-rkh2p") pod "680c2961-e33c-44b1-aadd-37556bf4839c" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c"). InnerVolumeSpecName "kube-api-access-rkh2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.437038 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts" (OuterVolumeSpecName: "scripts") pod "680c2961-e33c-44b1-aadd-37556bf4839c" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.441721 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data" (OuterVolumeSpecName: "config-data") pod "680c2961-e33c-44b1-aadd-37556bf4839c" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.457061 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d819b52-ac9a-4713-bac8-2097460f3108" path="/var/lib/kubelet/pods/0d819b52-ac9a-4713-bac8-2097460f3108/volumes" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.457575 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" path="/var/lib/kubelet/pods/2aa7dd4a-4295-4120-bff7-fc2c82f76aed/volumes" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.458127 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1653c4-41e5-403a-ae0d-9c1a5762f287" path="/var/lib/kubelet/pods/4e1653c4-41e5-403a-ae0d-9c1a5762f287/volumes" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.459362 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95488d16-b990-4ef7-b263-8245bc54d569" path="/var/lib/kubelet/pods/95488d16-b990-4ef7-b263-8245bc54d569/volumes" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.459687 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4200695-7cfa-41e9-a0ff-948f4a767b81" path="/var/lib/kubelet/pods/a4200695-7cfa-41e9-a0ff-948f4a767b81/volumes" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.460282 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c230c6-1af8-440e-806d-b3b1e98544c0" path="/var/lib/kubelet/pods/a9c230c6-1af8-440e-806d-b3b1e98544c0/volumes" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.460788 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56d3128-877b-4d8d-a48f-42a7b83e9347" path="/var/lib/kubelet/pods/d56d3128-877b-4d8d-a48f-42a7b83e9347/volumes" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.461639 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d791a5dc-fbdc-43be-9442-ff3f7c4cbb17" 
path="/var/lib/kubelet/pods/d791a5dc-fbdc-43be-9442-ff3f7c4cbb17/volumes" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469090 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-config-data\") pod \"306067f7-88de-4cdb-8ca2-3540ada9b006\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469138 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"306067f7-88de-4cdb-8ca2-3540ada9b006\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469169 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-httpd-run\") pod \"306067f7-88de-4cdb-8ca2-3540ada9b006\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469189 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-public-tls-certs\") pod \"306067f7-88de-4cdb-8ca2-3540ada9b006\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469215 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp9jm\" (UniqueName: \"kubernetes.io/projected/306067f7-88de-4cdb-8ca2-3540ada9b006-kube-api-access-pp9jm\") pod \"306067f7-88de-4cdb-8ca2-3540ada9b006\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469264 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-logs\") pod \"306067f7-88de-4cdb-8ca2-3540ada9b006\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469388 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-scripts\") pod \"306067f7-88de-4cdb-8ca2-3540ada9b006\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469406 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-combined-ca-bundle\") pod \"306067f7-88de-4cdb-8ca2-3540ada9b006\" (UID: \"306067f7-88de-4cdb-8ca2-3540ada9b006\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469477 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "306067f7-88de-4cdb-8ca2-3540ada9b006" (UID: "306067f7-88de-4cdb-8ca2-3540ada9b006"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469733 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469744 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/680c2961-e33c-44b1-aadd-37556bf4839c-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469753 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469761 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkh2p\" (UniqueName: \"kubernetes.io/projected/680c2961-e33c-44b1-aadd-37556bf4839c-kube-api-access-rkh2p\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.469770 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.470087 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-logs" (OuterVolumeSpecName: "logs") pod "306067f7-88de-4cdb-8ca2-3540ada9b006" (UID: "306067f7-88de-4cdb-8ca2-3540ada9b006"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.471057 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "680c2961-e33c-44b1-aadd-37556bf4839c" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.472745 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306067f7-88de-4cdb-8ca2-3540ada9b006-kube-api-access-pp9jm" (OuterVolumeSpecName: "kube-api-access-pp9jm") pod "306067f7-88de-4cdb-8ca2-3540ada9b006" (UID: "306067f7-88de-4cdb-8ca2-3540ada9b006"). InnerVolumeSpecName "kube-api-access-pp9jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.473110 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-scripts" (OuterVolumeSpecName: "scripts") pod "306067f7-88de-4cdb-8ca2-3540ada9b006" (UID: "306067f7-88de-4cdb-8ca2-3540ada9b006"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.486279 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "306067f7-88de-4cdb-8ca2-3540ada9b006" (UID: "306067f7-88de-4cdb-8ca2-3540ada9b006"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.493784 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "680c2961-e33c-44b1-aadd-37556bf4839c" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.531536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "306067f7-88de-4cdb-8ca2-3540ada9b006" (UID: "306067f7-88de-4cdb-8ca2-3540ada9b006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.537134 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-config-data" (OuterVolumeSpecName: "config-data") pod "306067f7-88de-4cdb-8ca2-3540ada9b006" (UID: "306067f7-88de-4cdb-8ca2-3540ada9b006"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.571576 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/306067f7-88de-4cdb-8ca2-3540ada9b006-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.571994 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.572136 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.572257 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.572372 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.572474 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.572851 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.573277 4771 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-pp9jm\" (UniqueName: \"kubernetes.io/projected/306067f7-88de-4cdb-8ca2-3540ada9b006-kube-api-access-pp9jm\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.573095 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "680c2961-e33c-44b1-aadd-37556bf4839c" (UID: "680c2961-e33c-44b1-aadd-37556bf4839c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.576136 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "306067f7-88de-4cdb-8ca2-3540ada9b006" (UID: "306067f7-88de-4cdb-8ca2-3540ada9b006"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.615407 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.674723 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.674752 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/306067f7-88de-4cdb-8ca2-3540ada9b006-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.674761 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/680c2961-e33c-44b1-aadd-37556bf4839c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.723416 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.743894 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.757115 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.759463 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.765501 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.776136 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-config-data\") pod \"1c812f8b-6ad1-4873-8999-e649acd07d91\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.776389 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-combined-ca-bundle\") pod \"1c812f8b-6ad1-4873-8999-e649acd07d91\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.776550 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb4c5\" (UniqueName: \"kubernetes.io/projected/1c812f8b-6ad1-4873-8999-e649acd07d91-kube-api-access-rb4c5\") pod \"1c812f8b-6ad1-4873-8999-e649acd07d91\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.776658 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-kolla-config\") pod \"1c812f8b-6ad1-4873-8999-e649acd07d91\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.776817 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-memcached-tls-certs\") pod \"1c812f8b-6ad1-4873-8999-e649acd07d91\" (UID: \"1c812f8b-6ad1-4873-8999-e649acd07d91\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.777388 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-config-data" (OuterVolumeSpecName: "config-data") pod "1c812f8b-6ad1-4873-8999-e649acd07d91" (UID: "1c812f8b-6ad1-4873-8999-e649acd07d91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.777990 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1c812f8b-6ad1-4873-8999-e649acd07d91" (UID: "1c812f8b-6ad1-4873-8999-e649acd07d91"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.791293 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.795179 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c812f8b-6ad1-4873-8999-e649acd07d91-kube-api-access-rb4c5" (OuterVolumeSpecName: "kube-api-access-rb4c5") pod "1c812f8b-6ad1-4873-8999-e649acd07d91" (UID: "1c812f8b-6ad1-4873-8999-e649acd07d91"). InnerVolumeSpecName "kube-api-access-rb4c5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.837889 4771 generic.go:334] "Generic (PLEG): container finished" podID="7a560423-c65f-4855-94f2-31e6763e0817" containerID="b0c885cf9d242a07ffb1ee39664e6caefa677fa8a5dc413188745bc487f46952" exitCode=143 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.837954 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" event={"ID":"7a560423-c65f-4855-94f2-31e6763e0817","Type":"ContainerDied","Data":"b0c885cf9d242a07ffb1ee39664e6caefa677fa8a5dc413188745bc487f46952"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.840993 4771 generic.go:334] "Generic (PLEG): container finished" podID="680c2961-e33c-44b1-aadd-37556bf4839c" containerID="0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771" exitCode=0 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.841061 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b5d9cf798-b8q9j" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.841081 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b5d9cf798-b8q9j" event={"ID":"680c2961-e33c-44b1-aadd-37556bf4839c","Type":"ContainerDied","Data":"0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.842057 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b5d9cf798-b8q9j" event={"ID":"680c2961-e33c-44b1-aadd-37556bf4839c","Type":"ContainerDied","Data":"9fa692d6c39236751821891e0aac5e950e9beaa94fe51a185f962ab38554c09b"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.842077 4771 scope.go:117] "RemoveContainer" containerID="0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.856484 4771 generic.go:334] "Generic (PLEG): container finished" podID="b64cc52a-4f20-4e05-b444-c46f97727527" containerID="8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb" exitCode=0 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.856612 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.857077 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b64cc52a-4f20-4e05-b444-c46f97727527","Type":"ContainerDied","Data":"8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.857106 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b64cc52a-4f20-4e05-b444-c46f97727527","Type":"ContainerDied","Data":"148169a0c855d492d63ceb6c713da40443055b3a4aa0abada920d121f7ae1f20"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.870818 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "1c812f8b-6ad1-4873-8999-e649acd07d91" (UID: "1c812f8b-6ad1-4873-8999-e649acd07d91"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.872496 4771 generic.go:334] "Generic (PLEG): container finished" podID="fbb34081-bc3a-451a-93c5-c28299467781" containerID="b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1" exitCode=0 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.872507 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbb34081-bc3a-451a-93c5-c28299467781","Type":"ContainerDied","Data":"b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.872582 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.872581 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbb34081-bc3a-451a-93c5-c28299467781","Type":"ContainerDied","Data":"f2a0610da71872c83c49077ebf3d4acebbb7370e56300a4159bbb53e9cfd6771"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.873605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8302-account-create-update-hcgps" event={"ID":"0f06a8fe-d882-40f7-ac43-4eb2753d0ae4","Type":"ContainerDied","Data":"87799ad628c2f094b7e11dc2020dce84ae951545a698d9791ac50e63a97c5c59"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.873685 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8302-account-create-update-hcgps" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data-custom\") pod \"1e5f8191-842e-4a39-ac28-b0042c51f813\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877638 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-scripts\") pod \"b64cc52a-4f20-4e05-b444-c46f97727527\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877665 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"b64cc52a-4f20-4e05-b444-c46f97727527\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877690 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-internal-tls-certs\") pod \"b64cc52a-4f20-4e05-b444-c46f97727527\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877737 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rctnj\" (UniqueName: \"kubernetes.io/projected/63798ee6-b629-437f-9e15-5bd46b79894e-kube-api-access-rctnj\") pod \"63798ee6-b629-437f-9e15-5bd46b79894e\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-combined-ca-bundle\") pod \"63798ee6-b629-437f-9e15-5bd46b79894e\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877790 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwmnx\" (UniqueName: \"kubernetes.io/projected/fbb34081-bc3a-451a-93c5-c28299467781-kube-api-access-dwmnx\") pod \"fbb34081-bc3a-451a-93c5-c28299467781\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877806 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-config-data\") pod \"b64cc52a-4f20-4e05-b444-c46f97727527\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877824 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data\") pod \"1e5f8191-842e-4a39-ac28-b0042c51f813\" 
(UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877856 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-combined-ca-bundle\") pod \"fbb34081-bc3a-451a-93c5-c28299467781\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877875 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz9vt\" (UniqueName: \"kubernetes.io/projected/b64cc52a-4f20-4e05-b444-c46f97727527-kube-api-access-vz9vt\") pod \"b64cc52a-4f20-4e05-b444-c46f97727527\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877901 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-combined-ca-bundle\") pod \"1e5f8191-842e-4a39-ac28-b0042c51f813\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877925 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-combined-ca-bundle\") pod \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877952 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-httpd-run\") pod \"b64cc52a-4f20-4e05-b444-c46f97727527\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877977 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-config-data\") pod \"63798ee6-b629-437f-9e15-5bd46b79894e\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.877994 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-nova-metadata-tls-certs\") pod \"fbb34081-bc3a-451a-93c5-c28299467781\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878086 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-public-tls-certs\") pod \"63798ee6-b629-437f-9e15-5bd46b79894e\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878113 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-config-data\") pod \"fbb34081-bc3a-451a-93c5-c28299467781\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-config-data\") pod \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878159 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb34081-bc3a-451a-93c5-c28299467781-logs\") pod \"fbb34081-bc3a-451a-93c5-c28299467781\" (UID: \"fbb34081-bc3a-451a-93c5-c28299467781\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 
21:53:02.878176 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-internal-tls-certs\") pod \"63798ee6-b629-437f-9e15-5bd46b79894e\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878203 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnwjq\" (UniqueName: \"kubernetes.io/projected/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-kube-api-access-lnwjq\") pod \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\" (UID: \"4938b4a1-32f8-4e7a-b334-8ce3b649fb46\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878244 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-internal-tls-certs\") pod \"1e5f8191-842e-4a39-ac28-b0042c51f813\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63798ee6-b629-437f-9e15-5bd46b79894e-logs\") pod \"63798ee6-b629-437f-9e15-5bd46b79894e\" (UID: \"63798ee6-b629-437f-9e15-5bd46b79894e\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878297 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-public-tls-certs\") pod \"1e5f8191-842e-4a39-ac28-b0042c51f813\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878323 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc2b6\" (UniqueName: 
\"kubernetes.io/projected/1e5f8191-842e-4a39-ac28-b0042c51f813-kube-api-access-jc2b6\") pod \"1e5f8191-842e-4a39-ac28-b0042c51f813\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878338 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8191-842e-4a39-ac28-b0042c51f813-logs\") pod \"1e5f8191-842e-4a39-ac28-b0042c51f813\" (UID: \"1e5f8191-842e-4a39-ac28-b0042c51f813\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878369 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-logs\") pod \"b64cc52a-4f20-4e05-b444-c46f97727527\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878384 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-combined-ca-bundle\") pod \"b64cc52a-4f20-4e05-b444-c46f97727527\" (UID: \"b64cc52a-4f20-4e05-b444-c46f97727527\") " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878703 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878715 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb4c5\" (UniqueName: \"kubernetes.io/projected/1c812f8b-6ad1-4873-8999-e649acd07d91-kube-api-access-rb4c5\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878724 4771 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c812f8b-6ad1-4873-8999-e649acd07d91-kolla-config\") on node 
\"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.878732 4771 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.881229 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "b64cc52a-4f20-4e05-b444-c46f97727527" (UID: "b64cc52a-4f20-4e05-b444-c46f97727527"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.883256 4771 generic.go:334] "Generic (PLEG): container finished" podID="63798ee6-b629-437f-9e15-5bd46b79894e" containerID="e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b" exitCode=0 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.883366 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63798ee6-b629-437f-9e15-5bd46b79894e","Type":"ContainerDied","Data":"e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.883426 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63798ee6-b629-437f-9e15-5bd46b79894e","Type":"ContainerDied","Data":"c549507f26ed1faac44d987278eaf60a40ebcbb54aa748de7b8f3de70fdb58c6"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.883545 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.889037 4771 scope.go:117] "RemoveContainer" containerID="eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.890838 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b64cc52a-4f20-4e05-b444-c46f97727527" (UID: "b64cc52a-4f20-4e05-b444-c46f97727527"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.894129 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1e5f8191-842e-4a39-ac28-b0042c51f813" (UID: "1e5f8191-842e-4a39-ac28-b0042c51f813"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.895460 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b5d9cf798-b8q9j"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.897980 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbb34081-bc3a-451a-93c5-c28299467781-logs" (OuterVolumeSpecName: "logs") pod "fbb34081-bc3a-451a-93c5-c28299467781" (UID: "fbb34081-bc3a-451a-93c5-c28299467781"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.898039 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63798ee6-b629-437f-9e15-5bd46b79894e-logs" (OuterVolumeSpecName: "logs") pod "63798ee6-b629-437f-9e15-5bd46b79894e" (UID: "63798ee6-b629-437f-9e15-5bd46b79894e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.898168 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerID="8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5" exitCode=0 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.898221 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dbcb5799b-x6nw7" event={"ID":"1e5f8191-842e-4a39-ac28-b0042c51f813","Type":"ContainerDied","Data":"8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.898329 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dbcb5799b-x6nw7" event={"ID":"1e5f8191-842e-4a39-ac28-b0042c51f813","Type":"ContainerDied","Data":"5a5a4b5b442b3e603403d45fee1756c39fb4aab801c8ec3dcf50203a3b761331"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.898389 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dbcb5799b-x6nw7" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.899006 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5f8191-842e-4a39-ac28-b0042c51f813-logs" (OuterVolumeSpecName: "logs") pod "1e5f8191-842e-4a39-ac28-b0042c51f813" (UID: "1e5f8191-842e-4a39-ac28-b0042c51f813"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.899451 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-logs" (OuterVolumeSpecName: "logs") pod "b64cc52a-4f20-4e05-b444-c46f97727527" (UID: "b64cc52a-4f20-4e05-b444-c46f97727527"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.902588 4771 generic.go:334] "Generic (PLEG): container finished" podID="1c812f8b-6ad1-4873-8999-e649acd07d91" containerID="afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484" exitCode=0 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.902650 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1c812f8b-6ad1-4873-8999-e649acd07d91","Type":"ContainerDied","Data":"afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.902674 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1c812f8b-6ad1-4873-8999-e649acd07d91","Type":"ContainerDied","Data":"3dd228c80bb649fb6014c610a3c0ae5edc562d347d6ed009d5224811b3a7967b"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.902722 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.903241 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-scripts" (OuterVolumeSpecName: "scripts") pod "b64cc52a-4f20-4e05-b444-c46f97727527" (UID: "b64cc52a-4f20-4e05-b444-c46f97727527"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.904138 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b5d9cf798-b8q9j"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.904168 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5f8191-842e-4a39-ac28-b0042c51f813-kube-api-access-jc2b6" (OuterVolumeSpecName: "kube-api-access-jc2b6") pod "1e5f8191-842e-4a39-ac28-b0042c51f813" (UID: "1e5f8191-842e-4a39-ac28-b0042c51f813"). InnerVolumeSpecName "kube-api-access-jc2b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.912189 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-kube-api-access-lnwjq" (OuterVolumeSpecName: "kube-api-access-lnwjq") pod "4938b4a1-32f8-4e7a-b334-8ce3b649fb46" (UID: "4938b4a1-32f8-4e7a-b334-8ce3b649fb46"). InnerVolumeSpecName "kube-api-access-lnwjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.919147 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63798ee6-b629-437f-9e15-5bd46b79894e-kube-api-access-rctnj" (OuterVolumeSpecName: "kube-api-access-rctnj") pod "63798ee6-b629-437f-9e15-5bd46b79894e" (UID: "63798ee6-b629-437f-9e15-5bd46b79894e"). InnerVolumeSpecName "kube-api-access-rctnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.923802 4771 generic.go:334] "Generic (PLEG): container finished" podID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerID="34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5" exitCode=143 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.923858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676bcb667b-l4th4" event={"ID":"dede6da5-d7c3-45f8-9431-41ee414e21d3","Type":"ContainerDied","Data":"34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.929280 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b64cc52a-4f20-4e05-b444-c46f97727527-kube-api-access-vz9vt" (OuterVolumeSpecName: "kube-api-access-vz9vt") pod "b64cc52a-4f20-4e05-b444-c46f97727527" (UID: "b64cc52a-4f20-4e05-b444-c46f97727527"). InnerVolumeSpecName "kube-api-access-vz9vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.933962 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b64cc52a-4f20-4e05-b444-c46f97727527" (UID: "b64cc52a-4f20-4e05-b444-c46f97727527"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.935308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb34081-bc3a-451a-93c5-c28299467781-kube-api-access-dwmnx" (OuterVolumeSpecName: "kube-api-access-dwmnx") pod "fbb34081-bc3a-451a-93c5-c28299467781" (UID: "fbb34081-bc3a-451a-93c5-c28299467781"). InnerVolumeSpecName "kube-api-access-dwmnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.943324 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8302-account-create-update-hcgps"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.947225 4771 scope.go:117] "RemoveContainer" containerID="0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771" Feb 19 21:53:02 crc kubenswrapper[4771]: E0219 21:53:02.951307 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771\": container with ID starting with 0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771 not found: ID does not exist" containerID="0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.951344 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771"} err="failed to get container status \"0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771\": rpc error: code = NotFound desc = could not find container \"0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771\": container with ID starting with 0fc20aee619b9325f24b310a6f1d789c51dc112b538931a3d9694d133be3c771 not found: ID does not exist" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.951370 4771 scope.go:117] "RemoveContainer" containerID="eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2" Feb 19 21:53:02 crc kubenswrapper[4771]: E0219 21:53:02.951795 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2\": container with ID starting with 
eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2 not found: ID does not exist" containerID="eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.951825 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2"} err="failed to get container status \"eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2\": rpc error: code = NotFound desc = could not find container \"eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2\": container with ID starting with eb43f6f3655c19b6c7697b899856b810742c1fb5be7a244baa241dac749757e2 not found: ID does not exist" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.951849 4771 scope.go:117] "RemoveContainer" containerID="8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.952181 4771 generic.go:334] "Generic (PLEG): container finished" podID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerID="3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0" exitCode=0 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.952327 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.952491 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"306067f7-88de-4cdb-8ca2-3540ada9b006","Type":"ContainerDied","Data":"3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.952563 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"306067f7-88de-4cdb-8ca2-3540ada9b006","Type":"ContainerDied","Data":"fe0314b3b1991ef8347004b1f018c7add23f262bf102c2bbac1322abfbbc647e"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.957592 4771 generic.go:334] "Generic (PLEG): container finished" podID="4938b4a1-32f8-4e7a-b334-8ce3b649fb46" containerID="8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc" exitCode=0 Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.957637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4938b4a1-32f8-4e7a-b334-8ce3b649fb46","Type":"ContainerDied","Data":"8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.957656 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4938b4a1-32f8-4e7a-b334-8ce3b649fb46","Type":"ContainerDied","Data":"2ac11aa5ae2dfb868022f8538bffff387954bd82a4f1bd15cbccd53efc9ff707"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.957705 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.960508 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8302-account-create-update-hcgps"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.976461 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6938-account-create-update-jvz85" event={"ID":"af3b6c92-f6c8-4ea1-9bbb-3a5737468982","Type":"ContainerDied","Data":"895040f1766d1fa5597837e4eba962840f265b555a65a21226c737e078ce4e14"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.976516 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6938-account-create-update-jvz85" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.981976 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a3b-account-create-update-pstqn" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.982195 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.982282 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d6a922b-a8fd-4f68-95e9-398e5a38bc6e","Type":"ContainerDied","Data":"a5c466f91dea18e630a557c52810ffe412a14d61ac7179f7205d3432342a3d4c"} Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.990744 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.992296 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc2b6\" (UniqueName: \"kubernetes.io/projected/1e5f8191-842e-4a39-ac28-b0042c51f813-kube-api-access-jc2b6\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: E0219 21:53:02.992662 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:53:02 crc kubenswrapper[4771]: E0219 21:53:02.992718 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data podName:c671dcf6-b1eb-4c4e-ba71-ae115ce811da nodeName:}" failed. No retries permitted until 2026-02-19 21:53:10.99270099 +0000 UTC m=+1491.264143460 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data") pod "rabbitmq-server-0" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da") : configmap "rabbitmq-config-data" not found Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995569 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8191-842e-4a39-ac28-b0042c51f813-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995624 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995634 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995643 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995654 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995778 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995792 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rctnj\" (UniqueName: 
\"kubernetes.io/projected/63798ee6-b629-437f-9e15-5bd46b79894e-kube-api-access-rctnj\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995802 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwmnx\" (UniqueName: \"kubernetes.io/projected/fbb34081-bc3a-451a-93c5-c28299467781-kube-api-access-dwmnx\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995811 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz9vt\" (UniqueName: \"kubernetes.io/projected/b64cc52a-4f20-4e05-b444-c46f97727527-kube-api-access-vz9vt\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995820 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b64cc52a-4f20-4e05-b444-c46f97727527-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.995829 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb34081-bc3a-451a-93c5-c28299467781-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.996439 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnwjq\" (UniqueName: \"kubernetes.io/projected/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-kube-api-access-lnwjq\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.996450 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63798ee6-b629-437f-9e15-5bd46b79894e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:02 crc kubenswrapper[4771]: I0219 21:53:02.998066 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"63798ee6-b629-437f-9e15-5bd46b79894e" (UID: "63798ee6-b629-437f-9e15-5bd46b79894e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.000909 4771 scope.go:117] "RemoveContainer" containerID="4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.002568 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.042373 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e5f8191-842e-4a39-ac28-b0042c51f813" (UID: "1e5f8191-842e-4a39-ac28-b0042c51f813"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.043953 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-config-data" (OuterVolumeSpecName: "config-data") pod "4938b4a1-32f8-4e7a-b334-8ce3b649fb46" (UID: "4938b4a1-32f8-4e7a-b334-8ce3b649fb46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.049143 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9a3b-account-create-update-pstqn"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.050086 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c812f8b-6ad1-4873-8999-e649acd07d91" (UID: "1c812f8b-6ad1-4873-8999-e649acd07d91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.063264 4771 scope.go:117] "RemoveContainer" containerID="8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.063763 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb\": container with ID starting with 8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb not found: ID does not exist" containerID="8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.063805 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb"} err="failed to get container status \"8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb\": rpc error: code = NotFound desc = could not find container \"8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb\": container with ID starting with 8b945e6a3e1e277746f4c218de4fe2d1728efc9e11cf6ff7c2d6c4f70b2480eb not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.063834 4771 scope.go:117] "RemoveContainer" containerID="4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.064156 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d\": container with ID starting with 4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d not found: ID does not exist" containerID="4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.064175 
4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d"} err="failed to get container status \"4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d\": rpc error: code = NotFound desc = could not find container \"4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d\": container with ID starting with 4e21c2780bd18a3c8deea6118fa56b7894b203c40f4ee7b9824e6a9fc7ca007d not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.064189 4771 scope.go:117] "RemoveContainer" containerID="b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.075528 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.082999 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9a3b-account-create-update-pstqn"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.098072 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.098105 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c812f8b-6ad1-4873-8999-e649acd07d91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.098118 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.098129 4771 
reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.098138 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.098567 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6938-account-create-update-jvz85"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.102511 4771 scope.go:117] "RemoveContainer" containerID="8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.109463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbb34081-bc3a-451a-93c5-c28299467781" (UID: "fbb34081-bc3a-451a-93c5-c28299467781"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.111561 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6938-account-create-update-jvz85"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.113298 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "63798ee6-b629-437f-9e15-5bd46b79894e" (UID: "63798ee6-b629-437f-9e15-5bd46b79894e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.119541 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.122811 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-config-data" (OuterVolumeSpecName: "config-data") pod "63798ee6-b629-437f-9e15-5bd46b79894e" (UID: "63798ee6-b629-437f-9e15-5bd46b79894e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.124533 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.126056 4771 scope.go:117] "RemoveContainer" containerID="b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.126368 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1\": container with ID starting with b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1 not found: ID does not exist" containerID="b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.126398 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1"} err="failed to get container status \"b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1\": rpc error: code = NotFound desc = could not find container \"b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1\": container with ID starting with b29ab453a90c816c6fbf6ae0be2bcfde527a9bef841c685ef1b541ba048926a1 not found: ID 
does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.126418 4771 scope.go:117] "RemoveContainer" containerID="8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.126585 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d\": container with ID starting with 8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d not found: ID does not exist" containerID="8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.126605 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d"} err="failed to get container status \"8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d\": rpc error: code = NotFound desc = could not find container \"8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d\": container with ID starting with 8d64fe0590373c274ae8bc873ea599a07b7db1a96e05e29a9261ae28c160547d not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.126617 4771 scope.go:117] "RemoveContainer" containerID="e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.127779 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fbb34081-bc3a-451a-93c5-c28299467781" (UID: "fbb34081-bc3a-451a-93c5-c28299467781"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.129823 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-config-data" (OuterVolumeSpecName: "config-data") pod "fbb34081-bc3a-451a-93c5-c28299467781" (UID: "fbb34081-bc3a-451a-93c5-c28299467781"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.141266 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b64cc52a-4f20-4e05-b444-c46f97727527" (UID: "b64cc52a-4f20-4e05-b444-c46f97727527"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.149643 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1e5f8191-842e-4a39-ac28-b0042c51f813" (UID: "1e5f8191-842e-4a39-ac28-b0042c51f813"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.150373 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4938b4a1-32f8-4e7a-b334-8ce3b649fb46" (UID: "4938b4a1-32f8-4e7a-b334-8ce3b649fb46"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.152654 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data" (OuterVolumeSpecName: "config-data") pod "1e5f8191-842e-4a39-ac28-b0042c51f813" (UID: "1e5f8191-842e-4a39-ac28-b0042c51f813"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.153366 4771 scope.go:117] "RemoveContainer" containerID="89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.163663 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1e5f8191-842e-4a39-ac28-b0042c51f813" (UID: "1e5f8191-842e-4a39-ac28-b0042c51f813"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.192520 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "63798ee6-b629-437f-9e15-5bd46b79894e" (UID: "63798ee6-b629-437f-9e15-5bd46b79894e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.197295 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-config-data" (OuterVolumeSpecName: "config-data") pod "b64cc52a-4f20-4e05-b444-c46f97727527" (UID: "b64cc52a-4f20-4e05-b444-c46f97727527"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199310 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199337 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199346 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b86c07-52cb-4911-8b7b-205eaa247226-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199354 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199365 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7xt2\" (UniqueName: \"kubernetes.io/projected/c7b86c07-52cb-4911-8b7b-205eaa247226-kube-api-access-b7xt2\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199375 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b64cc52a-4f20-4e05-b444-c46f97727527-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199385 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8191-842e-4a39-ac28-b0042c51f813-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 
21:53:03.199393 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199401 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4938b4a1-32f8-4e7a-b334-8ce3b649fb46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199411 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199419 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199427 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199436 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb34081-bc3a-451a-93c5-c28299467781-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199444 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63798ee6-b629-437f-9e15-5bd46b79894e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199555 4771 scope.go:117] "RemoveContainer" 
containerID="e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.199939 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b\": container with ID starting with e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b not found: ID does not exist" containerID="e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199960 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b"} err="failed to get container status \"e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b\": rpc error: code = NotFound desc = could not find container \"e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b\": container with ID starting with e4eb557b846aa743a1abd6bdbdac62d73bdcdbe22fe6ea22b84dbc9890de351b not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.199980 4771 scope.go:117] "RemoveContainer" containerID="89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.200323 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7\": container with ID starting with 89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7 not found: ID does not exist" containerID="89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.200339 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7"} err="failed to get container status \"89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7\": rpc error: code = NotFound desc = could not find container \"89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7\": container with ID starting with 89180e497b8520075a64283bac4652ec441417d40df8608e6046fb1463b8e6a7 not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.200350 4771 scope.go:117] "RemoveContainer" containerID="8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.251594 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.261179 4771 scope.go:117] "RemoveContainer" containerID="4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.262210 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.271760 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.279162 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.287093 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-dbcb5799b-x6nw7"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.292216 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-dbcb5799b-x6nw7"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.324483 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.326715 4771 scope.go:117] 
"RemoveContainer" containerID="8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.327593 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5\": container with ID starting with 8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5 not found: ID does not exist" containerID="8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.327627 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5"} err="failed to get container status \"8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5\": rpc error: code = NotFound desc = could not find container \"8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5\": container with ID starting with 8815498a7bc3b894d1f8d39ee434770da9c4f90cb314b1e1876cce9ed8580ae5 not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.327661 4771 scope.go:117] "RemoveContainer" containerID="4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.328970 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba\": container with ID starting with 4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba not found: ID does not exist" containerID="4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.329002 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba"} err="failed to get container status \"4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba\": rpc error: code = NotFound desc = could not find container \"4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba\": container with ID starting with 4a2e46dbfeb2e3270473c3fc7740eaa5f9a60c7ecc8ac923e918b9f746e6b8ba not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.329036 4771 scope.go:117] "RemoveContainer" containerID="afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.329469 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.351373 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sv7nf" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.370338 4771 scope.go:117] "RemoveContainer" containerID="afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.370780 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484\": container with ID starting with afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484 not found: ID does not exist" containerID="afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.370812 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484"} err="failed to get container status \"afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484\": rpc error: code = 
NotFound desc = could not find container \"afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484\": container with ID starting with afd23958f30e6bee71ba1ce2d19fc2fe1323e921a29dbf76c79c3eacf769c484 not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.370835 4771 scope.go:117] "RemoveContainer" containerID="3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.403820 4771 scope.go:117] "RemoveContainer" containerID="286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.421163 4771 scope.go:117] "RemoveContainer" containerID="3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.421675 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0\": container with ID starting with 3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0 not found: ID does not exist" containerID="3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.421705 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0"} err="failed to get container status \"3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0\": rpc error: code = NotFound desc = could not find container \"3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0\": container with ID starting with 3da5b3abc617c20915b52495ebcc17f8a760ffd297ddee5abbdf5f32b216adb0 not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.421725 4771 scope.go:117] "RemoveContainer" 
containerID="286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.422027 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267\": container with ID starting with 286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267 not found: ID does not exist" containerID="286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.422047 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267"} err="failed to get container status \"286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267\": rpc error: code = NotFound desc = could not find container \"286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267\": container with ID starting with 286869aa8dbaca81c5b9a140c23dc6ed34b70606b4abff07fc12b1b19edd6267 not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.422062 4771 scope.go:117] "RemoveContainer" containerID="8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.445330 4771 scope.go:117] "RemoveContainer" containerID="8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc" Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.445707 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc\": container with ID starting with 8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc not found: ID does not exist" containerID="8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc" Feb 19 21:53:03 crc 
kubenswrapper[4771]: I0219 21:53:03.445736 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc"} err="failed to get container status \"8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc\": rpc error: code = NotFound desc = could not find container \"8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc\": container with ID starting with 8fbde4e87d1899b2db861444b16492a441490948eb84a875421f2c14dba7f8bc not found: ID does not exist" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.445757 4771 scope.go:117] "RemoveContainer" containerID="b9023b235acac843c6d06970035afa4b4d903cfed719a74a48514a2a10f8f915" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.515687 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts\") pod \"3e4401f3-2019-40a6-8829-5e22598f176a\" (UID: \"3e4401f3-2019-40a6-8829-5e22598f176a\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.515932 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqnqg\" (UniqueName: \"kubernetes.io/projected/3e4401f3-2019-40a6-8829-5e22598f176a-kube-api-access-bqnqg\") pod \"3e4401f3-2019-40a6-8829-5e22598f176a\" (UID: \"3e4401f3-2019-40a6-8829-5e22598f176a\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.517291 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e4401f3-2019-40a6-8829-5e22598f176a" (UID: "3e4401f3-2019-40a6-8829-5e22598f176a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.551394 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4401f3-2019-40a6-8829-5e22598f176a-kube-api-access-bqnqg" (OuterVolumeSpecName: "kube-api-access-bqnqg") pod "3e4401f3-2019-40a6-8829-5e22598f176a" (UID: "3e4401f3-2019-40a6-8829-5e22598f176a"). InnerVolumeSpecName "kube-api-access-bqnqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.618028 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqnqg\" (UniqueName: \"kubernetes.io/projected/3e4401f3-2019-40a6-8829-5e22598f176a-kube-api-access-bqnqg\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.618056 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4401f3-2019-40a6-8829-5e22598f176a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.624971 4771 scope.go:117] "RemoveContainer" containerID="4db106a42f266426c6257e14ae4da294c4869692c1f876160b68b9126d307faf" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.638308 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.641836 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.648872 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.654285 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.662906 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.718798 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-generated\") pod \"4368f372-87b3-4f85-95f9-72c2046b0cc7\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.718883 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgd2h\" (UniqueName: \"kubernetes.io/projected/4368f372-87b3-4f85-95f9-72c2046b0cc7-kube-api-access-vgd2h\") pod \"4368f372-87b3-4f85-95f9-72c2046b0cc7\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.718913 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-kolla-config\") pod \"4368f372-87b3-4f85-95f9-72c2046b0cc7\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.718942 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-combined-ca-bundle\") pod \"4368f372-87b3-4f85-95f9-72c2046b0cc7\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.718997 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"4368f372-87b3-4f85-95f9-72c2046b0cc7\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.719078 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-default\") pod \"4368f372-87b3-4f85-95f9-72c2046b0cc7\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.719135 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-galera-tls-certs\") pod \"4368f372-87b3-4f85-95f9-72c2046b0cc7\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.719173 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-operator-scripts\") pod \"4368f372-87b3-4f85-95f9-72c2046b0cc7\" (UID: \"4368f372-87b3-4f85-95f9-72c2046b0cc7\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.720052 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4368f372-87b3-4f85-95f9-72c2046b0cc7" (UID: "4368f372-87b3-4f85-95f9-72c2046b0cc7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.721302 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4368f372-87b3-4f85-95f9-72c2046b0cc7" (UID: "4368f372-87b3-4f85-95f9-72c2046b0cc7"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.722657 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4368f372-87b3-4f85-95f9-72c2046b0cc7" (UID: "4368f372-87b3-4f85-95f9-72c2046b0cc7"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.724004 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4368f372-87b3-4f85-95f9-72c2046b0cc7" (UID: "4368f372-87b3-4f85-95f9-72c2046b0cc7"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.729175 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4368f372-87b3-4f85-95f9-72c2046b0cc7-kube-api-access-vgd2h" (OuterVolumeSpecName: "kube-api-access-vgd2h") pod "4368f372-87b3-4f85-95f9-72c2046b0cc7" (UID: "4368f372-87b3-4f85-95f9-72c2046b0cc7"). InnerVolumeSpecName "kube-api-access-vgd2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.735888 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "4368f372-87b3-4f85-95f9-72c2046b0cc7" (UID: "4368f372-87b3-4f85-95f9-72c2046b0cc7"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.737889 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b9474445c-zjs86" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.764159 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "4368f372-87b3-4f85-95f9-72c2046b0cc7" (UID: "4368f372-87b3-4f85-95f9-72c2046b0cc7"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.770968 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4368f372-87b3-4f85-95f9-72c2046b0cc7" (UID: "4368f372-87b3-4f85-95f9-72c2046b0cc7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.820344 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-scripts\") pod \"4b605459-c654-463d-b66c-ec804185ea7d\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.820381 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-internal-tls-certs\") pod \"4b605459-c654-463d-b66c-ec804185ea7d\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.820466 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-config-data\") pod \"4b605459-c654-463d-b66c-ec804185ea7d\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.820522 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmbcg\" (UniqueName: \"kubernetes.io/projected/4b605459-c654-463d-b66c-ec804185ea7d-kube-api-access-kmbcg\") pod \"4b605459-c654-463d-b66c-ec804185ea7d\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.820562 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-fernet-keys\") pod \"4b605459-c654-463d-b66c-ec804185ea7d\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.820607 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-combined-ca-bundle\") pod \"4b605459-c654-463d-b66c-ec804185ea7d\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.820627 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-public-tls-certs\") pod \"4b605459-c654-463d-b66c-ec804185ea7d\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.820677 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-credential-keys\") pod \"4b605459-c654-463d-b66c-ec804185ea7d\" (UID: \"4b605459-c654-463d-b66c-ec804185ea7d\") " Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.821042 4771 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.821059 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.821069 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.821080 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgd2h\" (UniqueName: \"kubernetes.io/projected/4368f372-87b3-4f85-95f9-72c2046b0cc7-kube-api-access-vgd2h\") on node \"crc\" DevicePath \"\"" Feb 19 
21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.821090 4771 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-kolla-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.821100 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4368f372-87b3-4f85-95f9-72c2046b0cc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.821119 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.821129 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4368f372-87b3-4f85-95f9-72c2046b0cc7-config-data-default\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.822188 4771 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 19 21:53:03 crc kubenswrapper[4771]: E0219 21:53:03.822258 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data podName:17cd62c3-f3af-4144-b6b1-ec2cafb424ad nodeName:}" failed. No retries permitted until 2026-02-19 21:53:11.822241381 +0000 UTC m=+1492.093683841 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data") pod "rabbitmq-cell1-server-0" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad") : configmap "rabbitmq-cell1-config-data" not found
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.824687 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4b605459-c654-463d-b66c-ec804185ea7d" (UID: "4b605459-c654-463d-b66c-ec804185ea7d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.826696 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b605459-c654-463d-b66c-ec804185ea7d-kube-api-access-kmbcg" (OuterVolumeSpecName: "kube-api-access-kmbcg") pod "4b605459-c654-463d-b66c-ec804185ea7d" (UID: "4b605459-c654-463d-b66c-ec804185ea7d"). InnerVolumeSpecName "kube-api-access-kmbcg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.826781 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4b605459-c654-463d-b66c-ec804185ea7d" (UID: "4b605459-c654-463d-b66c-ec804185ea7d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.826875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-scripts" (OuterVolumeSpecName: "scripts") pod "4b605459-c654-463d-b66c-ec804185ea7d" (UID: "4b605459-c654-463d-b66c-ec804185ea7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.839897 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.849872 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-config-data" (OuterVolumeSpecName: "config-data") pod "4b605459-c654-463d-b66c-ec804185ea7d" (UID: "4b605459-c654-463d-b66c-ec804185ea7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.851183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b605459-c654-463d-b66c-ec804185ea7d" (UID: "4b605459-c654-463d-b66c-ec804185ea7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.873168 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b605459-c654-463d-b66c-ec804185ea7d" (UID: "4b605459-c654-463d-b66c-ec804185ea7d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.884967 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4b605459-c654-463d-b66c-ec804185ea7d" (UID: "4b605459-c654-463d-b66c-ec804185ea7d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922531 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922567 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922580 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922593 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922604 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmbcg\" (UniqueName: \"kubernetes.io/projected/4b605459-c654-463d-b66c-ec804185ea7d-kube-api-access-kmbcg\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922615 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922626 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922637 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:03 crc kubenswrapper[4771]: I0219 21:53:03.922647 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b605459-c654-463d-b66c-ec804185ea7d-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:03.997324 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sv7nf" event={"ID":"3e4401f3-2019-40a6-8829-5e22598f176a","Type":"ContainerDied","Data":"7291a0c26bbe5ef8bc71ff230695a118a89ae97ecec6d1667f035f97b5c5534f"}
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:03.997406 4771 scope.go:117] "RemoveContainer" containerID="e3a28c6e19caec26422bd2d82f956078d8831cbf0eb0eb927d8feeec18b08e83"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:03.997355 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sv7nf"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.002937 4771 generic.go:334] "Generic (PLEG): container finished" podID="4b605459-c654-463d-b66c-ec804185ea7d" containerID="2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e" exitCode=0
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.003066 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b9474445c-zjs86"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.003415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b9474445c-zjs86" event={"ID":"4b605459-c654-463d-b66c-ec804185ea7d","Type":"ContainerDied","Data":"2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e"}
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.003448 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b9474445c-zjs86" event={"ID":"4b605459-c654-463d-b66c-ec804185ea7d","Type":"ContainerDied","Data":"5469734f697a546fed81543e7f4451be5f08040e6f65ecae3a88729a12770722"}
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.013719 4771 generic.go:334] "Generic (PLEG): container finished" podID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerID="c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9" exitCode=0
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.013797 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.013819 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4368f372-87b3-4f85-95f9-72c2046b0cc7","Type":"ContainerDied","Data":"c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9"}
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.013864 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4368f372-87b3-4f85-95f9-72c2046b0cc7","Type":"ContainerDied","Data":"26afb60f64e505e9dcd57b481fa42cfb8da3ba5693b405f2a7d85720dc87b465"}
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.078644 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-sv7nf"]
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.082837 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-sv7nf"]
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.091477 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.097248 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"]
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.101868 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7b9474445c-zjs86"]
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.106086 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7b9474445c-zjs86"]
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.125977 4771 scope.go:117] "RemoveContainer" containerID="2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.164128 4771 scope.go:117] "RemoveContainer" containerID="2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e"
Feb 19 21:53:04 crc kubenswrapper[4771]: E0219 21:53:04.164716 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e\": container with ID starting with 2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e not found: ID does not exist" containerID="2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.164776 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e"} err="failed to get container status \"2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e\": rpc error: code = NotFound desc = could not find container \"2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e\": container with ID starting with 2be0241bf861b325fab54cb0693c2f3ab12c3ec289d40b86c7c4ed86cbfb0b4e not found: ID does not exist"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.164803 4771 scope.go:117] "RemoveContainer" containerID="c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.197736 4771 scope.go:117] "RemoveContainer" containerID="ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.248233 4771 scope.go:117] "RemoveContainer" containerID="c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9"
Feb 19 21:53:04 crc kubenswrapper[4771]: E0219 21:53:04.249341 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9\": container with ID starting with c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9 not found: ID does not exist" containerID="c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.249378 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9"} err="failed to get container status \"c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9\": rpc error: code = NotFound desc = could not find container \"c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9\": container with ID starting with c8fe40b6e4c563654b64d6d76c3fe73e33182912a3fe36bdff74b8f686ef7ca9 not found: ID does not exist"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.249428 4771 scope.go:117] "RemoveContainer" containerID="ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7"
Feb 19 21:53:04 crc kubenswrapper[4771]: E0219 21:53:04.249907 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7\": container with ID starting with ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7 not found: ID does not exist" containerID="ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.249976 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7"} err="failed to get container status \"ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7\": rpc error: code = NotFound desc = could not find container \"ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7\": container with ID starting with ed539aa3723d89b02133cc21481b1361923e0a503c310d4f23ca409d917de9f7 not found: ID does not exist"
Feb 19 21:53:04 crc kubenswrapper[4771]: E0219 21:53:04.357511 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:53:04 crc kubenswrapper[4771]: E0219 21:53:04.361085 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:53:04 crc kubenswrapper[4771]: E0219 21:53:04.371328 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:53:04 crc kubenswrapper[4771]: E0219 21:53:04.371385 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="98bf8c7c-234c-47ea-86aa-4b1313f7b983" containerName="nova-scheduler-scheduler"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.492012 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f06a8fe-d882-40f7-ac43-4eb2753d0ae4" path="/var/lib/kubelet/pods/0f06a8fe-d882-40f7-ac43-4eb2753d0ae4/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.494072 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c812f8b-6ad1-4873-8999-e649acd07d91" path="/var/lib/kubelet/pods/1c812f8b-6ad1-4873-8999-e649acd07d91/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.497379 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" path="/var/lib/kubelet/pods/1e5f8191-842e-4a39-ac28-b0042c51f813/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.500519 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" path="/var/lib/kubelet/pods/306067f7-88de-4cdb-8ca2-3540ada9b006/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.502376 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" path="/var/lib/kubelet/pods/3e4401f3-2019-40a6-8829-5e22598f176a/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.512726 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4368f372-87b3-4f85-95f9-72c2046b0cc7" path="/var/lib/kubelet/pods/4368f372-87b3-4f85-95f9-72c2046b0cc7/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.513464 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4938b4a1-32f8-4e7a-b334-8ce3b649fb46" path="/var/lib/kubelet/pods/4938b4a1-32f8-4e7a-b334-8ce3b649fb46/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.513907 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b605459-c654-463d-b66c-ec804185ea7d" path="/var/lib/kubelet/pods/4b605459-c654-463d-b66c-ec804185ea7d/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.515936 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" path="/var/lib/kubelet/pods/63798ee6-b629-437f-9e15-5bd46b79894e/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.517395 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680c2961-e33c-44b1-aadd-37556bf4839c" path="/var/lib/kubelet/pods/680c2961-e33c-44b1-aadd-37556bf4839c/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.520052 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" path="/var/lib/kubelet/pods/6d6a922b-a8fd-4f68-95e9-398e5a38bc6e/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.522113 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3b6c92-f6c8-4ea1-9bbb-3a5737468982" path="/var/lib/kubelet/pods/af3b6c92-f6c8-4ea1-9bbb-3a5737468982/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.524093 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" path="/var/lib/kubelet/pods/b64cc52a-4f20-4e05-b444-c46f97727527/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.527126 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b86c07-52cb-4911-8b7b-205eaa247226" path="/var/lib/kubelet/pods/c7b86c07-52cb-4911-8b7b-205eaa247226/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.527570 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbb34081-bc3a-451a-93c5-c28299467781" path="/var/lib/kubelet/pods/fbb34081-bc3a-451a-93c5-c28299467781/volumes"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.528960 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56b8bc52-11f9-4adf-8ba3-fbe39197b5aa/ovn-northd/0.log"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.529055 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.587416 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.635919 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-metrics-certs-tls-certs\") pod \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.635990 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-config\") pod \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.636057 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-northd-tls-certs\") pod \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.636172 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm77m\" (UniqueName: \"kubernetes.io/projected/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-kube-api-access-xm77m\") pod \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.636232 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-combined-ca-bundle\") pod \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.636279 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-scripts\") pod \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.636354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-rundir\") pod \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\" (UID: \"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.637171 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" (UID: "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.638903 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-scripts" (OuterVolumeSpecName: "scripts") pod "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" (UID: "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.638956 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-config" (OuterVolumeSpecName: "config") pod "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" (UID: "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.642445 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-kube-api-access-xm77m" (OuterVolumeSpecName: "kube-api-access-xm77m") pod "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" (UID: "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa"). InnerVolumeSpecName "kube-api-access-xm77m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.657215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" (UID: "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.694450 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" (UID: "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.714829 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" (UID: "56b8bc52-11f9-4adf-8ba3-fbe39197b5aa"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-confd\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738328 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-server-conf\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738357 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnr6x\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-kube-api-access-dnr6x\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738387 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738465 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-pod-info\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-erlang-cookie\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738502 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-plugins\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738518 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-tls\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738554 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-erlang-cookie-secret\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738582 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-plugins-conf\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.738600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\" (UID: \"c671dcf6-b1eb-4c4e-ba71-ae115ce811da\") "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.739037 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm77m\" (UniqueName: \"kubernetes.io/projected/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-kube-api-access-xm77m\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.739051 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.739060 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.739068 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-rundir\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.739077 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.739085 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.739093 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.739822 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.740215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.740556 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.741917 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.742646 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.743787 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-pod-info" (OuterVolumeSpecName: "pod-info") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.743842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-kube-api-access-dnr6x" (OuterVolumeSpecName: "kube-api-access-dnr6x") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "kube-api-access-dnr6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.745160 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.767329 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data" (OuterVolumeSpecName: "config-data") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.774524 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-server-conf" (OuterVolumeSpecName: "server-conf") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.806857 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c671dcf6-b1eb-4c4e-ba71-ae115ce811da" (UID: "c671dcf6-b1eb-4c4e-ba71-ae115ce811da"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841277 4771 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-pod-info\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841308 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841319 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841329 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841337 4771 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841347 4771 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841367 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841376 4771
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841385 4771 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841393 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnr6x\" (UniqueName: \"kubernetes.io/projected/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-kube-api-access-dnr6x\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.841400 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c671dcf6-b1eb-4c4e-ba71-ae115ce811da-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.859056 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 19 21:53:04 crc kubenswrapper[4771]: I0219 21:53:04.943283 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.033773 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56b8bc52-11f9-4adf-8ba3-fbe39197b5aa/ovn-northd/0.log" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.033830 4771 generic.go:334] "Generic (PLEG): container finished" podID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerID="afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d" exitCode=139 Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 
21:53:05.033940 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.034097 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa","Type":"ContainerDied","Data":"afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d"} Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.034141 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56b8bc52-11f9-4adf-8ba3-fbe39197b5aa","Type":"ContainerDied","Data":"4dd0f23e8016717430615b1264d0540e72cfdc1b1f77df4ef1637e8a7645fbed"} Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.034347 4771 scope.go:117] "RemoveContainer" containerID="8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.046434 4771 generic.go:334] "Generic (PLEG): container finished" podID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerID="2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0" exitCode=0 Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.046486 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.046494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c671dcf6-b1eb-4c4e-ba71-ae115ce811da","Type":"ContainerDied","Data":"2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0"} Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.046520 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c671dcf6-b1eb-4c4e-ba71-ae115ce811da","Type":"ContainerDied","Data":"fab845b0439f6a1a16bdbb8e88d1b1dc36c4ea8558910ecfa0c68190aec76024"} Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.048898 4771 generic.go:334] "Generic (PLEG): container finished" podID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerID="9649db40b00ac2459bb1317904d312d05a0f8a16cff8030457aea979837d6541" exitCode=0 Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.048937 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17cd62c3-f3af-4144-b6b1-ec2cafb424ad","Type":"ContainerDied","Data":"9649db40b00ac2459bb1317904d312d05a0f8a16cff8030457aea979837d6541"} Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.194664 4771 scope.go:117] "RemoveContainer" containerID="afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.202800 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.217659 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.224543 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.228478 4771 scope.go:117] "RemoveContainer" 
containerID="8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292" Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.228935 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292\": container with ID starting with 8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292 not found: ID does not exist" containerID="8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.228973 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292"} err="failed to get container status \"8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292\": rpc error: code = NotFound desc = could not find container \"8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292\": container with ID starting with 8710ec5849f97d32c34bf769a78f5ca9d9bcc3bfec5919c9182b29c3d0ca5292 not found: ID does not exist" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.228995 4771 scope.go:117] "RemoveContainer" containerID="afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d" Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.229254 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d\": container with ID starting with afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d not found: ID does not exist" containerID="afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.229285 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d"} err="failed to get container status \"afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d\": rpc error: code = NotFound desc = could not find container \"afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d\": container with ID starting with afa24ec99480617c743fd8cd74706357a4d75b73169767f7bb1096347b72b75d not found: ID does not exist" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.229302 4771 scope.go:117] "RemoveContainer" containerID="2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.232344 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.290355 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.303142 4771 scope.go:117] "RemoveContainer" containerID="7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9" Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.308811 4771 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 19 21:53:05 crc kubenswrapper[4771]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-19T21:52:58Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:53:05 crc kubenswrapper[4771]: /etc/init.d/functions: line 589: 435 Alarm clock "$@" Feb 19 21:53:05 crc kubenswrapper[4771]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-js7lh" message=< Feb 19 21:53:05 crc kubenswrapper[4771]: Exiting ovn-controller (1) [FAILED] Feb 19 21:53:05 crc kubenswrapper[4771]: Killing ovn-controller (1) [ OK ] Feb 19 21:53:05 crc kubenswrapper[4771]: Killing ovn-controller 
(1) with SIGKILL [ OK ] Feb 19 21:53:05 crc kubenswrapper[4771]: 2026-02-19T21:52:58Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:53:05 crc kubenswrapper[4771]: /etc/init.d/functions: line 589: 435 Alarm clock "$@" Feb 19 21:53:05 crc kubenswrapper[4771]: > Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.308837 4771 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 19 21:53:05 crc kubenswrapper[4771]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-19T21:52:58Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:53:05 crc kubenswrapper[4771]: /etc/init.d/functions: line 589: 435 Alarm clock "$@" Feb 19 21:53:05 crc kubenswrapper[4771]: > pod="openstack/ovn-controller-js7lh" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller" containerID="cri-o://eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.308865 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-js7lh" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller" containerID="cri-o://eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8" gracePeriod=22 Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349170 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349224 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-server-conf\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc 
kubenswrapper[4771]: I0219 21:53:05.349279 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-tls\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349362 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-plugins\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349420 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-erlang-cookie-secret\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349499 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdldr\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-kube-api-access-mdldr\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349523 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-plugins-conf\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349552 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-erlang-cookie\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-pod-info\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.349664 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-confd\") pod \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\" (UID: \"17cd62c3-f3af-4144-b6b1-ec2cafb424ad\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.351812 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.352316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.352692 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.354814 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.355427 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.355564 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-kube-api-access-mdldr" (OuterVolumeSpecName: "kube-api-access-mdldr") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "kube-api-access-mdldr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.356625 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-pod-info" (OuterVolumeSpecName: "pod-info") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.357790 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.401124 4771 scope.go:117] "RemoveContainer" containerID="2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0" Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.401853 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0\": container with ID starting with 2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0 not found: ID does not exist" containerID="2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.401875 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0"} err="failed to get container status \"2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0\": rpc error: code = NotFound desc = could not find container \"2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0\": container with ID starting with 2b4a7475834e30fbce0755832a766cce81af5049adbeb40ca351d1984f8d17c0 not found: ID does not exist" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.401895 4771 scope.go:117] "RemoveContainer" containerID="7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9" Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.402086 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9\": container with ID starting with 7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9 not found: ID does not exist" containerID="7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 
21:53:05.402101 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9"} err="failed to get container status \"7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9\": rpc error: code = NotFound desc = could not find container \"7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9\": container with ID starting with 7c54eac07c71914fb33fe180f9bcfe5dbf3ef859aa4d5f78b488e3758fae42c9 not found: ID does not exist" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.414673 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data" (OuterVolumeSpecName: "config-data") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.421916 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-server-conf" (OuterVolumeSpecName: "server-conf") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.451885 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.451916 4771 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.451927 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdldr\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-kube-api-access-mdldr\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.451937 4771 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.451947 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.451955 4771 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.451974 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 
21:53:05.451985 4771 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.451992 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.452002 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.468301 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "17cd62c3-f3af-4144-b6b1-ec2cafb424ad" (UID: "17cd62c3-f3af-4144-b6b1-ec2cafb424ad"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.468742 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.470068 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8 is running failed: container process not found" containerID="eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.470321 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8 is running failed: container process not found" containerID="eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.470492 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8 is running failed: container process not found" containerID="eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.470516 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8 is running failed: 
container process not found" probeType="Readiness" pod="openstack/ovn-controller-js7lh" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller" Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.488779 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.490172 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.492496 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.492522 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd" Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.501670 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.509114 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.513059 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:05 crc kubenswrapper[4771]: E0219 21:53:05.513087 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.553437 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17cd62c3-f3af-4144-b6b1-ec2cafb424ad-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.553465 4771 
reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.592327 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7965bd8f87-ttph6" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.593596 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.663170 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-scripts\") pod \"6733d997-54a6-4c46-8692-db99d5a20a9e\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.663257 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-sg-core-conf-yaml\") pod \"6733d997-54a6-4c46-8692-db99d5a20a9e\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.663291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-log-httpd\") pod \"6733d997-54a6-4c46-8692-db99d5a20a9e\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.663327 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-logs\") pod \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.663343 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data-custom\") pod \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.663785 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6733d997-54a6-4c46-8692-db99d5a20a9e" (UID: "6733d997-54a6-4c46-8692-db99d5a20a9e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.663975 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-logs" (OuterVolumeSpecName: "logs") pod "1b36b1b4-e091-4032-90aa-7f983b5c4b4f" (UID: "1b36b1b4-e091-4032-90aa-7f983b5c4b4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.665495 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-combined-ca-bundle\") pod \"6733d997-54a6-4c46-8692-db99d5a20a9e\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.667166 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1b36b1b4-e091-4032-90aa-7f983b5c4b4f" (UID: "1b36b1b4-e091-4032-90aa-7f983b5c4b4f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.667354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-config-data\") pod \"6733d997-54a6-4c46-8692-db99d5a20a9e\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.667455 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-ceilometer-tls-certs\") pod \"6733d997-54a6-4c46-8692-db99d5a20a9e\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.667563 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hcmh\" (UniqueName: \"kubernetes.io/projected/6733d997-54a6-4c46-8692-db99d5a20a9e-kube-api-access-4hcmh\") pod \"6733d997-54a6-4c46-8692-db99d5a20a9e\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.667915 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data\") pod \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.668031 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxqds\" (UniqueName: \"kubernetes.io/projected/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-kube-api-access-fxqds\") pod \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.668214 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-combined-ca-bundle\") pod \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\" (UID: \"1b36b1b4-e091-4032-90aa-7f983b5c4b4f\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.668317 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-run-httpd\") pod \"6733d997-54a6-4c46-8692-db99d5a20a9e\" (UID: \"6733d997-54a6-4c46-8692-db99d5a20a9e\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.669600 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.671236 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.671448 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.674155 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6733d997-54a6-4c46-8692-db99d5a20a9e-kube-api-access-4hcmh" (OuterVolumeSpecName: "kube-api-access-4hcmh") pod "6733d997-54a6-4c46-8692-db99d5a20a9e" (UID: "6733d997-54a6-4c46-8692-db99d5a20a9e"). InnerVolumeSpecName "kube-api-access-4hcmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.676257 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6733d997-54a6-4c46-8692-db99d5a20a9e" (UID: "6733d997-54a6-4c46-8692-db99d5a20a9e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.676800 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-scripts" (OuterVolumeSpecName: "scripts") pod "6733d997-54a6-4c46-8692-db99d5a20a9e" (UID: "6733d997-54a6-4c46-8692-db99d5a20a9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.691037 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-kube-api-access-fxqds" (OuterVolumeSpecName: "kube-api-access-fxqds") pod "1b36b1b4-e091-4032-90aa-7f983b5c4b4f" (UID: "1b36b1b4-e091-4032-90aa-7f983b5c4b4f"). InnerVolumeSpecName "kube-api-access-fxqds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.700793 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b36b1b4-e091-4032-90aa-7f983b5c4b4f" (UID: "1b36b1b4-e091-4032-90aa-7f983b5c4b4f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.702166 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6733d997-54a6-4c46-8692-db99d5a20a9e" (UID: "6733d997-54a6-4c46-8692-db99d5a20a9e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.715749 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data" (OuterVolumeSpecName: "config-data") pod "1b36b1b4-e091-4032-90aa-7f983b5c4b4f" (UID: "1b36b1b4-e091-4032-90aa-7f983b5c4b4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.721374 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6733d997-54a6-4c46-8692-db99d5a20a9e" (UID: "6733d997-54a6-4c46-8692-db99d5a20a9e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.732824 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.738292 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.740366 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6733d997-54a6-4c46-8692-db99d5a20a9e" (UID: "6733d997-54a6-4c46-8692-db99d5a20a9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.742833 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-js7lh_d016daf3-054a-4914-8711-fc82edab9f88/ovn-controller/0.log" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.742963 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-js7lh" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.752642 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-config-data" (OuterVolumeSpecName: "config-data") pod "6733d997-54a6-4c46-8692-db99d5a20a9e" (UID: "6733d997-54a6-4c46-8692-db99d5a20a9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.772914 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6733d997-54a6-4c46-8692-db99d5a20a9e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.772943 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.772951 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.772963 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.772971 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.772979 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6733d997-54a6-4c46-8692-db99d5a20a9e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.772987 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hcmh\" (UniqueName: \"kubernetes.io/projected/6733d997-54a6-4c46-8692-db99d5a20a9e-kube-api-access-4hcmh\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.772995 4771 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.773004 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxqds\" (UniqueName: \"kubernetes.io/projected/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-kube-api-access-fxqds\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.773012 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b36b1b4-e091-4032-90aa-7f983b5c4b4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.874736 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a201b832-5d71-4947-98ba-02adf91bccd5-logs\") pod \"a201b832-5d71-4947-98ba-02adf91bccd5\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.874820 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvvxk\" (UniqueName: \"kubernetes.io/projected/98bf8c7c-234c-47ea-86aa-4b1313f7b983-kube-api-access-kvvxk\") pod \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.874870 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-combined-ca-bundle\") pod \"a201b832-5d71-4947-98ba-02adf91bccd5\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.874960 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g5tj\" (UniqueName: 
\"kubernetes.io/projected/d016daf3-054a-4914-8711-fc82edab9f88-kube-api-access-8g5tj\") pod \"d016daf3-054a-4914-8711-fc82edab9f88\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.874997 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data-custom\") pod \"a201b832-5d71-4947-98ba-02adf91bccd5\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875060 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-ovn-controller-tls-certs\") pod \"d016daf3-054a-4914-8711-fc82edab9f88\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875104 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-config-data\") pod \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875150 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-log-ovn\") pod \"d016daf3-054a-4914-8711-fc82edab9f88\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875181 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run-ovn\") pod \"d016daf3-054a-4914-8711-fc82edab9f88\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " Feb 19 21:53:05 crc kubenswrapper[4771]: 
I0219 21:53:05.875230 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-combined-ca-bundle\") pod \"d016daf3-054a-4914-8711-fc82edab9f88\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875264 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d016daf3-054a-4914-8711-fc82edab9f88-scripts\") pod \"d016daf3-054a-4914-8711-fc82edab9f88\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875303 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-combined-ca-bundle\") pod \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\" (UID: \"98bf8c7c-234c-47ea-86aa-4b1313f7b983\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875332 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data\") pod \"a201b832-5d71-4947-98ba-02adf91bccd5\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875393 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run\") pod \"d016daf3-054a-4914-8711-fc82edab9f88\" (UID: \"d016daf3-054a-4914-8711-fc82edab9f88\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.875431 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc8dx\" (UniqueName: \"kubernetes.io/projected/a201b832-5d71-4947-98ba-02adf91bccd5-kube-api-access-dc8dx\") pod 
\"a201b832-5d71-4947-98ba-02adf91bccd5\" (UID: \"a201b832-5d71-4947-98ba-02adf91bccd5\") " Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.876191 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d016daf3-054a-4914-8711-fc82edab9f88" (UID: "d016daf3-054a-4914-8711-fc82edab9f88"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.876496 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d016daf3-054a-4914-8711-fc82edab9f88" (UID: "d016daf3-054a-4914-8711-fc82edab9f88"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.876693 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run" (OuterVolumeSpecName: "var-run") pod "d016daf3-054a-4914-8711-fc82edab9f88" (UID: "d016daf3-054a-4914-8711-fc82edab9f88"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.878979 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a201b832-5d71-4947-98ba-02adf91bccd5-logs" (OuterVolumeSpecName: "logs") pod "a201b832-5d71-4947-98ba-02adf91bccd5" (UID: "a201b832-5d71-4947-98ba-02adf91bccd5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.879392 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d016daf3-054a-4914-8711-fc82edab9f88-scripts" (OuterVolumeSpecName: "scripts") pod "d016daf3-054a-4914-8711-fc82edab9f88" (UID: "d016daf3-054a-4914-8711-fc82edab9f88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.881162 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a201b832-5d71-4947-98ba-02adf91bccd5" (UID: "a201b832-5d71-4947-98ba-02adf91bccd5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.881313 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bf8c7c-234c-47ea-86aa-4b1313f7b983-kube-api-access-kvvxk" (OuterVolumeSpecName: "kube-api-access-kvvxk") pod "98bf8c7c-234c-47ea-86aa-4b1313f7b983" (UID: "98bf8c7c-234c-47ea-86aa-4b1313f7b983"). InnerVolumeSpecName "kube-api-access-kvvxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.885245 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a201b832-5d71-4947-98ba-02adf91bccd5-kube-api-access-dc8dx" (OuterVolumeSpecName: "kube-api-access-dc8dx") pod "a201b832-5d71-4947-98ba-02adf91bccd5" (UID: "a201b832-5d71-4947-98ba-02adf91bccd5"). InnerVolumeSpecName "kube-api-access-dc8dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.887982 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d016daf3-054a-4914-8711-fc82edab9f88-kube-api-access-8g5tj" (OuterVolumeSpecName: "kube-api-access-8g5tj") pod "d016daf3-054a-4914-8711-fc82edab9f88" (UID: "d016daf3-054a-4914-8711-fc82edab9f88"). InnerVolumeSpecName "kube-api-access-8g5tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.897162 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a201b832-5d71-4947-98ba-02adf91bccd5" (UID: "a201b832-5d71-4947-98ba-02adf91bccd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.899274 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98bf8c7c-234c-47ea-86aa-4b1313f7b983" (UID: "98bf8c7c-234c-47ea-86aa-4b1313f7b983"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.912588 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-config-data" (OuterVolumeSpecName: "config-data") pod "98bf8c7c-234c-47ea-86aa-4b1313f7b983" (UID: "98bf8c7c-234c-47ea-86aa-4b1313f7b983"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.920364 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data" (OuterVolumeSpecName: "config-data") pod "a201b832-5d71-4947-98ba-02adf91bccd5" (UID: "a201b832-5d71-4947-98ba-02adf91bccd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.946381 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "d016daf3-054a-4914-8711-fc82edab9f88" (UID: "d016daf3-054a-4914-8711-fc82edab9f88"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.947310 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d016daf3-054a-4914-8711-fc82edab9f88" (UID: "d016daf3-054a-4914-8711-fc82edab9f88"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985121 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g5tj\" (UniqueName: \"kubernetes.io/projected/d016daf3-054a-4914-8711-fc82edab9f88-kube-api-access-8g5tj\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985173 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985192 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985212 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985231 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985247 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985264 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d016daf3-054a-4914-8711-fc82edab9f88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 
21:53:05.985281 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d016daf3-054a-4914-8711-fc82edab9f88-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985298 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bf8c7c-234c-47ea-86aa-4b1313f7b983-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985314 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985330 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d016daf3-054a-4914-8711-fc82edab9f88-var-run\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985346 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc8dx\" (UniqueName: \"kubernetes.io/projected/a201b832-5d71-4947-98ba-02adf91bccd5-kube-api-access-dc8dx\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985364 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a201b832-5d71-4947-98ba-02adf91bccd5-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985381 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvvxk\" (UniqueName: \"kubernetes.io/projected/98bf8c7c-234c-47ea-86aa-4b1313f7b983-kube-api-access-kvvxk\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:05 crc kubenswrapper[4771]: I0219 21:53:05.985397 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a201b832-5d71-4947-98ba-02adf91bccd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.086448 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.086447 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"17cd62c3-f3af-4144-b6b1-ec2cafb424ad","Type":"ContainerDied","Data":"dd0d59f4e36f9194b800f7a23d8e86cdc37a1d94be5b4b9b96d656186b179bf5"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.086627 4771 scope.go:117] "RemoveContainer" containerID="9649db40b00ac2459bb1317904d312d05a0f8a16cff8030457aea979837d6541"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.090843 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-js7lh_d016daf3-054a-4914-8711-fc82edab9f88/ovn-controller/0.log"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.090891 4771 generic.go:334] "Generic (PLEG): container finished" podID="d016daf3-054a-4914-8711-fc82edab9f88" containerID="eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8" exitCode=137
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.090950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh" event={"ID":"d016daf3-054a-4914-8711-fc82edab9f88","Type":"ContainerDied","Data":"eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.090977 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-js7lh" event={"ID":"d016daf3-054a-4914-8711-fc82edab9f88","Type":"ContainerDied","Data":"a9f5f6c807a52b1ed4077a6997a9fd7cd8e3aca97fcd66b27da51717fbcad3cd"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.091232 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-js7lh"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.093915 4771 generic.go:334] "Generic (PLEG): container finished" podID="98bf8c7c-234c-47ea-86aa-4b1313f7b983" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30" exitCode=0
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.093973 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"98bf8c7c-234c-47ea-86aa-4b1313f7b983","Type":"ContainerDied","Data":"6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.093998 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"98bf8c7c-234c-47ea-86aa-4b1313f7b983","Type":"ContainerDied","Data":"81a0cb6ef754df2b0925b0afa093507c7e97ef1c5e12cb0fa5d717bf48b52b4f"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.094063 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.100098 4771 generic.go:334] "Generic (PLEG): container finished" podID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerID="0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8" exitCode=0
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.100285 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7965bd8f87-ttph6"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.100305 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7965bd8f87-ttph6" event={"ID":"1b36b1b4-e091-4032-90aa-7f983b5c4b4f","Type":"ContainerDied","Data":"0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.100369 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7965bd8f87-ttph6" event={"ID":"1b36b1b4-e091-4032-90aa-7f983b5c4b4f","Type":"ContainerDied","Data":"f0619468cc8a83de8d4aa87f9c90269413a4afaaf23cdba825ac780da85e3679"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.103748 4771 generic.go:334] "Generic (PLEG): container finished" podID="a201b832-5d71-4947-98ba-02adf91bccd5" containerID="28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03" exitCode=0
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.103838 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" event={"ID":"a201b832-5d71-4947-98ba-02adf91bccd5","Type":"ContainerDied","Data":"28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.103871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c658556-bs9wt" event={"ID":"a201b832-5d71-4947-98ba-02adf91bccd5","Type":"ContainerDied","Data":"4d378e093609eebcb7349df78092b6f9bdd649912cfa737bf9f2b8c97daea77e"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.103878 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8c658556-bs9wt"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.115231 4771 generic.go:334] "Generic (PLEG): container finished" podID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerID="30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab" exitCode=0
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.115300 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerDied","Data":"30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.115364 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6733d997-54a6-4c46-8692-db99d5a20a9e","Type":"ContainerDied","Data":"a7189f907c3f6278bb817eb96a4052cc309c4db84001bae0fa037b2f77fbb2c5"}
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.115311 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.120563 4771 scope.go:117] "RemoveContainer" containerID="dcc43673db211834491a0c5c33660484520c0820b9253defee749bead7148c83"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.151834 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.165734 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.179117 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.183279 4771 scope.go:117] "RemoveContainer" containerID="eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.220413 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.230676 4771 scope.go:117] "RemoveContainer" containerID="eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.233410 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8\": container with ID starting with eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8 not found: ID does not exist" containerID="eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.233453 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8"} err="failed to get container status \"eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8\": rpc error: code = NotFound desc = could not find container \"eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8\": container with ID starting with eae7e0089ef5d553514050a8af1e9131d1d87025cdf81f7173731d13e0df60a8 not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.233479 4771 scope.go:117] "RemoveContainer" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.261936 4771 scope.go:117] "RemoveContainer" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.265173 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7965bd8f87-ttph6"]
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.265643 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30\": container with ID starting with 6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30 not found: ID does not exist" containerID="6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.269110 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30"} err="failed to get container status \"6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30\": rpc error: code = NotFound desc = could not find container \"6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30\": container with ID starting with 6949aafca83225f4323a6643e8af285dd20b1d9e71a03542059822a1c2967f30 not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.269219 4771 scope.go:117] "RemoveContainer" containerID="0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.270068 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7965bd8f87-ttph6"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.279310 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8c658556-bs9wt"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.289075 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-8c658556-bs9wt"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.289147 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-js7lh"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.295052 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-js7lh"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.316611 4771 scope.go:117] "RemoveContainer" containerID="4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.327220 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.343163 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.376851 4771 scope.go:117] "RemoveContainer" containerID="0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.390166 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8\": container with ID starting with 0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8 not found: ID does not exist" containerID="0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.390361 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8"} err="failed to get container status \"0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8\": rpc error: code = NotFound desc = could not find container \"0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8\": container with ID starting with 0ecb9b280799923172fa1dee143825478502915d4e273c9a5e6d90a94963dff8 not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.390467 4771 scope.go:117] "RemoveContainer" containerID="4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.397111 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a\": container with ID starting with 4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a not found: ID does not exist" containerID="4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.397260 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a"} err="failed to get container status \"4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a\": rpc error: code = NotFound desc = could not find container \"4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a\": container with ID starting with 4a8b7c9af6f0e8593bb8bab1bfc5560e1adfd0c66b07e1be60ec9a90c220f45a not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.397331 4771 scope.go:117] "RemoveContainer" containerID="28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.419672 4771 scope.go:117] "RemoveContainer" containerID="8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.442184 4771 scope.go:117] "RemoveContainer" containerID="28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.443380 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03\": container with ID starting with 28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03 not found: ID does not exist" containerID="28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.443424 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03"} err="failed to get container status \"28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03\": rpc error: code = NotFound desc = could not find container \"28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03\": container with ID starting with 28de1909208f8a7665074255954bd4a891ce8261ac8161400a2ed7828108ef03 not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.443445 4771 scope.go:117] "RemoveContainer" containerID="8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.446499 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a\": container with ID starting with 8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a not found: ID does not exist" containerID="8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.446539 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a"} err="failed to get container status \"8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a\": rpc error: code = NotFound desc = could not find container \"8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a\": container with ID starting with 8a20e19269b476a4166414af80479c0b233fd2a3a181cb8de54bd0fc5a5acb6a not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.446567 4771 scope.go:117] "RemoveContainer" containerID="1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.448300 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" path="/var/lib/kubelet/pods/17cd62c3-f3af-4144-b6b1-ec2cafb424ad/volumes"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.448862 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" path="/var/lib/kubelet/pods/1b36b1b4-e091-4032-90aa-7f983b5c4b4f/volumes"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.450141 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" path="/var/lib/kubelet/pods/56b8bc52-11f9-4adf-8ba3-fbe39197b5aa/volumes"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.450704 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" path="/var/lib/kubelet/pods/6733d997-54a6-4c46-8692-db99d5a20a9e/volumes"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.451352 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bf8c7c-234c-47ea-86aa-4b1313f7b983" path="/var/lib/kubelet/pods/98bf8c7c-234c-47ea-86aa-4b1313f7b983/volumes"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.452309 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" path="/var/lib/kubelet/pods/a201b832-5d71-4947-98ba-02adf91bccd5/volumes"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.454107 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" path="/var/lib/kubelet/pods/c671dcf6-b1eb-4c4e-ba71-ae115ce811da/volumes"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.455129 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d016daf3-054a-4914-8711-fc82edab9f88" path="/var/lib/kubelet/pods/d016daf3-054a-4914-8711-fc82edab9f88/volumes"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.505623 4771 scope.go:117] "RemoveContainer" containerID="b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.525073 4771 scope.go:117] "RemoveContainer" containerID="30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.543586 4771 scope.go:117] "RemoveContainer" containerID="88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.576663 4771 scope.go:117] "RemoveContainer" containerID="1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.577215 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60\": container with ID starting with 1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60 not found: ID does not exist" containerID="1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.577253 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60"} err="failed to get container status \"1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60\": rpc error: code = NotFound desc = could not find container \"1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60\": container with ID starting with 1cd0c6ab7e99448a83723398264c4f213e0ee9cb6b0962a53be304ec72437f60 not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.577282 4771 scope.go:117] "RemoveContainer" containerID="b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.577677 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38\": container with ID starting with b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38 not found: ID does not exist" containerID="b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.577697 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38"} err="failed to get container status \"b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38\": rpc error: code = NotFound desc = could not find container \"b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38\": container with ID starting with b0e3e9b4533ad45e173bc6a9d09245b4f0b63e005eaa093ccfabf5840e284c38 not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.577711 4771 scope.go:117] "RemoveContainer" containerID="30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.578083 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab\": container with ID starting with 30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab not found: ID does not exist" containerID="30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.578126 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab"} err="failed to get container status \"30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab\": rpc error: code = NotFound desc = could not find container \"30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab\": container with ID starting with 30258688903e0a7c339b0a903499f61962c574c1bed818e9de453683dfb054ab not found: ID does not exist"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.578154 4771 scope.go:117] "RemoveContainer" containerID="88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98"
Feb 19 21:53:06 crc kubenswrapper[4771]: E0219 21:53:06.578614 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98\": container with ID starting with 88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98 not found: ID does not exist" containerID="88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98"
Feb 19 21:53:06 crc kubenswrapper[4771]: I0219 21:53:06.578640 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98"} err="failed to get container status \"88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98\": rpc error: code = NotFound desc = could not find container \"88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98\": container with ID starting with 88dedb904debac2e87816a039179a8c0cf5886a811053891ae2688254363dd98 not found: ID does not exist"
Feb 19 21:53:09 crc kubenswrapper[4771]: I0219 21:53:09.968233 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6c4d6b9785-889jt" podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.171:9696/\": dial tcp 10.217.0.171:9696: connect: connection refused"
Feb 19 21:53:10 crc kubenswrapper[4771]: E0219 21:53:10.487510 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 19 21:53:10 crc kubenswrapper[4771]: E0219 21:53:10.488500 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 19 21:53:10 crc kubenswrapper[4771]: E0219 21:53:10.489180 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 19 21:53:10 crc kubenswrapper[4771]: E0219 21:53:10.489241 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server"
Feb 19 21:53:10 crc kubenswrapper[4771]: E0219 21:53:10.490691 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 19 21:53:10 crc kubenswrapper[4771]: E0219 21:53:10.494158 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 19 21:53:10 crc kubenswrapper[4771]: E0219 21:53:10.497343 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 19 21:53:10 crc kubenswrapper[4771]: E0219 21:53:10.497392 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd"
Feb 19 21:53:11 crc kubenswrapper[4771]: I0219 21:53:11.030320 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:53:11 crc kubenswrapper[4771]: I0219 21:53:11.030724 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.205587 4771 generic.go:334] "Generic (PLEG): container finished" podID="b5526752-6549-40aa-8443-9ad3572799d2" containerID="28873d69596e9435ccc225be2172f05b9359e76cc1602b3c9c36e2902b461021" exitCode=0
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.206004 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4d6b9785-889jt" event={"ID":"b5526752-6549-40aa-8443-9ad3572799d2","Type":"ContainerDied","Data":"28873d69596e9435ccc225be2172f05b9359e76cc1602b3c9c36e2902b461021"}
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.634709 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c4d6b9785-889jt"
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.708057 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-httpd-config\") pod \"b5526752-6549-40aa-8443-9ad3572799d2\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") "
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.708189 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-ovndb-tls-certs\") pod \"b5526752-6549-40aa-8443-9ad3572799d2\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") "
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.708279 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-public-tls-certs\") pod \"b5526752-6549-40aa-8443-9ad3572799d2\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") "
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.708309 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-combined-ca-bundle\") pod \"b5526752-6549-40aa-8443-9ad3572799d2\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") "
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.708334 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks497\" (UniqueName: \"kubernetes.io/projected/b5526752-6549-40aa-8443-9ad3572799d2-kube-api-access-ks497\") pod \"b5526752-6549-40aa-8443-9ad3572799d2\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") "
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.708401 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-internal-tls-certs\") pod \"b5526752-6549-40aa-8443-9ad3572799d2\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") "
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.708445 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-config\") pod \"b5526752-6549-40aa-8443-9ad3572799d2\" (UID: \"b5526752-6549-40aa-8443-9ad3572799d2\") "
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.716961 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b5526752-6549-40aa-8443-9ad3572799d2" (UID: "b5526752-6549-40aa-8443-9ad3572799d2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.722637 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5526752-6549-40aa-8443-9ad3572799d2-kube-api-access-ks497" (OuterVolumeSpecName: "kube-api-access-ks497") pod "b5526752-6549-40aa-8443-9ad3572799d2" (UID: "b5526752-6549-40aa-8443-9ad3572799d2"). InnerVolumeSpecName "kube-api-access-ks497". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.748932 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-config" (OuterVolumeSpecName: "config") pod "b5526752-6549-40aa-8443-9ad3572799d2" (UID: "b5526752-6549-40aa-8443-9ad3572799d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.749884 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b5526752-6549-40aa-8443-9ad3572799d2" (UID: "b5526752-6549-40aa-8443-9ad3572799d2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.760859 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b5526752-6549-40aa-8443-9ad3572799d2" (UID: "b5526752-6549-40aa-8443-9ad3572799d2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.763293 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5526752-6549-40aa-8443-9ad3572799d2" (UID: "b5526752-6549-40aa-8443-9ad3572799d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.794580 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b5526752-6549-40aa-8443-9ad3572799d2" (UID: "b5526752-6549-40aa-8443-9ad3572799d2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.809880 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.809915 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.809929 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks497\" (UniqueName: \"kubernetes.io/projected/b5526752-6549-40aa-8443-9ad3572799d2-kube-api-access-ks497\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.809944 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.809959 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.809972 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.809984 4771 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5526752-6549-40aa-8443-9ad3572799d2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.956889 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.956989 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.957113 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb"
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.958222 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3de117c6f677c9430a82ad131a4b88d218c50126d4fd008ad4d29c8f59650395"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 21:53:12 crc kubenswrapper[4771]: I0219 21:53:12.958342 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://3de117c6f677c9430a82ad131a4b88d218c50126d4fd008ad4d29c8f59650395" gracePeriod=600
Feb 19 21:53:13 crc kubenswrapper[4771]: I0219 21:53:13.222661 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="3de117c6f677c9430a82ad131a4b88d218c50126d4fd008ad4d29c8f59650395" exitCode=0
Feb 19 21:53:13 crc
kubenswrapper[4771]: I0219 21:53:13.222776 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"3de117c6f677c9430a82ad131a4b88d218c50126d4fd008ad4d29c8f59650395"} Feb 19 21:53:13 crc kubenswrapper[4771]: I0219 21:53:13.222860 4771 scope.go:117] "RemoveContainer" containerID="b980742e22e0e4d74d8de52154d96047cf8f0b8f6b46b5fe1e26926aa9dcd46b" Feb 19 21:53:13 crc kubenswrapper[4771]: I0219 21:53:13.227588 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4d6b9785-889jt" event={"ID":"b5526752-6549-40aa-8443-9ad3572799d2","Type":"ContainerDied","Data":"39393119ad43783eb2cb65c6296110c5c8ca46ff51c0cb0efa1a2788ca5fdd75"} Feb 19 21:53:13 crc kubenswrapper[4771]: I0219 21:53:13.227700 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c4d6b9785-889jt" Feb 19 21:53:13 crc kubenswrapper[4771]: I0219 21:53:13.307492 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c4d6b9785-889jt"] Feb 19 21:53:13 crc kubenswrapper[4771]: I0219 21:53:13.311725 4771 scope.go:117] "RemoveContainer" containerID="89ed7fd8b6567c5530e1c7e743cf565b8674ecca14b4cf2d6f7229ab2c36478e" Feb 19 21:53:13 crc kubenswrapper[4771]: I0219 21:53:13.322296 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c4d6b9785-889jt"] Feb 19 21:53:13 crc kubenswrapper[4771]: I0219 21:53:13.365230 4771 scope.go:117] "RemoveContainer" containerID="28873d69596e9435ccc225be2172f05b9359e76cc1602b3c9c36e2902b461021" Feb 19 21:53:14 crc kubenswrapper[4771]: I0219 21:53:14.240849 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a"} Feb 19 21:53:14 crc kubenswrapper[4771]: I0219 21:53:14.445959 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5526752-6549-40aa-8443-9ad3572799d2" path="/var/lib/kubelet/pods/b5526752-6549-40aa-8443-9ad3572799d2/volumes" Feb 19 21:53:15 crc kubenswrapper[4771]: E0219 21:53:15.487655 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:15 crc kubenswrapper[4771]: E0219 21:53:15.488262 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:15 crc kubenswrapper[4771]: E0219 21:53:15.489005 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:15 crc kubenswrapper[4771]: E0219 21:53:15.489214 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server" Feb 19 21:53:15 crc kubenswrapper[4771]: E0219 21:53:15.489682 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:15 crc kubenswrapper[4771]: E0219 21:53:15.491740 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:15 crc kubenswrapper[4771]: E0219 21:53:15.493556 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:15 crc kubenswrapper[4771]: E0219 21:53:15.493654 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd" Feb 19 21:53:16 crc kubenswrapper[4771]: I0219 21:53:16.040323 4771 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:53:16 crc kubenswrapper[4771]: I0219 21:53:16.040333 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:53:20 crc kubenswrapper[4771]: E0219 21:53:20.487268 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:20 crc kubenswrapper[4771]: E0219 21:53:20.489215 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:20 crc kubenswrapper[4771]: E0219 21:53:20.489498 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:20 crc kubenswrapper[4771]: E0219 21:53:20.490183 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:20 crc kubenswrapper[4771]: E0219 21:53:20.490242 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server" Feb 19 21:53:20 crc kubenswrapper[4771]: E0219 21:53:20.490891 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:20 crc kubenswrapper[4771]: E0219 21:53:20.493412 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:20 crc kubenswrapper[4771]: E0219 21:53:20.493464 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd" Feb 19 21:53:21 crc kubenswrapper[4771]: I0219 21:53:21.049244 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:53:21 crc kubenswrapper[4771]: I0219 21:53:21.049277 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:53:25 crc kubenswrapper[4771]: E0219 21:53:25.487867 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:25 crc kubenswrapper[4771]: E0219 21:53:25.488858 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:25 crc kubenswrapper[4771]: E0219 21:53:25.489379 4771 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:53:25 crc kubenswrapper[4771]: E0219 21:53:25.489426 4771 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server" Feb 19 21:53:25 crc kubenswrapper[4771]: E0219 21:53:25.490115 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:25 crc kubenswrapper[4771]: E0219 21:53:25.491958 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:25 crc kubenswrapper[4771]: E0219 21:53:25.495157 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:53:25 crc kubenswrapper[4771]: E0219 21:53:25.495934 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-94qm4" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd" Feb 19 21:53:26 crc kubenswrapper[4771]: I0219 21:53:26.058281 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:53:26 crc kubenswrapper[4771]: I0219 21:53:26.058320 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.411532 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-94qm4_a53e8d42-c35f-4ea0-9314-f4d849908089/ovs-vswitchd/0.log" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.413136 4771 generic.go:334] "Generic (PLEG): container finished" podID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" exitCode=137 Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.413194 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-94qm4" 
event={"ID":"a53e8d42-c35f-4ea0-9314-f4d849908089","Type":"ContainerDied","Data":"daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05"} Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.422869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"50d7f0fe603dcd477e42bf6d5cd23293366abcc57115b1583128c070088d06e9"} Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.422884 4771 generic.go:334] "Generic (PLEG): container finished" podID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerID="50d7f0fe603dcd477e42bf6d5cd23293366abcc57115b1583128c070088d06e9" exitCode=137 Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.423005 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5580e95c-81dc-4c90-bb0c-9b27a4a8c971","Type":"ContainerDied","Data":"91f546d67d148f31cb9294b4c3aa8ef7221cd85a19519fec2d7f4f58bdcc573a"} Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.423039 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91f546d67d148f31cb9294b4c3aa8ef7221cd85a19519fec2d7f4f58bdcc573a" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.459723 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.578293 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2sh6\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-kube-api-access-s2sh6\") pod \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.578373 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-cache\") pod \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.578426 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") pod \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.578513 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-lock\") pod \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.578552 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.578577 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-combined-ca-bundle\") pod \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\" (UID: \"5580e95c-81dc-4c90-bb0c-9b27a4a8c971\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.580469 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-lock" (OuterVolumeSpecName: "lock") pod "5580e95c-81dc-4c90-bb0c-9b27a4a8c971" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.581161 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-cache" (OuterVolumeSpecName: "cache") pod "5580e95c-81dc-4c90-bb0c-9b27a4a8c971" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.585422 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-kube-api-access-s2sh6" (OuterVolumeSpecName: "kube-api-access-s2sh6") pod "5580e95c-81dc-4c90-bb0c-9b27a4a8c971" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971"). InnerVolumeSpecName "kube-api-access-s2sh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.585520 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5580e95c-81dc-4c90-bb0c-9b27a4a8c971" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.591604 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "5580e95c-81dc-4c90-bb0c-9b27a4a8c971" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.681305 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2sh6\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-kube-api-access-s2sh6\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.681354 4771 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-cache\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.681405 4771 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.681423 4771 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-lock\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.681500 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.717495 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 19 
21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.782946 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.896611 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-94qm4_a53e8d42-c35f-4ea0-9314-f4d849908089/ovs-vswitchd/0.log" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.897656 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.950282 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5580e95c-81dc-4c90-bb0c-9b27a4a8c971" (UID: "5580e95c-81dc-4c90-bb0c-9b27a4a8c971"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.985192 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-lib\") pod \"a53e8d42-c35f-4ea0-9314-f4d849908089\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.985359 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-lib" (OuterVolumeSpecName: "var-lib") pod "a53e8d42-c35f-4ea0-9314-f4d849908089" (UID: "a53e8d42-c35f-4ea0-9314-f4d849908089"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.985540 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e8d42-c35f-4ea0-9314-f4d849908089-scripts\") pod \"a53e8d42-c35f-4ea0-9314-f4d849908089\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.985673 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-etc-ovs\") pod \"a53e8d42-c35f-4ea0-9314-f4d849908089\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.985780 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88nf9\" (UniqueName: \"kubernetes.io/projected/a53e8d42-c35f-4ea0-9314-f4d849908089-kube-api-access-88nf9\") pod \"a53e8d42-c35f-4ea0-9314-f4d849908089\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.985900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-log\") pod \"a53e8d42-c35f-4ea0-9314-f4d849908089\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.986051 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-run\") pod \"a53e8d42-c35f-4ea0-9314-f4d849908089\" (UID: \"a53e8d42-c35f-4ea0-9314-f4d849908089\") " Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.986494 4771 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-lib\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.986577 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5580e95c-81dc-4c90-bb0c-9b27a4a8c971-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.986682 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-run" (OuterVolumeSpecName: "var-run") pod "a53e8d42-c35f-4ea0-9314-f4d849908089" (UID: "a53e8d42-c35f-4ea0-9314-f4d849908089"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.986788 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-log" (OuterVolumeSpecName: "var-log") pod "a53e8d42-c35f-4ea0-9314-f4d849908089" (UID: "a53e8d42-c35f-4ea0-9314-f4d849908089"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.986833 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "a53e8d42-c35f-4ea0-9314-f4d849908089" (UID: "a53e8d42-c35f-4ea0-9314-f4d849908089"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.987371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a53e8d42-c35f-4ea0-9314-f4d849908089-scripts" (OuterVolumeSpecName: "scripts") pod "a53e8d42-c35f-4ea0-9314-f4d849908089" (UID: "a53e8d42-c35f-4ea0-9314-f4d849908089"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:53:27 crc kubenswrapper[4771]: I0219 21:53:27.991318 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a53e8d42-c35f-4ea0-9314-f4d849908089-kube-api-access-88nf9" (OuterVolumeSpecName: "kube-api-access-88nf9") pod "a53e8d42-c35f-4ea0-9314-f4d849908089" (UID: "a53e8d42-c35f-4ea0-9314-f4d849908089"). InnerVolumeSpecName "kube-api-access-88nf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.088541 4771 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.088587 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.088607 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a53e8d42-c35f-4ea0-9314-f4d849908089-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.088623 4771 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a53e8d42-c35f-4ea0-9314-f4d849908089-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.088641 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88nf9\" (UniqueName: \"kubernetes.io/projected/a53e8d42-c35f-4ea0-9314-f4d849908089-kube-api-access-88nf9\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.438574 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-94qm4_a53e8d42-c35f-4ea0-9314-f4d849908089/ovs-vswitchd/0.log" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.440633 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-94qm4" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.440689 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.466795 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-94qm4" event={"ID":"a53e8d42-c35f-4ea0-9314-f4d849908089","Type":"ContainerDied","Data":"b468a81d180a3db9ea5135228c181090cc908a8d5b06793a384acb7c67cc1639"} Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.466874 4771 scope.go:117] "RemoveContainer" containerID="daf1adb4d74557b67abcd92bfd740c01e418b2c9daa2b0be39210541ec103e05" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.508214 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.522300 4771 scope.go:117] "RemoveContainer" containerID="e191fde29e54ce5cca022e76ba29d3c936bbf0ac161ac051aa1caf533395261f" Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.523387 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.532186 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-94qm4"] Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.542841 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-94qm4"] Feb 19 21:53:28 crc kubenswrapper[4771]: I0219 21:53:28.557358 4771 scope.go:117] "RemoveContainer" containerID="3fa5c85f92f572849f551738e161a886712697c5a2743885746274fbacd4eff4" Feb 19 21:53:29 crc kubenswrapper[4771]: I0219 
21:53:29.125967 4771 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9d59b620-a9f9-4539-98e2-a4ad4d97d442"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9d59b620-a9f9-4539-98e2-a4ad4d97d442] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9d59b620_a9f9_4539_98e2_a4ad4d97d442.slice" Feb 19 21:53:29 crc kubenswrapper[4771]: E0219 21:53:29.126053 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod9d59b620-a9f9-4539-98e2-a4ad4d97d442] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod9d59b620-a9f9-4539-98e2-a4ad4d97d442] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9d59b620_a9f9_4539_98e2_a4ad4d97d442.slice" pod="openstack/ovsdbserver-sb-0" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" Feb 19 21:53:29 crc kubenswrapper[4771]: I0219 21:53:29.455792 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:53:29 crc kubenswrapper[4771]: I0219 21:53:29.495590 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:53:29 crc kubenswrapper[4771]: I0219 21:53:29.504106 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.457154 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" path="/var/lib/kubelet/pods/5580e95c-81dc-4c90-bb0c-9b27a4a8c971/volumes" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.462214 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d59b620-a9f9-4539-98e2-a4ad4d97d442" path="/var/lib/kubelet/pods/9d59b620-a9f9-4539-98e2-a4ad4d97d442/volumes" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.463401 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" path="/var/lib/kubelet/pods/a53e8d42-c35f-4ea0-9314-f4d849908089/volumes" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.483465 4771 generic.go:334] "Generic (PLEG): container finished" podID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerID="cdce0b0c182a30ee60311ed46bbd115e0505327a1cb4095a7212d2553f1b9638" exitCode=137 Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.483560 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bf8786687-hjsp2" event={"ID":"fe42dab7-ecba-4c58-8c1d-e9489760692a","Type":"ContainerDied","Data":"cdce0b0c182a30ee60311ed46bbd115e0505327a1cb4095a7212d2553f1b9638"} Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.483605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bf8786687-hjsp2" event={"ID":"fe42dab7-ecba-4c58-8c1d-e9489760692a","Type":"ContainerDied","Data":"677e88010bf1996d567e664fa70b5a198a44a50288ec87470089047e4ac8f1ba"} 
Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.483626 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="677e88010bf1996d567e664fa70b5a198a44a50288ec87470089047e4ac8f1ba" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.517644 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-bf8786687-hjsp2" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.642799 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld47v\" (UniqueName: \"kubernetes.io/projected/fe42dab7-ecba-4c58-8c1d-e9489760692a-kube-api-access-ld47v\") pod \"fe42dab7-ecba-4c58-8c1d-e9489760692a\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.643006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe42dab7-ecba-4c58-8c1d-e9489760692a-logs\") pod \"fe42dab7-ecba-4c58-8c1d-e9489760692a\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.643121 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data\") pod \"fe42dab7-ecba-4c58-8c1d-e9489760692a\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.643160 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-combined-ca-bundle\") pod \"fe42dab7-ecba-4c58-8c1d-e9489760692a\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.643199 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data-custom\") pod \"fe42dab7-ecba-4c58-8c1d-e9489760692a\" (UID: \"fe42dab7-ecba-4c58-8c1d-e9489760692a\") " Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.643667 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe42dab7-ecba-4c58-8c1d-e9489760692a-logs" (OuterVolumeSpecName: "logs") pod "fe42dab7-ecba-4c58-8c1d-e9489760692a" (UID: "fe42dab7-ecba-4c58-8c1d-e9489760692a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.649180 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fe42dab7-ecba-4c58-8c1d-e9489760692a" (UID: "fe42dab7-ecba-4c58-8c1d-e9489760692a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.649811 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe42dab7-ecba-4c58-8c1d-e9489760692a-kube-api-access-ld47v" (OuterVolumeSpecName: "kube-api-access-ld47v") pod "fe42dab7-ecba-4c58-8c1d-e9489760692a" (UID: "fe42dab7-ecba-4c58-8c1d-e9489760692a"). InnerVolumeSpecName "kube-api-access-ld47v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.690192 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe42dab7-ecba-4c58-8c1d-e9489760692a" (UID: "fe42dab7-ecba-4c58-8c1d-e9489760692a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.748867 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.748916 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld47v\" (UniqueName: \"kubernetes.io/projected/fe42dab7-ecba-4c58-8c1d-e9489760692a-kube-api-access-ld47v\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.748932 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe42dab7-ecba-4c58-8c1d-e9489760692a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.748944 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.777200 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data" (OuterVolumeSpecName: "config-data") pod "fe42dab7-ecba-4c58-8c1d-e9489760692a" (UID: "fe42dab7-ecba-4c58-8c1d-e9489760692a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:30 crc kubenswrapper[4771]: I0219 21:53:30.850048 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe42dab7-ecba-4c58-8c1d-e9489760692a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.069117 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.069561 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.501347 4771 generic.go:334] "Generic (PLEG): container finished" podID="7a560423-c65f-4855-94f2-31e6763e0817" containerID="0fba45e8f2db0d7874562de5ae6e82ab65e3497ea143bc8a1322500183785f07" exitCode=137 Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.501465 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" event={"ID":"7a560423-c65f-4855-94f2-31e6763e0817","Type":"ContainerDied","Data":"0fba45e8f2db0d7874562de5ae6e82ab65e3497ea143bc8a1322500183785f07"} Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.501532 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-bf8786687-hjsp2" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.578479 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-bf8786687-hjsp2"] Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.581610 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": EOF" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.582154 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-676bcb667b-l4th4" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.220:9311/healthcheck\": EOF" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.596690 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-bf8786687-hjsp2"] Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.812160 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.923899 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-676bcb667b-l4th4" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.977149 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-combined-ca-bundle\") pod \"7a560423-c65f-4855-94f2-31e6763e0817\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.977223 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hws4f\" (UniqueName: \"kubernetes.io/projected/7a560423-c65f-4855-94f2-31e6763e0817-kube-api-access-hws4f\") pod \"7a560423-c65f-4855-94f2-31e6763e0817\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.977271 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a560423-c65f-4855-94f2-31e6763e0817-logs\") pod \"7a560423-c65f-4855-94f2-31e6763e0817\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.977344 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data\") pod \"7a560423-c65f-4855-94f2-31e6763e0817\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.977402 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data-custom\") pod \"7a560423-c65f-4855-94f2-31e6763e0817\" (UID: \"7a560423-c65f-4855-94f2-31e6763e0817\") " Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.977955 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7a560423-c65f-4855-94f2-31e6763e0817-logs" (OuterVolumeSpecName: "logs") pod "7a560423-c65f-4855-94f2-31e6763e0817" (UID: "7a560423-c65f-4855-94f2-31e6763e0817"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.978904 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a560423-c65f-4855-94f2-31e6763e0817-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.982282 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a560423-c65f-4855-94f2-31e6763e0817-kube-api-access-hws4f" (OuterVolumeSpecName: "kube-api-access-hws4f") pod "7a560423-c65f-4855-94f2-31e6763e0817" (UID: "7a560423-c65f-4855-94f2-31e6763e0817"). InnerVolumeSpecName "kube-api-access-hws4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:31 crc kubenswrapper[4771]: I0219 21:53:31.982321 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a560423-c65f-4855-94f2-31e6763e0817" (UID: "7a560423-c65f-4855-94f2-31e6763e0817"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.008398 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a560423-c65f-4855-94f2-31e6763e0817" (UID: "7a560423-c65f-4855-94f2-31e6763e0817"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.024312 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data" (OuterVolumeSpecName: "config-data") pod "7a560423-c65f-4855-94f2-31e6763e0817" (UID: "7a560423-c65f-4855-94f2-31e6763e0817"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.079761 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-combined-ca-bundle\") pod \"dede6da5-d7c3-45f8-9431-41ee414e21d3\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.079837 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data-custom\") pod \"dede6da5-d7c3-45f8-9431-41ee414e21d3\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.079862 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data\") pod \"dede6da5-d7c3-45f8-9431-41ee414e21d3\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.079878 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2cwj\" (UniqueName: \"kubernetes.io/projected/dede6da5-d7c3-45f8-9431-41ee414e21d3-kube-api-access-v2cwj\") pod \"dede6da5-d7c3-45f8-9431-41ee414e21d3\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.079947 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-internal-tls-certs\") pod \"dede6da5-d7c3-45f8-9431-41ee414e21d3\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.079975 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-public-tls-certs\") pod \"dede6da5-d7c3-45f8-9431-41ee414e21d3\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.080871 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dede6da5-d7c3-45f8-9431-41ee414e21d3-logs\") pod \"dede6da5-d7c3-45f8-9431-41ee414e21d3\" (UID: \"dede6da5-d7c3-45f8-9431-41ee414e21d3\") " Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.081214 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.081232 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.081242 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hws4f\" (UniqueName: \"kubernetes.io/projected/7a560423-c65f-4855-94f2-31e6763e0817-kube-api-access-hws4f\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.081251 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7a560423-c65f-4855-94f2-31e6763e0817-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.081569 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dede6da5-d7c3-45f8-9431-41ee414e21d3-logs" (OuterVolumeSpecName: "logs") pod "dede6da5-d7c3-45f8-9431-41ee414e21d3" (UID: "dede6da5-d7c3-45f8-9431-41ee414e21d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.084070 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dede6da5-d7c3-45f8-9431-41ee414e21d3" (UID: "dede6da5-d7c3-45f8-9431-41ee414e21d3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.085552 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dede6da5-d7c3-45f8-9431-41ee414e21d3-kube-api-access-v2cwj" (OuterVolumeSpecName: "kube-api-access-v2cwj") pod "dede6da5-d7c3-45f8-9431-41ee414e21d3" (UID: "dede6da5-d7c3-45f8-9431-41ee414e21d3"). InnerVolumeSpecName "kube-api-access-v2cwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.126725 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dede6da5-d7c3-45f8-9431-41ee414e21d3" (UID: "dede6da5-d7c3-45f8-9431-41ee414e21d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.133257 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data" (OuterVolumeSpecName: "config-data") pod "dede6da5-d7c3-45f8-9431-41ee414e21d3" (UID: "dede6da5-d7c3-45f8-9431-41ee414e21d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.144009 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dede6da5-d7c3-45f8-9431-41ee414e21d3" (UID: "dede6da5-d7c3-45f8-9431-41ee414e21d3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.147423 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dede6da5-d7c3-45f8-9431-41ee414e21d3" (UID: "dede6da5-d7c3-45f8-9431-41ee414e21d3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.183258 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.183311 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.183334 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.183354 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2cwj\" (UniqueName: \"kubernetes.io/projected/dede6da5-d7c3-45f8-9431-41ee414e21d3-kube-api-access-v2cwj\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.183374 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.183393 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dede6da5-d7c3-45f8-9431-41ee414e21d3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.183409 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dede6da5-d7c3-45f8-9431-41ee414e21d3-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.457158 4771 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" path="/var/lib/kubelet/pods/fe42dab7-ecba-4c58-8c1d-e9489760692a/volumes" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.514325 4771 generic.go:334] "Generic (PLEG): container finished" podID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerID="ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d" exitCode=137 Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.514419 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676bcb667b-l4th4" event={"ID":"dede6da5-d7c3-45f8-9431-41ee414e21d3","Type":"ContainerDied","Data":"ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d"} Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.514440 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-676bcb667b-l4th4" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.514477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-676bcb667b-l4th4" event={"ID":"dede6da5-d7c3-45f8-9431-41ee414e21d3","Type":"ContainerDied","Data":"ed027fc6dde39b02857ed6027b796e33b283ca89ef88b0f9f8ce1177cd3e7b4a"} Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.514503 4771 scope.go:117] "RemoveContainer" containerID="ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.519779 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" event={"ID":"7a560423-c65f-4855-94f2-31e6763e0817","Type":"ContainerDied","Data":"58226702a8394cad17c25b445adb5fce46c4251260fa56f89f284fd6f1d546fa"} Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.519922 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8569f64fdb-ctb42" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.545654 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8569f64fdb-ctb42"] Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.550466 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-8569f64fdb-ctb42"] Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.561328 4771 scope.go:117] "RemoveContainer" containerID="34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.568795 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-676bcb667b-l4th4"] Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.574401 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-676bcb667b-l4th4"] Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.589690 4771 scope.go:117] "RemoveContainer" containerID="ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d" Feb 19 21:53:32 crc kubenswrapper[4771]: E0219 21:53:32.590281 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d\": container with ID starting with ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d not found: ID does not exist" containerID="ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d" Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.590342 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d"} err="failed to get container status \"ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d\": rpc error: code = NotFound desc = could not find container 
\"ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d\": container with ID starting with ccaacb2430ff550fa9bb80c9a05c38ec774c90235db808d5d63bed0a6998f77d not found: ID does not exist"
Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.590380 4771 scope.go:117] "RemoveContainer" containerID="34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5"
Feb 19 21:53:32 crc kubenswrapper[4771]: E0219 21:53:32.590810 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5\": container with ID starting with 34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5 not found: ID does not exist" containerID="34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5"
Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.590849 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5"} err="failed to get container status \"34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5\": rpc error: code = NotFound desc = could not find container \"34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5\": container with ID starting with 34183cc76e322a3ff1f64c9a13b28e6c7358b88a1737ce9509388a57f25568c5 not found: ID does not exist"
Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.590888 4771 scope.go:117] "RemoveContainer" containerID="0fba45e8f2db0d7874562de5ae6e82ab65e3497ea143bc8a1322500183785f07"
Feb 19 21:53:32 crc kubenswrapper[4771]: I0219 21:53:32.612336 4771 scope.go:117] "RemoveContainer" containerID="b0c885cf9d242a07ffb1ee39664e6caefa677fa8a5dc413188745bc487f46952"
Feb 19 21:53:34 crc kubenswrapper[4771]: I0219 21:53:34.453872 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a560423-c65f-4855-94f2-31e6763e0817"
path="/var/lib/kubelet/pods/7a560423-c65f-4855-94f2-31e6763e0817/volumes"
Feb 19 21:53:34 crc kubenswrapper[4771]: I0219 21:53:34.456171 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" path="/var/lib/kubelet/pods/dede6da5-d7c3-45f8-9431-41ee414e21d3/volumes"
Feb 19 21:54:29 crc kubenswrapper[4771]: I0219 21:54:29.848855 4771 scope.go:117] "RemoveContainer" containerID="e632fa06b132d46b46064debfe1c40044370bb1e249759e09ae099922fe08194"
Feb 19 21:54:29 crc kubenswrapper[4771]: I0219 21:54:29.884072 4771 scope.go:117] "RemoveContainer" containerID="53da6e944f47c7e6208414ec28ec1aa3e842f3fe992ab24148a4550284b94cd8"
Feb 19 21:54:29 crc kubenswrapper[4771]: I0219 21:54:29.913800 4771 scope.go:117] "RemoveContainer" containerID="f711f4133d60e80e95d60ebf53d43892d134a7d9eb5c4126f1c2d737d2fa0c70"
Feb 19 21:54:29 crc kubenswrapper[4771]: I0219 21:54:29.949891 4771 scope.go:117] "RemoveContainer" containerID="cdda98f4356ab69d51d2afb39a09470445a49137d8bf71e2e8719ca4e104b16a"
Feb 19 21:54:29 crc kubenswrapper[4771]: I0219 21:54:29.991084 4771 scope.go:117] "RemoveContainer" containerID="d59b28bcc6dd2c89348367ae07b2b6652ecabd2e3eb7e906cb4ce9dfbdfd12d9"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.027061 4771 scope.go:117] "RemoveContainer" containerID="89041a4e9d9ebc9c0d58f88cef337d9f987ef35bcc4c225ef45ad3a183a71e6a"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.057668 4771 scope.go:117] "RemoveContainer" containerID="eb415e751672ce67a387f56913bb8398561b2f551eec4c660857fbfeb0059682"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.085883 4771 scope.go:117] "RemoveContainer" containerID="25638182c5db50cf9a8ae3158a1f97e75098b4e1dd04c4bf5390d848f2cf8116"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.124653 4771 scope.go:117] "RemoveContainer" containerID="8cca09843dd3644e93ed1e8cc9ecc581a4f9dd9ee78bc9019acbe6a06ad7313a"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.162638
4771 scope.go:117] "RemoveContainer" containerID="50d7f0fe603dcd477e42bf6d5cd23293366abcc57115b1583128c070088d06e9"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.204066 4771 scope.go:117] "RemoveContainer" containerID="b644a611fb45629cd8753f4162c8679d62729f768396dc3a438befa376dfbcc8"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.250223 4771 scope.go:117] "RemoveContainer" containerID="5c65d1c446aced68f3ad290c01c85b2c1b2009f812ff2b51fc255014187d7cd9"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.292366 4771 scope.go:117] "RemoveContainer" containerID="a6a3ea0a05c4cc921bbd48b56d16a1efbe4f22bdc52aaa4b06df43fbc2c61851"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.313057 4771 scope.go:117] "RemoveContainer" containerID="b55b4a1bb4d3ccc6512f177b1e2310a9648853f7b9b1181b71f4d9e34527c221"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.330919 4771 scope.go:117] "RemoveContainer" containerID="3c74c9c0e0a6ff63a98ee8134cab58ee1ebb426589db225fc216652b578d5a5e"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.350299 4771 scope.go:117] "RemoveContainer" containerID="ad68261083ec7a599ecf4fd0bb7914ee14b7d216c3177a543f672928b2ec7a8a"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.390850 4771 scope.go:117] "RemoveContainer" containerID="179eaa3bb6001bb07dcf706e87adb665dfcbf4bf3d1b7de103517856791e3149"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.413268 4771 scope.go:117] "RemoveContainer" containerID="4473d10264a5ab1810bbb85934a6cba54c8cf4d08a9c56acd1cb434e928da82e"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.440811 4771 scope.go:117] "RemoveContainer" containerID="f2531aed0329bf883f8ab6b773f55605090d3db1851e36c0cc40d24deefd3f69"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.465734 4771 scope.go:117] "RemoveContainer" containerID="793312b4189c234b06ff32b07e8e0d5dfc74266fc0a1de9300ef68cf6768b6c2"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.488482 4771 scope.go:117]
"RemoveContainer" containerID="3e4e004099ba936950340e5e238183d77b40a39f4a97474d8ddb30dfaefb94a9"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.519471 4771 scope.go:117] "RemoveContainer" containerID="14d0c01548938f03e42260d3bf1044d84ba51bfc1bc8242f357fa6c92a03e723"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.557798 4771 scope.go:117] "RemoveContainer" containerID="8bb34e090626c241dae7a24235616f7cf7f7c5d4a8bd8d21f038334a18fdd9f1"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.588320 4771 scope.go:117] "RemoveContainer" containerID="714fd91309191f6caf2b8f7b6db8af29e9a164f11470713344c8212bb188178d"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.616335 4771 scope.go:117] "RemoveContainer" containerID="0d8112d0d11bb87f970617cd7d38234c852cb777a0ea50e9330bd9eec7a4c5b5"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.647678 4771 scope.go:117] "RemoveContainer" containerID="dbdb1ec746d8ae8c73b0b46b35a801715fcbe6f5e2ae5147a3b70af6f2352619"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.674921 4771 scope.go:117] "RemoveContainer" containerID="a48ec811c0a5e4d47be5eb264a30dddf34cda4b99f616321d34ff066b2ee7c3e"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.713458 4771 scope.go:117] "RemoveContainer" containerID="4e14f018cde901d4da996f236082091e2c86ad393c05bc554a9a701ee38321a3"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.736367 4771 scope.go:117] "RemoveContainer" containerID="4cd6b0ceee64a599a8e1a3fcddded6e00c86fe15dc3839c5337074fb5721bc3e"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.790823 4771 scope.go:117] "RemoveContainer" containerID="56daf9ee26ebf4c3d4c6900b71838aba932c5f271a2526fe5d21871ad34fdb76"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.817057 4771 scope.go:117] "RemoveContainer" containerID="37b58896182f7320a318dedbc51a682c252143cf1f5f7dc1cc5a4d0d41fd6a9e"
Feb 19 21:54:30 crc kubenswrapper[4771]: I0219 21:54:30.843339 4771 scope.go:117] "RemoveContainer"
containerID="989780e22c1c2d03db82d8a1c9b2f4054eb65906251a7cbf79134c1b65f185a6"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.222804 4771 scope.go:117] "RemoveContainer" containerID="ea658d77089dbfc69308dbad9dd2e54d36caee337d295277c26f16fb2c64e832"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.260105 4771 scope.go:117] "RemoveContainer" containerID="7e29e380a4d40a5bef1636c27f62822a8ea13ed037c70a4a10bf1a1c0caab247"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.310901 4771 scope.go:117] "RemoveContainer" containerID="1adfb4b311f6a1160416ef7fc218352891df01330cf223d05202cbfbf7986ce3"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.348197 4771 scope.go:117] "RemoveContainer" containerID="0c95bf2af364f9fea1a07bb5b1eeda40dc6f61250043572e43473c8b0808ba86"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.402045 4771 scope.go:117] "RemoveContainer" containerID="0a967eeaba3d4bc56afe66a0da1bc669b7ba8837bde6edc23ce8b5bdde09949c"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.446936 4771 scope.go:117] "RemoveContainer" containerID="1f6090de362c26def6021ffae1e120ef61588dde212f93dc1d10af3cca3c778c"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.486654 4771 scope.go:117] "RemoveContainer" containerID="bee7c802e137a339145463341df9445a9986d274b0dbab724570fb327145cf73"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.516773 4771 scope.go:117] "RemoveContainer" containerID="59e8690151de99e0e2f8e5b6aa921e4b3e247c672687ea7a8b2e61fa6b3f510b"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.542104 4771 scope.go:117] "RemoveContainer" containerID="34d955f6ce785ca5102ed3d08369096426c73428d7115ae6d11a102e42fd4390"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.569414 4771 scope.go:117] "RemoveContainer" containerID="8762cac2f6ef78687a4af6d497fb280fa7b766f18eed1d002187a46fd63354ec"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.613508 4771 scope.go:117] "RemoveContainer"
containerID="b2899341ffff7f0d7884e652ee1901e056f8cf1f617ef5cdc34eee16e4731b80"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.658598 4771 scope.go:117] "RemoveContainer" containerID="c39227b1f1a133079941597205bef527ddb978dee2a720fe0152b89dacaddbf8"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.688005 4771 scope.go:117] "RemoveContainer" containerID="6be2db0d0a7a15ac9e0f112592f3037222ad978893674b6b671e66899fbaf58b"
Feb 19 21:55:31 crc kubenswrapper[4771]: I0219 21:55:31.740369 4771 scope.go:117] "RemoveContainer" containerID="26444679d7cec06fdc9c103258262da0f05d898c62474cf9b60ff923c1258255"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066406 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qj46z"]
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066697 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066711 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066728 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-metadata"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066736 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-metadata"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066766 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerName="mysql-bootstrap"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066774 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerName="mysql-bootstrap"
Feb 19 21:55:32
crc kubenswrapper[4771]: E0219 21:55:32.066785 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4938b4a1-32f8-4e7a-b334-8ce3b649fb46" containerName="nova-cell0-conductor-conductor"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066793 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4938b4a1-32f8-4e7a-b334-8ce3b649fb46" containerName="nova-cell0-conductor-conductor"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066802 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b605459-c654-463d-b66c-ec804185ea7d" containerName="keystone-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066810 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b605459-c654-463d-b66c-ec804185ea7d" containerName="keystone-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066822 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" containerName="barbican-keystone-listener-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066830 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" containerName="barbican-keystone-listener-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066841 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066850 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066866 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-reaper"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066874 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971"
containerName="account-reaper"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066886 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-server"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066894 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-server"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066904 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c230c6-1af8-440e-806d-b3b1e98544c0" containerName="kube-state-metrics"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066911 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c230c6-1af8-440e-806d-b3b1e98544c0" containerName="kube-state-metrics"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066927 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-updater"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066935 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-updater"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066944 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="rsync"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066951 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="rsync"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066966 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a560423-c65f-4855-94f2-31e6763e0817" containerName="barbican-keystone-listener"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066973 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a560423-c65f-4855-94f2-31e6763e0817"
containerName="barbican-keystone-listener"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.066982 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" containerName="mariadb-account-create-update"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.066990 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" containerName="mariadb-account-create-update"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067002 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c812f8b-6ad1-4873-8999-e649acd07d91" containerName="memcached"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067010 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c812f8b-6ad1-4873-8999-e649acd07d91" containerName="memcached"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067038 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerName="glance-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067047 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerName="glance-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067058 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="ceilometer-central-agent"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067066 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="ceilometer-central-agent"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067077 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-auditor"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067086 4771 state_mem.go:107] "Deleted CPUSet assignment"
podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-auditor"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067095 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="openstack-network-exporter"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067103 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="openstack-network-exporter"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067118 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-server"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067126 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-server"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067138 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-replicator"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067146 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-replicator"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067154 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" containerName="galera"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067162 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" containerName="galera"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067172 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067180 4771 state_mem.go:107] "Deleted CPUSet assignment"
podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067190 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067198 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067210 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerName="galera"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067219 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerName="galera"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067233 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-auditor"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067241 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-auditor"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067256 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067263 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067272 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680c2961-e33c-44b1-aadd-37556bf4839c" containerName="placement-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067279 4771 state_mem.go:107] "Deleted CPUSet assignment"
podUID="680c2961-e33c-44b1-aadd-37556bf4839c" containerName="placement-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067293 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067301 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067310 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerName="glance-httpd"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067318 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerName="glance-httpd"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067329 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-server"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067337 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-server"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067348 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" containerName="glance-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067355 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" containerName="glance-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067366 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067374 4771 state_mem.go:107] "Deleted CPUSet assignment"
podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067385 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerName="rabbitmq"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067393 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerName="rabbitmq"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067402 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-expirer"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067410 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-expirer"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067420 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerName="barbican-worker-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067441 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerName="barbican-worker-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067454 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="ceilometer-notification-agent"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067463 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="ceilometer-notification-agent"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067477 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067485 4771 state_mem.go:107] "Deleted CPUSet assignment"
podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067495 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-updater"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067502 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-updater"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067514 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerName="setup-container"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067522 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerName="setup-container"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067535 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56d3128-877b-4d8d-a48f-42a7b83e9347" containerName="nova-cell1-conductor-conductor"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067542 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56d3128-877b-4d8d-a48f-42a7b83e9347" containerName="nova-cell1-conductor-conductor"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067557 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-auditor"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067565 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-auditor"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067576 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="swift-recon-cron"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067585 4771 state_mem.go:107] "Deleted CPUSet
assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="swift-recon-cron"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067596 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="ovn-northd"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067603 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="ovn-northd"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067616 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server-init"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067625 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server-init"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067635 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerName="setup-container"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067643 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerName="setup-container"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067655 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a560423-c65f-4855-94f2-31e6763e0817" containerName="barbican-keystone-listener-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067664 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a560423-c65f-4855-94f2-31e6763e0817" containerName="barbican-keystone-listener-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067677 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" containerName="glance-httpd"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067684 4771 state_mem.go:107] "Deleted
CPUSet assignment" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" containerName="glance-httpd"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067695 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067702 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067717 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" containerName="barbican-keystone-listener"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067724 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" containerName="barbican-keystone-listener"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067735 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680c2961-e33c-44b1-aadd-37556bf4839c" containerName="placement-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067742 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="680c2961-e33c-44b1-aadd-37556bf4839c" containerName="placement-log"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067752 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-httpd"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067759 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-httpd"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067773 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerName="barbican-worker"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067781 4771 state_mem.go:107] "Deleted
CPUSet assignment" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerName="barbican-worker"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067793 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-replicator"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067801 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-replicator"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067813 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bf8c7c-234c-47ea-86aa-4b1313f7b983" containerName="nova-scheduler-scheduler"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067822 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bf8c7c-234c-47ea-86aa-4b1313f7b983" containerName="nova-scheduler-scheduler"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067836 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067844 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067856 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-replicator"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067865 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-replicator"
Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067874 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd"
Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067882 4771 state_mem.go:107]
"Deleted CPUSet assignment" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd" Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067892 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067899 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api-log" Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067911 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="sg-core" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067919 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="sg-core" Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067928 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerName="barbican-worker-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067936 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerName="barbican-worker-log" Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067948 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerName="barbican-worker" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067956 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerName="barbican-worker" Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067965 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" containerName="mysql-bootstrap" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067974 4771 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" containerName="mysql-bootstrap" Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.067986 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.067994 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api-log" Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.068003 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerName="rabbitmq" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068010 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerName="rabbitmq" Feb 19 21:55:32 crc kubenswrapper[4771]: E0219 21:55:32.068054 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="proxy-httpd" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068062 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="proxy-httpd" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068211 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c671dcf6-b1eb-4c4e-ba71-ae115ce811da" containerName="rabbitmq" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068224 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="ceilometer-central-agent" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068234 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068245 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068255 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="680c2961-e33c-44b1-aadd-37556bf4839c" containerName="placement-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068266 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="ovn-northd" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068281 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-auditor" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068294 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerName="glance-httpd" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068306 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bf8c7c-234c-47ea-86aa-4b1313f7b983" containerName="nova-scheduler-scheduler" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068315 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b605459-c654-463d-b66c-ec804185ea7d" containerName="keystone-api" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068323 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-replicator" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068334 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="ceilometer-notification-agent" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068348 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4368f372-87b3-4f85-95f9-72c2046b0cc7" containerName="galera" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068362 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-server" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068370 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="swift-recon-cron" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068379 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068392 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerName="barbican-worker" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068403 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-auditor" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068415 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="17cd62c3-f3af-4144-b6b1-ec2cafb424ad" containerName="rabbitmq" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068429 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-updater" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068438 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-server" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068447 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b8bc52-11f9-4adf-8ba3-fbe39197b5aa" containerName="openstack-network-exporter" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068460 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovsdb-server" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 
21:55:32.068470 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-expirer" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068478 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="object-updater" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068486 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6a922b-a8fd-4f68-95e9-398e5a38bc6e" containerName="cinder-api-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068497 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53e8d42-c35f-4ea0-9314-f4d849908089" containerName="ovs-vswitchd" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068508 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerName="barbican-worker" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068522 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="rsync" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068529 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" containerName="mariadb-account-create-update" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068541 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b36b1b4-e091-4032-90aa-7f983b5c4b4f" containerName="barbican-worker-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068554 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" containerName="barbican-keystone-listener-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068567 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="sg-core" Feb 19 
21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068575 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe42dab7-ecba-4c58-8c1d-e9489760692a" containerName="barbican-worker-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068588 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" containerName="mariadb-account-create-update" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068602 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbb34081-bc3a-451a-93c5-c28299467781" containerName="nova-metadata-metadata" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068612 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068623 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c230c6-1af8-440e-806d-b3b1e98544c0" containerName="kube-state-metrics" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068637 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="680c2961-e33c-44b1-aadd-37556bf4839c" containerName="placement-api" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068645 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa7dd4a-4295-4120-bff7-fc2c82f76aed" containerName="galera" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068655 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4938b4a1-32f8-4e7a-b334-8ce3b649fb46" containerName="nova-cell0-conductor-conductor" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068665 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" containerName="glance-httpd" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068679 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-auditor" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068688 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b64cc52a-4f20-4e05-b444-c46f97727527" containerName="glance-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068699 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-api" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068706 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="306067f7-88de-4cdb-8ca2-3540ada9b006" containerName="glance-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068715 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-server" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068723 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5526752-6549-40aa-8443-9ad3572799d2" containerName="neutron-httpd" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068732 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c812f8b-6ad1-4873-8999-e649acd07d91" containerName="memcached" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068744 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="container-replicator" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068754 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56d3128-877b-4d8d-a48f-42a7b83e9347" containerName="nova-cell1-conductor-conductor" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068764 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-api" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068775 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7a560423-c65f-4855-94f2-31e6763e0817" containerName="barbican-keystone-listener" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068783 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dede6da5-d7c3-45f8-9431-41ee414e21d3" containerName="barbican-api" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068795 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d016daf3-054a-4914-8711-fc82edab9f88" containerName="ovn-controller" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068802 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5f8191-842e-4a39-ac28-b0042c51f813" containerName="barbican-api-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068813 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a560423-c65f-4855-94f2-31e6763e0817" containerName="barbican-keystone-listener-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068821 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="63798ee6-b629-437f-9e15-5bd46b79894e" containerName="nova-api-log" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068832 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6733d997-54a6-4c46-8692-db99d5a20a9e" containerName="proxy-httpd" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068844 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a201b832-5d71-4947-98ba-02adf91bccd5" containerName="barbican-keystone-listener" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068854 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-reaper" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.068862 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5580e95c-81dc-4c90-bb0c-9b27a4a8c971" containerName="account-replicator" Feb 19 21:55:32 crc 
kubenswrapper[4771]: E0219 21:55:32.069041 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" containerName="mariadb-account-create-update" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.069051 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4401f3-2019-40a6-8829-5e22598f176a" containerName="mariadb-account-create-update" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.079216 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.109371 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qj46z"] Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.126261 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-catalog-content\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.126378 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-utilities\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.126397 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhhw4\" (UniqueName: \"kubernetes.io/projected/bdefe92f-6620-4d2c-b07b-24d83218c9a5-kube-api-access-jhhw4\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 
19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.228194 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-utilities\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.228241 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhhw4\" (UniqueName: \"kubernetes.io/projected/bdefe92f-6620-4d2c-b07b-24d83218c9a5-kube-api-access-jhhw4\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.228294 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-catalog-content\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.228726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-utilities\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.228757 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-catalog-content\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.248169 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhhw4\" (UniqueName: \"kubernetes.io/projected/bdefe92f-6620-4d2c-b07b-24d83218c9a5-kube-api-access-jhhw4\") pod \"redhat-operators-qj46z\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.401070 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:32 crc kubenswrapper[4771]: I0219 21:55:32.841377 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qj46z"] Feb 19 21:55:33 crc kubenswrapper[4771]: I0219 21:55:33.635209 4771 generic.go:334] "Generic (PLEG): container finished" podID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerID="f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34" exitCode=0 Feb 19 21:55:33 crc kubenswrapper[4771]: I0219 21:55:33.635251 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qj46z" event={"ID":"bdefe92f-6620-4d2c-b07b-24d83218c9a5","Type":"ContainerDied","Data":"f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34"} Feb 19 21:55:33 crc kubenswrapper[4771]: I0219 21:55:33.635277 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qj46z" event={"ID":"bdefe92f-6620-4d2c-b07b-24d83218c9a5","Type":"ContainerStarted","Data":"7f3a5b16c79254132c5e7c6ac19e7d1b5e613f2068dac0fec945684e8a567853"} Feb 19 21:55:34 crc kubenswrapper[4771]: I0219 21:55:34.646128 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qj46z" event={"ID":"bdefe92f-6620-4d2c-b07b-24d83218c9a5","Type":"ContainerStarted","Data":"572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063"} Feb 19 21:55:35 crc kubenswrapper[4771]: I0219 21:55:35.657999 4771 generic.go:334] "Generic 
(PLEG): container finished" podID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerID="572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063" exitCode=0 Feb 19 21:55:35 crc kubenswrapper[4771]: I0219 21:55:35.658101 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qj46z" event={"ID":"bdefe92f-6620-4d2c-b07b-24d83218c9a5","Type":"ContainerDied","Data":"572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063"} Feb 19 21:55:36 crc kubenswrapper[4771]: I0219 21:55:36.666230 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qj46z" event={"ID":"bdefe92f-6620-4d2c-b07b-24d83218c9a5","Type":"ContainerStarted","Data":"8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0"} Feb 19 21:55:36 crc kubenswrapper[4771]: I0219 21:55:36.691706 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qj46z" podStartSLOduration=2.252072012 podStartE2EDuration="4.691687749s" podCreationTimestamp="2026-02-19 21:55:32 +0000 UTC" firstStartedPulling="2026-02-19 21:55:33.637065683 +0000 UTC m=+1633.908508153" lastFinishedPulling="2026-02-19 21:55:36.07668138 +0000 UTC m=+1636.348123890" observedRunningTime="2026-02-19 21:55:36.688514825 +0000 UTC m=+1636.959957305" watchObservedRunningTime="2026-02-19 21:55:36.691687749 +0000 UTC m=+1636.963130229" Feb 19 21:55:42 crc kubenswrapper[4771]: I0219 21:55:42.401984 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:42 crc kubenswrapper[4771]: I0219 21:55:42.402540 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:42 crc kubenswrapper[4771]: I0219 21:55:42.957121 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:55:42 crc kubenswrapper[4771]: I0219 21:55:42.957207 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:55:43 crc kubenswrapper[4771]: I0219 21:55:43.477358 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qj46z" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="registry-server" probeResult="failure" output=< Feb 19 21:55:43 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 21:55:43 crc kubenswrapper[4771]: > Feb 19 21:55:52 crc kubenswrapper[4771]: I0219 21:55:52.479579 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:52 crc kubenswrapper[4771]: I0219 21:55:52.556885 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:52 crc kubenswrapper[4771]: I0219 21:55:52.731992 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qj46z"] Feb 19 21:55:53 crc kubenswrapper[4771]: I0219 21:55:53.825197 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qj46z" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="registry-server" containerID="cri-o://8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0" gracePeriod=2 Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.332195 4771 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.473587 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-catalog-content\") pod \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.473658 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhhw4\" (UniqueName: \"kubernetes.io/projected/bdefe92f-6620-4d2c-b07b-24d83218c9a5-kube-api-access-jhhw4\") pod \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.473687 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-utilities\") pod \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\" (UID: \"bdefe92f-6620-4d2c-b07b-24d83218c9a5\") " Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.475539 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-utilities" (OuterVolumeSpecName: "utilities") pod "bdefe92f-6620-4d2c-b07b-24d83218c9a5" (UID: "bdefe92f-6620-4d2c-b07b-24d83218c9a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.484823 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdefe92f-6620-4d2c-b07b-24d83218c9a5-kube-api-access-jhhw4" (OuterVolumeSpecName: "kube-api-access-jhhw4") pod "bdefe92f-6620-4d2c-b07b-24d83218c9a5" (UID: "bdefe92f-6620-4d2c-b07b-24d83218c9a5"). 
InnerVolumeSpecName "kube-api-access-jhhw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.577385 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhhw4\" (UniqueName: \"kubernetes.io/projected/bdefe92f-6620-4d2c-b07b-24d83218c9a5-kube-api-access-jhhw4\") on node \"crc\" DevicePath \"\"" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.577708 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.642609 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdefe92f-6620-4d2c-b07b-24d83218c9a5" (UID: "bdefe92f-6620-4d2c-b07b-24d83218c9a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.680232 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdefe92f-6620-4d2c-b07b-24d83218c9a5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.840740 4771 generic.go:334] "Generic (PLEG): container finished" podID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerID="8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0" exitCode=0 Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.840800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qj46z" event={"ID":"bdefe92f-6620-4d2c-b07b-24d83218c9a5","Type":"ContainerDied","Data":"8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0"} Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.840858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qj46z" event={"ID":"bdefe92f-6620-4d2c-b07b-24d83218c9a5","Type":"ContainerDied","Data":"7f3a5b16c79254132c5e7c6ac19e7d1b5e613f2068dac0fec945684e8a567853"} Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.840854 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qj46z" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.840882 4771 scope.go:117] "RemoveContainer" containerID="8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.874011 4771 scope.go:117] "RemoveContainer" containerID="572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.907157 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qj46z"] Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.913828 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qj46z"] Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.921632 4771 scope.go:117] "RemoveContainer" containerID="f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.943822 4771 scope.go:117] "RemoveContainer" containerID="8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0" Feb 19 21:55:54 crc kubenswrapper[4771]: E0219 21:55:54.944458 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0\": container with ID starting with 8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0 not found: ID does not exist" containerID="8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.944501 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0"} err="failed to get container status \"8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0\": rpc error: code = NotFound desc = could not find container 
\"8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0\": container with ID starting with 8ae3ccd0f4c2ccf24fa324e594ca6449e23025480d5ad830de2f2288d10fd5e0 not found: ID does not exist" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.944527 4771 scope.go:117] "RemoveContainer" containerID="572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063" Feb 19 21:55:54 crc kubenswrapper[4771]: E0219 21:55:54.945043 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063\": container with ID starting with 572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063 not found: ID does not exist" containerID="572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.945083 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063"} err="failed to get container status \"572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063\": rpc error: code = NotFound desc = could not find container \"572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063\": container with ID starting with 572069a8dd4b7a97b17c1c850a055efb8bd9c4f63838be9c289e9e1afe7de063 not found: ID does not exist" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.945107 4771 scope.go:117] "RemoveContainer" containerID="f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34" Feb 19 21:55:54 crc kubenswrapper[4771]: E0219 21:55:54.945610 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34\": container with ID starting with f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34 not found: ID does not exist" 
containerID="f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34" Feb 19 21:55:54 crc kubenswrapper[4771]: I0219 21:55:54.945646 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34"} err="failed to get container status \"f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34\": rpc error: code = NotFound desc = could not find container \"f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34\": container with ID starting with f9070d8cb71e4fe1706d0e37e4a4d0a3b387f220c8cd2544c881b8bc28cf3a34 not found: ID does not exist" Feb 19 21:55:56 crc kubenswrapper[4771]: I0219 21:55:56.449943 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" path="/var/lib/kubelet/pods/bdefe92f-6620-4d2c-b07b-24d83218c9a5/volumes" Feb 19 21:56:12 crc kubenswrapper[4771]: I0219 21:56:12.956919 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:56:12 crc kubenswrapper[4771]: I0219 21:56:12.957696 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.001891 4771 scope.go:117] "RemoveContainer" containerID="894820fb1c0c12b28c969279bf97be7c33aace2ae97a9680116cae04f216d423" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.030770 4771 scope.go:117] "RemoveContainer" 
containerID="70ccbec8c9a37573cbcbf07ef46b413724db3b6ad9381cf0ff63195ae6ffb18a" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.083511 4771 scope.go:117] "RemoveContainer" containerID="6c7e9318c5ddd5d08bdec6168a67f0d518ff8348344d2f64f96c844314ef2153" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.106146 4771 scope.go:117] "RemoveContainer" containerID="40144a8e451dbe1a572950c52be0b02a654d805e4c7b97aeb0e77cd2fef2d350" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.143651 4771 scope.go:117] "RemoveContainer" containerID="91cf6aa4394a77413c6ac8c75e4e0d70166f76d1cea8bfe757e7914bcec5a944" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.164346 4771 scope.go:117] "RemoveContainer" containerID="050c7646bddc86dead29a1986df826ed7afb73f99f9161f47f854552f6cce761" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.196415 4771 scope.go:117] "RemoveContainer" containerID="71cc7d9442ba94c1444da4fb61fedc176b18a5f13ddee418fad83e7fc53e74bd" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.214673 4771 scope.go:117] "RemoveContainer" containerID="6459c1d0eae8011e51f373038f7426ac5702af28aa9315bf859efd56d18e4be8" Feb 19 21:56:32 crc kubenswrapper[4771]: I0219 21:56:32.247044 4771 scope.go:117] "RemoveContainer" containerID="82509bf1ab50929ddc715d7c46a7dac697c4c007fec020d2a38c870dfe0d48c8" Feb 19 21:56:42 crc kubenswrapper[4771]: I0219 21:56:42.957147 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:56:42 crc kubenswrapper[4771]: I0219 21:56:42.957852 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:56:42 crc kubenswrapper[4771]: I0219 21:56:42.957923 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 21:56:42 crc kubenswrapper[4771]: I0219 21:56:42.958779 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:56:42 crc kubenswrapper[4771]: I0219 21:56:42.958875 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" gracePeriod=600 Feb 19 21:56:43 crc kubenswrapper[4771]: E0219 21:56:43.113823 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:56:43 crc kubenswrapper[4771]: I0219 21:56:43.363295 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" exitCode=0 Feb 19 21:56:43 crc kubenswrapper[4771]: I0219 21:56:43.363339 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a"} Feb 19 21:56:43 crc kubenswrapper[4771]: I0219 21:56:43.363374 4771 scope.go:117] "RemoveContainer" containerID="3de117c6f677c9430a82ad131a4b88d218c50126d4fd008ad4d29c8f59650395" Feb 19 21:56:43 crc kubenswrapper[4771]: I0219 21:56:43.364072 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:56:43 crc kubenswrapper[4771]: E0219 21:56:43.364416 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:56:58 crc kubenswrapper[4771]: I0219 21:56:58.437542 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:56:58 crc kubenswrapper[4771]: E0219 21:56:58.438438 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:57:12 crc kubenswrapper[4771]: I0219 21:57:12.438232 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:57:12 crc kubenswrapper[4771]: E0219 21:57:12.439371 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:57:23 crc kubenswrapper[4771]: I0219 21:57:23.437722 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:57:23 crc kubenswrapper[4771]: E0219 21:57:23.438623 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:57:32 crc kubenswrapper[4771]: I0219 21:57:32.421486 4771 scope.go:117] "RemoveContainer" containerID="b8ca1cd5e8fb62c9444684cc61027938add1231dfcf2c180029a45a58ab67818" Feb 19 21:57:32 crc kubenswrapper[4771]: I0219 21:57:32.487586 4771 scope.go:117] "RemoveContainer" containerID="7259b0900abc0f8f057c7e0cc3ae36201abb5c977bdf9917bb917adf06c8bd91" Feb 19 21:57:32 crc kubenswrapper[4771]: I0219 21:57:32.515855 4771 scope.go:117] "RemoveContainer" containerID="572dcb283fd1c97c3eac3d2d950a2cb097579be9167f01910d217a90b523c5b4" Feb 19 21:57:32 crc kubenswrapper[4771]: I0219 21:57:32.560263 4771 scope.go:117] "RemoveContainer" containerID="57fb31ab711e5d13d400bec76bdcbd0c4341b1fc0d21f6f6600f2084488fa088" Feb 19 21:57:34 crc kubenswrapper[4771]: I0219 21:57:34.437568 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 
21:57:34 crc kubenswrapper[4771]: E0219 21:57:34.437994 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:57:49 crc kubenswrapper[4771]: I0219 21:57:49.437684 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:57:49 crc kubenswrapper[4771]: E0219 21:57:49.438861 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:58:03 crc kubenswrapper[4771]: I0219 21:58:03.437685 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:58:03 crc kubenswrapper[4771]: E0219 21:58:03.438582 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:58:17 crc kubenswrapper[4771]: I0219 21:58:17.437419 4771 scope.go:117] "RemoveContainer" 
containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:58:17 crc kubenswrapper[4771]: E0219 21:58:17.438579 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:58:29 crc kubenswrapper[4771]: I0219 21:58:29.437165 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:58:29 crc kubenswrapper[4771]: E0219 21:58:29.438416 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:58:32 crc kubenswrapper[4771]: I0219 21:58:32.651510 4771 scope.go:117] "RemoveContainer" containerID="4a5aa26a299aa0d998acf896b457654e74cd80a622f33a8d72e1c3135f3d7b1e" Feb 19 21:58:43 crc kubenswrapper[4771]: I0219 21:58:43.437528 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:58:43 crc kubenswrapper[4771]: E0219 21:58:43.438227 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:58:56 crc kubenswrapper[4771]: I0219 21:58:56.437864 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:58:56 crc kubenswrapper[4771]: E0219 21:58:56.439082 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:59:08 crc kubenswrapper[4771]: I0219 21:59:08.437848 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:59:08 crc kubenswrapper[4771]: E0219 21:59:08.438856 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:59:19 crc kubenswrapper[4771]: I0219 21:59:19.436998 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 21:59:19 crc kubenswrapper[4771]: E0219 21:59:19.437972 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.780740 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hlnmp"] Feb 19 21:59:25 crc kubenswrapper[4771]: E0219 21:59:25.782375 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="extract-utilities" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.782406 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="extract-utilities" Feb 19 21:59:25 crc kubenswrapper[4771]: E0219 21:59:25.782453 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="extract-content" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.782470 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="extract-content" Feb 19 21:59:25 crc kubenswrapper[4771]: E0219 21:59:25.782505 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="registry-server" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.782521 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="registry-server" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.782826 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdefe92f-6620-4d2c-b07b-24d83218c9a5" containerName="registry-server" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.788153 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.811683 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hlnmp"] Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.862221 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-utilities\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.862299 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9js6\" (UniqueName: \"kubernetes.io/projected/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-kube-api-access-f9js6\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.862358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-catalog-content\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.965911 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-catalog-content\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.966124 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-utilities\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.966174 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9js6\" (UniqueName: \"kubernetes.io/projected/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-kube-api-access-f9js6\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.966741 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-catalog-content\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.966789 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gvnv2"] Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.967149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-utilities\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:25 crc kubenswrapper[4771]: I0219 21:59:25.968965 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.016718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9js6\" (UniqueName: \"kubernetes.io/projected/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-kube-api-access-f9js6\") pod \"certified-operators-hlnmp\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") " pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.036847 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvnv2"] Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.067637 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxsfs\" (UniqueName: \"kubernetes.io/projected/9b175cbb-4729-4bfd-a0e1-11d57c905286-kube-api-access-vxsfs\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.067698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-utilities\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.067747 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-catalog-content\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.129578 4771 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-hlnmp" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.169247 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxsfs\" (UniqueName: \"kubernetes.io/projected/9b175cbb-4729-4bfd-a0e1-11d57c905286-kube-api-access-vxsfs\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.169299 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-utilities\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.169362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-catalog-content\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.169840 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-catalog-content\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.170899 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-utilities\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " 
pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.191213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxsfs\" (UniqueName: \"kubernetes.io/projected/9b175cbb-4729-4bfd-a0e1-11d57c905286-kube-api-access-vxsfs\") pod \"redhat-marketplace-gvnv2\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") " pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.356310 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvnv2" Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.583496 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hlnmp"] Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.656970 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvnv2"] Feb 19 21:59:26 crc kubenswrapper[4771]: W0219 21:59:26.669207 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b175cbb_4729_4bfd_a0e1_11d57c905286.slice/crio-5b948889a04968cb9ce177d433686dec113df535eb8adf65020e928010730035 WatchSource:0}: Error finding container 5b948889a04968cb9ce177d433686dec113df535eb8adf65020e928010730035: Status 404 returned error can't find the container with id 5b948889a04968cb9ce177d433686dec113df535eb8adf65020e928010730035 Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.906907 4771 generic.go:334] "Generic (PLEG): container finished" podID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerID="a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32" exitCode=0 Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.907261 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlnmp" 
event={"ID":"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2","Type":"ContainerDied","Data":"a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32"} Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.907315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlnmp" event={"ID":"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2","Type":"ContainerStarted","Data":"dad962b41ff42b1ff97a2cbcc6d07c68e52c36da94c26a51d9eeb80e5075f37b"} Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.908660 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.909120 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerID="ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6" exitCode=0 Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.909154 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvnv2" event={"ID":"9b175cbb-4729-4bfd-a0e1-11d57c905286","Type":"ContainerDied","Data":"ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6"} Feb 19 21:59:26 crc kubenswrapper[4771]: I0219 21:59:26.909195 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvnv2" event={"ID":"9b175cbb-4729-4bfd-a0e1-11d57c905286","Type":"ContainerStarted","Data":"5b948889a04968cb9ce177d433686dec113df535eb8adf65020e928010730035"} Feb 19 21:59:27 crc kubenswrapper[4771]: I0219 21:59:27.928170 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlnmp" event={"ID":"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2","Type":"ContainerStarted","Data":"148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f"} Feb 19 21:59:27 crc kubenswrapper[4771]: I0219 21:59:27.931921 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-gvnv2" event={"ID":"9b175cbb-4729-4bfd-a0e1-11d57c905286","Type":"ContainerStarted","Data":"a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15"} Feb 19 21:59:28 crc kubenswrapper[4771]: I0219 21:59:28.948563 4771 generic.go:334] "Generic (PLEG): container finished" podID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerID="148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f" exitCode=0 Feb 19 21:59:28 crc kubenswrapper[4771]: I0219 21:59:28.948681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlnmp" event={"ID":"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2","Type":"ContainerDied","Data":"148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f"} Feb 19 21:59:28 crc kubenswrapper[4771]: I0219 21:59:28.955274 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerID="a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15" exitCode=0 Feb 19 21:59:28 crc kubenswrapper[4771]: I0219 21:59:28.955348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvnv2" event={"ID":"9b175cbb-4729-4bfd-a0e1-11d57c905286","Type":"ContainerDied","Data":"a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15"} Feb 19 21:59:29 crc kubenswrapper[4771]: I0219 21:59:29.965670 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlnmp" event={"ID":"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2","Type":"ContainerStarted","Data":"4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4"} Feb 19 21:59:29 crc kubenswrapper[4771]: I0219 21:59:29.968986 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvnv2" 
event={"ID":"9b175cbb-4729-4bfd-a0e1-11d57c905286","Type":"ContainerStarted","Data":"7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498"}
Feb 19 21:59:30 crc kubenswrapper[4771]: I0219 21:59:30.010953 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gvnv2" podStartSLOduration=2.545302037 podStartE2EDuration="5.010938812s" podCreationTimestamp="2026-02-19 21:59:25 +0000 UTC" firstStartedPulling="2026-02-19 21:59:26.910103305 +0000 UTC m=+1867.181545775" lastFinishedPulling="2026-02-19 21:59:29.37574005 +0000 UTC m=+1869.647182550" observedRunningTime="2026-02-19 21:59:30.00894704 +0000 UTC m=+1870.280389520" watchObservedRunningTime="2026-02-19 21:59:30.010938812 +0000 UTC m=+1870.282381282"
Feb 19 21:59:30 crc kubenswrapper[4771]: I0219 21:59:30.013388 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hlnmp" podStartSLOduration=2.567514736 podStartE2EDuration="5.013381417s" podCreationTimestamp="2026-02-19 21:59:25 +0000 UTC" firstStartedPulling="2026-02-19 21:59:26.908436821 +0000 UTC m=+1867.179879291" lastFinishedPulling="2026-02-19 21:59:29.354303462 +0000 UTC m=+1869.625745972" observedRunningTime="2026-02-19 21:59:29.989094735 +0000 UTC m=+1870.260537235" watchObservedRunningTime="2026-02-19 21:59:30.013381417 +0000 UTC m=+1870.284823887"
Feb 19 21:59:32 crc kubenswrapper[4771]: I0219 21:59:32.746201 4771 scope.go:117] "RemoveContainer" containerID="cdce0b0c182a30ee60311ed46bbd115e0505327a1cb4095a7212d2553f1b9638"
Feb 19 21:59:32 crc kubenswrapper[4771]: I0219 21:59:32.770961 4771 scope.go:117] "RemoveContainer" containerID="b125e97d3aa387a74d59769b99c8a8b7c2519de1b84fed4d8cf928e890c90ec4"
Feb 19 21:59:33 crc kubenswrapper[4771]: I0219 21:59:33.438322 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a"
Feb 19 21:59:33 crc kubenswrapper[4771]:
E0219 21:59:33.438805 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 21:59:36 crc kubenswrapper[4771]: I0219 21:59:36.129818 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hlnmp"
Feb 19 21:59:36 crc kubenswrapper[4771]: I0219 21:59:36.130088 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hlnmp"
Feb 19 21:59:36 crc kubenswrapper[4771]: I0219 21:59:36.206792 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hlnmp"
Feb 19 21:59:36 crc kubenswrapper[4771]: I0219 21:59:36.356459 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gvnv2"
Feb 19 21:59:36 crc kubenswrapper[4771]: I0219 21:59:36.356529 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gvnv2"
Feb 19 21:59:36 crc kubenswrapper[4771]: I0219 21:59:36.433207 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gvnv2"
Feb 19 21:59:37 crc kubenswrapper[4771]: I0219 21:59:37.099394 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gvnv2"
Feb 19 21:59:37 crc kubenswrapper[4771]: I0219 21:59:37.102704 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hlnmp"
Feb 19 21:59:37 crc
kubenswrapper[4771]: I0219 21:59:37.656567 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvnv2"]
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.048461 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gvnv2" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerName="registry-server" containerID="cri-o://7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498" gracePeriod=2
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.458314 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hlnmp"]
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.458792 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hlnmp" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerName="registry-server" containerID="cri-o://4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4" gracePeriod=2
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.616091 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvnv2"
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.791530 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-catalog-content\") pod \"9b175cbb-4729-4bfd-a0e1-11d57c905286\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") "
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.791608 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxsfs\" (UniqueName: \"kubernetes.io/projected/9b175cbb-4729-4bfd-a0e1-11d57c905286-kube-api-access-vxsfs\") pod \"9b175cbb-4729-4bfd-a0e1-11d57c905286\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") "
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.791679 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-utilities\") pod \"9b175cbb-4729-4bfd-a0e1-11d57c905286\" (UID: \"9b175cbb-4729-4bfd-a0e1-11d57c905286\") "
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.792745 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-utilities" (OuterVolumeSpecName: "utilities") pod "9b175cbb-4729-4bfd-a0e1-11d57c905286" (UID: "9b175cbb-4729-4bfd-a0e1-11d57c905286"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.793128 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.798804 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b175cbb-4729-4bfd-a0e1-11d57c905286-kube-api-access-vxsfs" (OuterVolumeSpecName: "kube-api-access-vxsfs") pod "9b175cbb-4729-4bfd-a0e1-11d57c905286" (UID: "9b175cbb-4729-4bfd-a0e1-11d57c905286"). InnerVolumeSpecName "kube-api-access-vxsfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.845493 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b175cbb-4729-4bfd-a0e1-11d57c905286" (UID: "9b175cbb-4729-4bfd-a0e1-11d57c905286"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.886378 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-hlnmp"
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.894624 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b175cbb-4729-4bfd-a0e1-11d57c905286-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.894658 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxsfs\" (UniqueName: \"kubernetes.io/projected/9b175cbb-4729-4bfd-a0e1-11d57c905286-kube-api-access-vxsfs\") on node \"crc\" DevicePath \"\""
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.996067 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9js6\" (UniqueName: \"kubernetes.io/projected/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-kube-api-access-f9js6\") pod \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") "
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.996464 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-catalog-content\") pod \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") "
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.996667 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-utilities\") pod \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\" (UID: \"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2\") "
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.997785 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-utilities" (OuterVolumeSpecName: "utilities") pod "5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" (UID:
"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:59:39 crc kubenswrapper[4771]: I0219 21:59:39.999789 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-kube-api-access-f9js6" (OuterVolumeSpecName: "kube-api-access-f9js6") pod "5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" (UID: "5ec2d319-d1c5-49dc-810b-d2bc1be80cf2"). InnerVolumeSpecName "kube-api-access-f9js6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.058429 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerID="7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498" exitCode=0
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.058510 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvnv2"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.058543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvnv2" event={"ID":"9b175cbb-4729-4bfd-a0e1-11d57c905286","Type":"ContainerDied","Data":"7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498"}
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.058582 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvnv2" event={"ID":"9b175cbb-4729-4bfd-a0e1-11d57c905286","Type":"ContainerDied","Data":"5b948889a04968cb9ce177d433686dec113df535eb8adf65020e928010730035"}
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.058607 4771 scope.go:117] "RemoveContainer" containerID="7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.062738 4771 generic.go:334] "Generic (PLEG): container finished"
podID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerID="4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4" exitCode=0
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.062846 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hlnmp"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.063004 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlnmp" event={"ID":"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2","Type":"ContainerDied","Data":"4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4"}
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.063188 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlnmp" event={"ID":"5ec2d319-d1c5-49dc-810b-d2bc1be80cf2","Type":"ContainerDied","Data":"dad962b41ff42b1ff97a2cbcc6d07c68e52c36da94c26a51d9eeb80e5075f37b"}
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.079653 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" (UID: "5ec2d319-d1c5-49dc-810b-d2bc1be80cf2"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.090699 4771 scope.go:117] "RemoveContainer" containerID="a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.098281 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9js6\" (UniqueName: \"kubernetes.io/projected/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-kube-api-access-f9js6\") on node \"crc\" DevicePath \"\""
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.098321 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.098341 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.116351 4771 scope.go:117] "RemoveContainer" containerID="ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.116593 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvnv2"]
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.124688 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvnv2"]
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.135073 4771 scope.go:117] "RemoveContainer" containerID="7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498"
Feb 19 21:59:40 crc kubenswrapper[4771]: E0219 21:59:40.135608 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498\": container with ID starting with 7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498 not found: ID does not exist" containerID="7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.135641 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498"} err="failed to get container status \"7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498\": rpc error: code = NotFound desc = could not find container \"7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498\": container with ID starting with 7dbe8ca80bb059a760b8a48b68b0e9825009f1a835c6bcdc30e8ad43a4a73498 not found: ID does not exist"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.135662 4771 scope.go:117] "RemoveContainer" containerID="a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15"
Feb 19 21:59:40 crc kubenswrapper[4771]: E0219 21:59:40.135894 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15\": container with ID starting with a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15 not found: ID does not exist" containerID="a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.135917 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15"} err="failed to get container status \"a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15\": rpc error: code = NotFound desc = could not find container \"a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15\": container with ID
starting with a63f5c7195577d0e3bbe9e208fca7331be0519e86a76e8064d2f310be3393a15 not found: ID does not exist"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.135932 4771 scope.go:117] "RemoveContainer" containerID="ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6"
Feb 19 21:59:40 crc kubenswrapper[4771]: E0219 21:59:40.136159 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6\": container with ID starting with ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6 not found: ID does not exist" containerID="ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.136182 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6"} err="failed to get container status \"ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6\": rpc error: code = NotFound desc = could not find container \"ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6\": container with ID starting with ad50fe06a2cebfea539d74d178850a7c2fb58a06a65aa63aef295f76eba89ec6 not found: ID does not exist"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.136197 4771 scope.go:117] "RemoveContainer" containerID="4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.154677 4771 scope.go:117] "RemoveContainer" containerID="148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.173715 4771 scope.go:117] "RemoveContainer" containerID="a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.222807 4771 scope.go:117] "RemoveContainer"
containerID="4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4"
Feb 19 21:59:40 crc kubenswrapper[4771]: E0219 21:59:40.223373 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4\": container with ID starting with 4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4 not found: ID does not exist" containerID="4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.223407 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4"} err="failed to get container status \"4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4\": rpc error: code = NotFound desc = could not find container \"4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4\": container with ID starting with 4f47b24e2658ec1143ec2b55108a1ded5df7b201d71f3dc736a309ee09d18ad4 not found: ID does not exist"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.223447 4771 scope.go:117] "RemoveContainer" containerID="148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f"
Feb 19 21:59:40 crc kubenswrapper[4771]: E0219 21:59:40.223841 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f\": container with ID starting with 148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f not found: ID does not exist" containerID="148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.223900 4771 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f"} err="failed to get container status \"148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f\": rpc error: code = NotFound desc = could not find container \"148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f\": container with ID starting with 148c6662a3f0af59d81f683094a4abbfec5ce84b151b2d48b8cb022256942e3f not found: ID does not exist"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.223918 4771 scope.go:117] "RemoveContainer" containerID="a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32"
Feb 19 21:59:40 crc kubenswrapper[4771]: E0219 21:59:40.224386 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32\": container with ID starting with a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32 not found: ID does not exist" containerID="a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.224561 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32"} err="failed to get container status \"a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32\": rpc error: code = NotFound desc = could not find container \"a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32\": container with ID starting with a8f83483d1c60ab66fad209a41a2514cd33ea03a390188adf1ecf12c8be0ac32 not found: ID does not exist"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.399429 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hlnmp"]
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.403873 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api"
pods=["openshift-marketplace/certified-operators-hlnmp"]
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.447113 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" path="/var/lib/kubelet/pods/5ec2d319-d1c5-49dc-810b-d2bc1be80cf2/volumes"
Feb 19 21:59:40 crc kubenswrapper[4771]: I0219 21:59:40.447689 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" path="/var/lib/kubelet/pods/9b175cbb-4729-4bfd-a0e1-11d57c905286/volumes"
Feb 19 21:59:45 crc kubenswrapper[4771]: I0219 21:59:45.437312 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a"
Feb 19 21:59:45 crc kubenswrapper[4771]: E0219 21:59:45.438002 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 21:59:57 crc kubenswrapper[4771]: I0219 21:59:57.437577 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a"
Feb 19 21:59:57 crc kubenswrapper[4771]: E0219 21:59:57.438640 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.164507 4771 kubelet.go:2421] "SyncLoop
ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"]
Feb 19 22:00:00 crc kubenswrapper[4771]: E0219 22:00:00.166848 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerName="extract-content"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.167143 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerName="extract-content"
Feb 19 22:00:00 crc kubenswrapper[4771]: E0219 22:00:00.167345 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerName="registry-server"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.167521 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerName="registry-server"
Feb 19 22:00:00 crc kubenswrapper[4771]: E0219 22:00:00.167707 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerName="extract-content"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.167869 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerName="extract-content"
Feb 19 22:00:00 crc kubenswrapper[4771]: E0219 22:00:00.168092 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerName="extract-utilities"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.168314 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerName="extract-utilities"
Feb 19 22:00:00 crc kubenswrapper[4771]: E0219 22:00:00.168491 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerName="registry-server"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.168676 4771 state_mem.go:107] "Deleted CPUSet
assignment" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerName="registry-server"
Feb 19 22:00:00 crc kubenswrapper[4771]: E0219 22:00:00.169201 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerName="extract-utilities"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.169419 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerName="extract-utilities"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.169901 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec2d319-d1c5-49dc-810b-d2bc1be80cf2" containerName="registry-server"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.170144 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b175cbb-4729-4bfd-a0e1-11d57c905286" containerName="registry-server"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.171083 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.178372 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"]
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.218788 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.219099 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.315081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f68f6324-2e48-46b5-9bdb-66f1bd733a07-config-volume\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.315465 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjh6\" (UniqueName: \"kubernetes.io/projected/f68f6324-2e48-46b5-9bdb-66f1bd733a07-kube-api-access-mxjh6\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.315676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f68f6324-2e48-46b5-9bdb-66f1bd733a07-secret-volume\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") "
pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.417671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f68f6324-2e48-46b5-9bdb-66f1bd733a07-config-volume\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.417804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjh6\" (UniqueName: \"kubernetes.io/projected/f68f6324-2e48-46b5-9bdb-66f1bd733a07-kube-api-access-mxjh6\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.417869 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f68f6324-2e48-46b5-9bdb-66f1bd733a07-secret-volume\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.419556 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f68f6324-2e48-46b5-9bdb-66f1bd733a07-config-volume\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"
Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.436132 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/f68f6324-2e48-46b5-9bdb-66f1bd733a07-secret-volume\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm" Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.450520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjh6\" (UniqueName: \"kubernetes.io/projected/f68f6324-2e48-46b5-9bdb-66f1bd733a07-kube-api-access-mxjh6\") pod \"collect-profiles-29525640-5wwgm\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm" Feb 19 22:00:00 crc kubenswrapper[4771]: I0219 22:00:00.552756 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm" Feb 19 22:00:01 crc kubenswrapper[4771]: I0219 22:00:01.424844 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"] Feb 19 22:00:02 crc kubenswrapper[4771]: I0219 22:00:02.283139 4771 generic.go:334] "Generic (PLEG): container finished" podID="f68f6324-2e48-46b5-9bdb-66f1bd733a07" containerID="be2c97b0a9c68c4a56643d637f40257dbf699d21d472caf754e5fd57419152c7" exitCode=0 Feb 19 22:00:02 crc kubenswrapper[4771]: I0219 22:00:02.283216 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm" event={"ID":"f68f6324-2e48-46b5-9bdb-66f1bd733a07","Type":"ContainerDied","Data":"be2c97b0a9c68c4a56643d637f40257dbf699d21d472caf754e5fd57419152c7"} Feb 19 22:00:02 crc kubenswrapper[4771]: I0219 22:00:02.283512 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm" 
event={"ID":"f68f6324-2e48-46b5-9bdb-66f1bd733a07","Type":"ContainerStarted","Data":"1cbefd7c8f54a46d481a64e8fdad6be2d107d22644afd8b4e40c4f81a1b44173"} Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.609110 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm" Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.714508 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxjh6\" (UniqueName: \"kubernetes.io/projected/f68f6324-2e48-46b5-9bdb-66f1bd733a07-kube-api-access-mxjh6\") pod \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.714557 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f68f6324-2e48-46b5-9bdb-66f1bd733a07-secret-volume\") pod \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.714609 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f68f6324-2e48-46b5-9bdb-66f1bd733a07-config-volume\") pod \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\" (UID: \"f68f6324-2e48-46b5-9bdb-66f1bd733a07\") " Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.715364 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68f6324-2e48-46b5-9bdb-66f1bd733a07-config-volume" (OuterVolumeSpecName: "config-volume") pod "f68f6324-2e48-46b5-9bdb-66f1bd733a07" (UID: "f68f6324-2e48-46b5-9bdb-66f1bd733a07"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.715639 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f68f6324-2e48-46b5-9bdb-66f1bd733a07-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.720526 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68f6324-2e48-46b5-9bdb-66f1bd733a07-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f68f6324-2e48-46b5-9bdb-66f1bd733a07" (UID: "f68f6324-2e48-46b5-9bdb-66f1bd733a07"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.720703 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68f6324-2e48-46b5-9bdb-66f1bd733a07-kube-api-access-mxjh6" (OuterVolumeSpecName: "kube-api-access-mxjh6") pod "f68f6324-2e48-46b5-9bdb-66f1bd733a07" (UID: "f68f6324-2e48-46b5-9bdb-66f1bd733a07"). InnerVolumeSpecName "kube-api-access-mxjh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.816751 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxjh6\" (UniqueName: \"kubernetes.io/projected/f68f6324-2e48-46b5-9bdb-66f1bd733a07-kube-api-access-mxjh6\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:03 crc kubenswrapper[4771]: I0219 22:00:03.816789 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f68f6324-2e48-46b5-9bdb-66f1bd733a07-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:04 crc kubenswrapper[4771]: I0219 22:00:04.331072 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm" event={"ID":"f68f6324-2e48-46b5-9bdb-66f1bd733a07","Type":"ContainerDied","Data":"1cbefd7c8f54a46d481a64e8fdad6be2d107d22644afd8b4e40c4f81a1b44173"} Feb 19 22:00:04 crc kubenswrapper[4771]: I0219 22:00:04.331121 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm" Feb 19 22:00:04 crc kubenswrapper[4771]: I0219 22:00:04.331137 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cbefd7c8f54a46d481a64e8fdad6be2d107d22644afd8b4e40c4f81a1b44173" Feb 19 22:00:04 crc kubenswrapper[4771]: I0219 22:00:04.711736 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm"] Feb 19 22:00:04 crc kubenswrapper[4771]: I0219 22:00:04.723335 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525595-bvfmm"] Feb 19 22:00:06 crc kubenswrapper[4771]: I0219 22:00:06.454935 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06df3295-b598-4767-b2c1-6fe0a9dbaf37" path="/var/lib/kubelet/pods/06df3295-b598-4767-b2c1-6fe0a9dbaf37/volumes" Feb 19 22:00:12 crc kubenswrapper[4771]: I0219 22:00:12.437724 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:00:12 crc kubenswrapper[4771]: E0219 22:00:12.438426 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:00:23 crc kubenswrapper[4771]: I0219 22:00:23.437905 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:00:23 crc kubenswrapper[4771]: E0219 22:00:23.439349 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:00:32 crc kubenswrapper[4771]: I0219 22:00:32.830461 4771 scope.go:117] "RemoveContainer" containerID="2a98662daf41e5fbf0654b65df289550b89a916a8e22da75da3fee17f2c09438" Feb 19 22:00:38 crc kubenswrapper[4771]: I0219 22:00:38.437638 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:00:38 crc kubenswrapper[4771]: E0219 22:00:38.438898 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:00:50 crc kubenswrapper[4771]: I0219 22:00:50.448209 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:00:50 crc kubenswrapper[4771]: E0219 22:00:50.451130 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:01:03 crc kubenswrapper[4771]: I0219 22:01:03.437308 4771 scope.go:117] "RemoveContainer" 
containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:01:03 crc kubenswrapper[4771]: E0219 22:01:03.438186 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:01:17 crc kubenswrapper[4771]: I0219 22:01:17.438110 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:01:17 crc kubenswrapper[4771]: E0219 22:01:17.439242 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:01:30 crc kubenswrapper[4771]: I0219 22:01:30.444266 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:01:30 crc kubenswrapper[4771]: E0219 22:01:30.445174 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:01:44 crc kubenswrapper[4771]: I0219 22:01:44.439783 4771 scope.go:117] 
"RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:01:45 crc kubenswrapper[4771]: I0219 22:01:45.251552 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"5731bc4a4550e5c4c9bd904ba27496fb9d0a649f2b64e7cffa22c1d6bfb60f44"} Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.611348 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z4mm7"] Feb 19 22:03:06 crc kubenswrapper[4771]: E0219 22:03:06.612126 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68f6324-2e48-46b5-9bdb-66f1bd733a07" containerName="collect-profiles" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.612138 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68f6324-2e48-46b5-9bdb-66f1bd733a07" containerName="collect-profiles" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.612284 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68f6324-2e48-46b5-9bdb-66f1bd733a07" containerName="collect-profiles" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.614123 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.623607 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z4mm7"] Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.768951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-utilities\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.769009 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-catalog-content\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.769250 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2nzk\" (UniqueName: \"kubernetes.io/projected/79e97aaf-5e5e-4f2b-84a8-3320014006ee-kube-api-access-g2nzk\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.870438 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2nzk\" (UniqueName: \"kubernetes.io/projected/79e97aaf-5e5e-4f2b-84a8-3320014006ee-kube-api-access-g2nzk\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.870581 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-utilities\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.870643 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-catalog-content\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.871377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-catalog-content\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.871377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-utilities\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.897361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2nzk\" (UniqueName: \"kubernetes.io/projected/79e97aaf-5e5e-4f2b-84a8-3320014006ee-kube-api-access-g2nzk\") pod \"community-operators-z4mm7\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:06 crc kubenswrapper[4771]: I0219 22:03:06.952156 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:07 crc kubenswrapper[4771]: I0219 22:03:07.444123 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z4mm7"] Feb 19 22:03:07 crc kubenswrapper[4771]: I0219 22:03:07.980236 4771 generic.go:334] "Generic (PLEG): container finished" podID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerID="8a15ed464b79a9f3343419dbd88864cf08cd6d2f2b925c204a97b47cc4c1bcf5" exitCode=0 Feb 19 22:03:07 crc kubenswrapper[4771]: I0219 22:03:07.980310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mm7" event={"ID":"79e97aaf-5e5e-4f2b-84a8-3320014006ee","Type":"ContainerDied","Data":"8a15ed464b79a9f3343419dbd88864cf08cd6d2f2b925c204a97b47cc4c1bcf5"} Feb 19 22:03:07 crc kubenswrapper[4771]: I0219 22:03:07.980351 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mm7" event={"ID":"79e97aaf-5e5e-4f2b-84a8-3320014006ee","Type":"ContainerStarted","Data":"356ae726ce1c6bd33e4b8ca5a8f5fe2ae8baf36d08ddc60994d39d4582efaa49"} Feb 19 22:03:08 crc kubenswrapper[4771]: I0219 22:03:08.991716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mm7" event={"ID":"79e97aaf-5e5e-4f2b-84a8-3320014006ee","Type":"ContainerStarted","Data":"3a4c83d2cbdffec57c0e59694b713205f6bbd5a2f530298e4e9bc6ce59469182"} Feb 19 22:03:10 crc kubenswrapper[4771]: I0219 22:03:10.005760 4771 generic.go:334] "Generic (PLEG): container finished" podID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerID="3a4c83d2cbdffec57c0e59694b713205f6bbd5a2f530298e4e9bc6ce59469182" exitCode=0 Feb 19 22:03:10 crc kubenswrapper[4771]: I0219 22:03:10.005923 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mm7" 
event={"ID":"79e97aaf-5e5e-4f2b-84a8-3320014006ee","Type":"ContainerDied","Data":"3a4c83d2cbdffec57c0e59694b713205f6bbd5a2f530298e4e9bc6ce59469182"} Feb 19 22:03:11 crc kubenswrapper[4771]: I0219 22:03:11.015073 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mm7" event={"ID":"79e97aaf-5e5e-4f2b-84a8-3320014006ee","Type":"ContainerStarted","Data":"118047f456e13082646e3eb0141cb15fe43741f959a16fba62d3bf1fd2e2a155"} Feb 19 22:03:11 crc kubenswrapper[4771]: I0219 22:03:11.042994 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z4mm7" podStartSLOduration=2.607446183 podStartE2EDuration="5.042974821s" podCreationTimestamp="2026-02-19 22:03:06 +0000 UTC" firstStartedPulling="2026-02-19 22:03:07.982725167 +0000 UTC m=+2088.254167677" lastFinishedPulling="2026-02-19 22:03:10.418253805 +0000 UTC m=+2090.689696315" observedRunningTime="2026-02-19 22:03:11.038356789 +0000 UTC m=+2091.309799299" watchObservedRunningTime="2026-02-19 22:03:11.042974821 +0000 UTC m=+2091.314417301" Feb 19 22:03:16 crc kubenswrapper[4771]: I0219 22:03:16.953357 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:16 crc kubenswrapper[4771]: I0219 22:03:16.954126 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:17 crc kubenswrapper[4771]: I0219 22:03:17.035660 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:17 crc kubenswrapper[4771]: I0219 22:03:17.145831 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:21 crc kubenswrapper[4771]: I0219 22:03:21.882560 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-z4mm7"] Feb 19 22:03:21 crc kubenswrapper[4771]: I0219 22:03:21.886976 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z4mm7" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerName="registry-server" containerID="cri-o://118047f456e13082646e3eb0141cb15fe43741f959a16fba62d3bf1fd2e2a155" gracePeriod=2 Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.120052 4771 generic.go:334] "Generic (PLEG): container finished" podID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerID="118047f456e13082646e3eb0141cb15fe43741f959a16fba62d3bf1fd2e2a155" exitCode=0 Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.120105 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mm7" event={"ID":"79e97aaf-5e5e-4f2b-84a8-3320014006ee","Type":"ContainerDied","Data":"118047f456e13082646e3eb0141cb15fe43741f959a16fba62d3bf1fd2e2a155"} Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.335709 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.517968 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-utilities\") pod \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.518044 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-catalog-content\") pod \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.518152 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2nzk\" (UniqueName: \"kubernetes.io/projected/79e97aaf-5e5e-4f2b-84a8-3320014006ee-kube-api-access-g2nzk\") pod \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\" (UID: \"79e97aaf-5e5e-4f2b-84a8-3320014006ee\") " Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.518853 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-utilities" (OuterVolumeSpecName: "utilities") pod "79e97aaf-5e5e-4f2b-84a8-3320014006ee" (UID: "79e97aaf-5e5e-4f2b-84a8-3320014006ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.523199 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e97aaf-5e5e-4f2b-84a8-3320014006ee-kube-api-access-g2nzk" (OuterVolumeSpecName: "kube-api-access-g2nzk") pod "79e97aaf-5e5e-4f2b-84a8-3320014006ee" (UID: "79e97aaf-5e5e-4f2b-84a8-3320014006ee"). InnerVolumeSpecName "kube-api-access-g2nzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.567059 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79e97aaf-5e5e-4f2b-84a8-3320014006ee" (UID: "79e97aaf-5e5e-4f2b-84a8-3320014006ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.619649 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.619695 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e97aaf-5e5e-4f2b-84a8-3320014006ee-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:03:22 crc kubenswrapper[4771]: I0219 22:03:22.619709 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2nzk\" (UniqueName: \"kubernetes.io/projected/79e97aaf-5e5e-4f2b-84a8-3320014006ee-kube-api-access-g2nzk\") on node \"crc\" DevicePath \"\"" Feb 19 22:03:23 crc kubenswrapper[4771]: I0219 22:03:23.129978 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4mm7" event={"ID":"79e97aaf-5e5e-4f2b-84a8-3320014006ee","Type":"ContainerDied","Data":"356ae726ce1c6bd33e4b8ca5a8f5fe2ae8baf36d08ddc60994d39d4582efaa49"} Feb 19 22:03:23 crc kubenswrapper[4771]: I0219 22:03:23.130039 4771 scope.go:117] "RemoveContainer" containerID="118047f456e13082646e3eb0141cb15fe43741f959a16fba62d3bf1fd2e2a155" Feb 19 22:03:23 crc kubenswrapper[4771]: I0219 22:03:23.131153 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z4mm7" Feb 19 22:03:23 crc kubenswrapper[4771]: I0219 22:03:23.178409 4771 scope.go:117] "RemoveContainer" containerID="3a4c83d2cbdffec57c0e59694b713205f6bbd5a2f530298e4e9bc6ce59469182" Feb 19 22:03:23 crc kubenswrapper[4771]: I0219 22:03:23.211225 4771 scope.go:117] "RemoveContainer" containerID="8a15ed464b79a9f3343419dbd88864cf08cd6d2f2b925c204a97b47cc4c1bcf5" Feb 19 22:03:23 crc kubenswrapper[4771]: I0219 22:03:23.212672 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z4mm7"] Feb 19 22:03:23 crc kubenswrapper[4771]: I0219 22:03:23.219411 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z4mm7"] Feb 19 22:03:24 crc kubenswrapper[4771]: I0219 22:03:24.452896 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" path="/var/lib/kubelet/pods/79e97aaf-5e5e-4f2b-84a8-3320014006ee/volumes" Feb 19 22:04:12 crc kubenswrapper[4771]: I0219 22:04:12.957011 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:04:12 crc kubenswrapper[4771]: I0219 22:04:12.957651 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:04:42 crc kubenswrapper[4771]: I0219 22:04:42.956969 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:04:42 crc kubenswrapper[4771]: I0219 22:04:42.957846 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:05:12 crc kubenswrapper[4771]: I0219 22:05:12.957500 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:05:12 crc kubenswrapper[4771]: I0219 22:05:12.958310 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:05:12 crc kubenswrapper[4771]: I0219 22:05:12.958419 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 22:05:12 crc kubenswrapper[4771]: I0219 22:05:12.959369 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5731bc4a4550e5c4c9bd904ba27496fb9d0a649f2b64e7cffa22c1d6bfb60f44"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:05:12 crc kubenswrapper[4771]: I0219 22:05:12.959469 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://5731bc4a4550e5c4c9bd904ba27496fb9d0a649f2b64e7cffa22c1d6bfb60f44" gracePeriod=600 Feb 19 22:05:13 crc kubenswrapper[4771]: I0219 22:05:13.187231 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="5731bc4a4550e5c4c9bd904ba27496fb9d0a649f2b64e7cffa22c1d6bfb60f44" exitCode=0 Feb 19 22:05:13 crc kubenswrapper[4771]: I0219 22:05:13.187327 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"5731bc4a4550e5c4c9bd904ba27496fb9d0a649f2b64e7cffa22c1d6bfb60f44"} Feb 19 22:05:13 crc kubenswrapper[4771]: I0219 22:05:13.188161 4771 scope.go:117] "RemoveContainer" containerID="74c9d01d5a61a501544cda3cf9371048aa498d98a59435cba52762d6c99fbf8a" Feb 19 22:05:14 crc kubenswrapper[4771]: I0219 22:05:14.200007 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"} Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.903537 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2txkq"] Feb 19 22:05:55 crc kubenswrapper[4771]: E0219 22:05:55.904285 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerName="registry-server" Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.904298 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerName="registry-server" 
Feb 19 22:05:55 crc kubenswrapper[4771]: E0219 22:05:55.904315 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerName="extract-content" Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.904321 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerName="extract-content" Feb 19 22:05:55 crc kubenswrapper[4771]: E0219 22:05:55.904334 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerName="extract-utilities" Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.904340 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerName="extract-utilities" Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.904470 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e97aaf-5e5e-4f2b-84a8-3320014006ee" containerName="registry-server" Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.905415 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.968935 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2txkq"] Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.980482 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f42qs\" (UniqueName: \"kubernetes.io/projected/f2ad490b-08fb-43ce-9d76-96f3c36871d6-kube-api-access-f42qs\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.980525 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-utilities\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:55 crc kubenswrapper[4771]: I0219 22:05:55.980585 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-catalog-content\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.082543 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-catalog-content\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.082728 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-f42qs\" (UniqueName: \"kubernetes.io/projected/f2ad490b-08fb-43ce-9d76-96f3c36871d6-kube-api-access-f42qs\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.082772 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-utilities\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.083140 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-catalog-content\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.083491 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-utilities\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.108526 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f42qs\" (UniqueName: \"kubernetes.io/projected/f2ad490b-08fb-43ce-9d76-96f3c36871d6-kube-api-access-f42qs\") pod \"redhat-operators-2txkq\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.227417 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.489588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2txkq"] Feb 19 22:05:56 crc kubenswrapper[4771]: I0219 22:05:56.565816 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2txkq" event={"ID":"f2ad490b-08fb-43ce-9d76-96f3c36871d6","Type":"ContainerStarted","Data":"e1a44c652548b5980146770ef8845fae38ea6fb24e6e7031ad7bd83d0f76db14"} Feb 19 22:05:57 crc kubenswrapper[4771]: I0219 22:05:57.574397 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerID="7fcffb0fc9c26df141cc654f3b8f8ea72235b413242dd5a99582bb5c2927ea1c" exitCode=0 Feb 19 22:05:57 crc kubenswrapper[4771]: I0219 22:05:57.574466 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2txkq" event={"ID":"f2ad490b-08fb-43ce-9d76-96f3c36871d6","Type":"ContainerDied","Data":"7fcffb0fc9c26df141cc654f3b8f8ea72235b413242dd5a99582bb5c2927ea1c"} Feb 19 22:05:57 crc kubenswrapper[4771]: I0219 22:05:57.577722 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:05:59 crc kubenswrapper[4771]: I0219 22:05:59.593955 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerID="4fcb5b436838ffa17d317cec32caab581bf1d21842bcbe8a386c1ec6e16968d2" exitCode=0 Feb 19 22:05:59 crc kubenswrapper[4771]: I0219 22:05:59.593998 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2txkq" event={"ID":"f2ad490b-08fb-43ce-9d76-96f3c36871d6","Type":"ContainerDied","Data":"4fcb5b436838ffa17d317cec32caab581bf1d21842bcbe8a386c1ec6e16968d2"} Feb 19 22:06:00 crc kubenswrapper[4771]: I0219 22:06:00.604622 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2txkq" event={"ID":"f2ad490b-08fb-43ce-9d76-96f3c36871d6","Type":"ContainerStarted","Data":"88af2643f830e1cbb248c1a8396450b33644544a2eb057e2611707cf8c709fd8"} Feb 19 22:06:00 crc kubenswrapper[4771]: I0219 22:06:00.635576 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2txkq" podStartSLOduration=2.977128398 podStartE2EDuration="5.635554982s" podCreationTimestamp="2026-02-19 22:05:55 +0000 UTC" firstStartedPulling="2026-02-19 22:05:57.57709666 +0000 UTC m=+2257.848539180" lastFinishedPulling="2026-02-19 22:06:00.235523254 +0000 UTC m=+2260.506965764" observedRunningTime="2026-02-19 22:06:00.634448233 +0000 UTC m=+2260.905890753" watchObservedRunningTime="2026-02-19 22:06:00.635554982 +0000 UTC m=+2260.906997482" Feb 19 22:06:06 crc kubenswrapper[4771]: I0219 22:06:06.227544 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:06:06 crc kubenswrapper[4771]: I0219 22:06:06.228196 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:06:07 crc kubenswrapper[4771]: I0219 22:06:07.279982 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2txkq" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="registry-server" probeResult="failure" output=< Feb 19 22:06:07 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 22:06:07 crc kubenswrapper[4771]: > Feb 19 22:06:16 crc kubenswrapper[4771]: I0219 22:06:16.302283 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:06:16 crc kubenswrapper[4771]: I0219 22:06:16.371347 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2txkq" 
Feb 19 22:06:16 crc kubenswrapper[4771]: I0219 22:06:16.550202 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2txkq"] Feb 19 22:06:18 crc kubenswrapper[4771]: I0219 22:06:18.103421 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2txkq" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="registry-server" containerID="cri-o://88af2643f830e1cbb248c1a8396450b33644544a2eb057e2611707cf8c709fd8" gracePeriod=2 Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.116711 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerID="88af2643f830e1cbb248c1a8396450b33644544a2eb057e2611707cf8c709fd8" exitCode=0 Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.116800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2txkq" event={"ID":"f2ad490b-08fb-43ce-9d76-96f3c36871d6","Type":"ContainerDied","Data":"88af2643f830e1cbb248c1a8396450b33644544a2eb057e2611707cf8c709fd8"} Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.197863 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.346550 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-catalog-content\") pod \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.349286 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-utilities\") pod \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.349409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f42qs\" (UniqueName: \"kubernetes.io/projected/f2ad490b-08fb-43ce-9d76-96f3c36871d6-kube-api-access-f42qs\") pod \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\" (UID: \"f2ad490b-08fb-43ce-9d76-96f3c36871d6\") " Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.350758 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-utilities" (OuterVolumeSpecName: "utilities") pod "f2ad490b-08fb-43ce-9d76-96f3c36871d6" (UID: "f2ad490b-08fb-43ce-9d76-96f3c36871d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.356536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ad490b-08fb-43ce-9d76-96f3c36871d6-kube-api-access-f42qs" (OuterVolumeSpecName: "kube-api-access-f42qs") pod "f2ad490b-08fb-43ce-9d76-96f3c36871d6" (UID: "f2ad490b-08fb-43ce-9d76-96f3c36871d6"). InnerVolumeSpecName "kube-api-access-f42qs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.451682 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.451732 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f42qs\" (UniqueName: \"kubernetes.io/projected/f2ad490b-08fb-43ce-9d76-96f3c36871d6-kube-api-access-f42qs\") on node \"crc\" DevicePath \"\"" Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.507547 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2ad490b-08fb-43ce-9d76-96f3c36871d6" (UID: "f2ad490b-08fb-43ce-9d76-96f3c36871d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:06:19 crc kubenswrapper[4771]: I0219 22:06:19.553568 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2ad490b-08fb-43ce-9d76-96f3c36871d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:06:20 crc kubenswrapper[4771]: I0219 22:06:20.127127 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2txkq" event={"ID":"f2ad490b-08fb-43ce-9d76-96f3c36871d6","Type":"ContainerDied","Data":"e1a44c652548b5980146770ef8845fae38ea6fb24e6e7031ad7bd83d0f76db14"} Feb 19 22:06:20 crc kubenswrapper[4771]: I0219 22:06:20.127191 4771 scope.go:117] "RemoveContainer" containerID="88af2643f830e1cbb248c1a8396450b33644544a2eb057e2611707cf8c709fd8" Feb 19 22:06:20 crc kubenswrapper[4771]: I0219 22:06:20.127264 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2txkq" Feb 19 22:06:20 crc kubenswrapper[4771]: I0219 22:06:20.143141 4771 scope.go:117] "RemoveContainer" containerID="4fcb5b436838ffa17d317cec32caab581bf1d21842bcbe8a386c1ec6e16968d2" Feb 19 22:06:20 crc kubenswrapper[4771]: I0219 22:06:20.161239 4771 scope.go:117] "RemoveContainer" containerID="7fcffb0fc9c26df141cc654f3b8f8ea72235b413242dd5a99582bb5c2927ea1c" Feb 19 22:06:20 crc kubenswrapper[4771]: I0219 22:06:20.166565 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2txkq"] Feb 19 22:06:20 crc kubenswrapper[4771]: I0219 22:06:20.174547 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2txkq"] Feb 19 22:06:20 crc kubenswrapper[4771]: I0219 22:06:20.451550 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" path="/var/lib/kubelet/pods/f2ad490b-08fb-43ce-9d76-96f3c36871d6/volumes" Feb 19 22:07:42 crc kubenswrapper[4771]: I0219 22:07:42.957123 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:07:42 crc kubenswrapper[4771]: I0219 22:07:42.957716 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:08:12 crc kubenswrapper[4771]: I0219 22:08:12.957063 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:08:12 crc kubenswrapper[4771]: I0219 22:08:12.958073 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:08:42 crc kubenswrapper[4771]: I0219 22:08:42.956582 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:08:42 crc kubenswrapper[4771]: I0219 22:08:42.957403 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:08:42 crc kubenswrapper[4771]: I0219 22:08:42.957473 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 22:08:42 crc kubenswrapper[4771]: I0219 22:08:42.959195 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:08:42 crc kubenswrapper[4771]: I0219 22:08:42.959306 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" gracePeriod=600 Feb 19 22:08:43 crc kubenswrapper[4771]: E0219 22:08:43.094392 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:08:43 crc kubenswrapper[4771]: I0219 22:08:43.699054 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" exitCode=0 Feb 19 22:08:43 crc kubenswrapper[4771]: I0219 22:08:43.699099 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"} Feb 19 22:08:43 crc kubenswrapper[4771]: I0219 22:08:43.699174 4771 scope.go:117] "RemoveContainer" containerID="5731bc4a4550e5c4c9bd904ba27496fb9d0a649f2b64e7cffa22c1d6bfb60f44" Feb 19 22:08:43 crc kubenswrapper[4771]: I0219 22:08:43.699690 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:08:43 crc kubenswrapper[4771]: E0219 22:08:43.700041 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:08:54 crc kubenswrapper[4771]: I0219 22:08:54.438060 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:08:54 crc kubenswrapper[4771]: E0219 22:08:54.440743 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:09:06 crc kubenswrapper[4771]: I0219 22:09:06.437600 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:09:06 crc kubenswrapper[4771]: E0219 22:09:06.438534 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:09:17 crc kubenswrapper[4771]: I0219 22:09:17.437951 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:09:17 crc kubenswrapper[4771]: E0219 22:09:17.439316 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:09:28 crc kubenswrapper[4771]: I0219 22:09:28.436986 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:09:28 crc kubenswrapper[4771]: E0219 22:09:28.437843 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:09:42 crc kubenswrapper[4771]: I0219 22:09:42.437908 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:09:42 crc kubenswrapper[4771]: E0219 22:09:42.438694 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:09:55 crc kubenswrapper[4771]: I0219 22:09:55.437698 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:09:55 crc kubenswrapper[4771]: E0219 22:09:55.438887 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.312933 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5l964"] Feb 19 22:10:00 crc kubenswrapper[4771]: E0219 22:10:00.315763 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="registry-server" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.316266 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="registry-server" Feb 19 22:10:00 crc kubenswrapper[4771]: E0219 22:10:00.316412 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="extract-utilities" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.316534 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="extract-utilities" Feb 19 22:10:00 crc kubenswrapper[4771]: E0219 22:10:00.316690 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="extract-content" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.316825 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="extract-content" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.317235 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ad490b-08fb-43ce-9d76-96f3c36871d6" containerName="registry-server" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.319136 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.326994 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l964"] Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.415240 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-utilities\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.415339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkn4\" (UniqueName: \"kubernetes.io/projected/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-kube-api-access-crkn4\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.415401 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-catalog-content\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.516687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-utilities\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.516787 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-crkn4\" (UniqueName: \"kubernetes.io/projected/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-kube-api-access-crkn4\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.516824 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-catalog-content\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.517630 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-catalog-content\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.517622 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-utilities\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.541200 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkn4\" (UniqueName: \"kubernetes.io/projected/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-kube-api-access-crkn4\") pod \"redhat-marketplace-5l964\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") " pod="openshift-marketplace/redhat-marketplace-5l964" Feb 19 22:10:00 crc kubenswrapper[4771]: I0219 22:10:00.646454 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l964"
Feb 19 22:10:01 crc kubenswrapper[4771]: I0219 22:10:01.122083 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l964"]
Feb 19 22:10:01 crc kubenswrapper[4771]: I0219 22:10:01.457465 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerID="872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20" exitCode=0
Feb 19 22:10:01 crc kubenswrapper[4771]: I0219 22:10:01.457555 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l964" event={"ID":"0d18648e-d8db-4fa4-bbf6-48c139f28fbd","Type":"ContainerDied","Data":"872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20"}
Feb 19 22:10:01 crc kubenswrapper[4771]: I0219 22:10:01.457849 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l964" event={"ID":"0d18648e-d8db-4fa4-bbf6-48c139f28fbd","Type":"ContainerStarted","Data":"258f5124daadbdfdb86457f8e35f11becde6457efade662eb0c66ad106f649c4"}
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.480366 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerID="fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7" exitCode=0
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.480451 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l964" event={"ID":"0d18648e-d8db-4fa4-bbf6-48c139f28fbd","Type":"ContainerDied","Data":"fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7"}
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.671566 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d64z8"]
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.681481 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.692333 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d64z8"]
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.784395 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmv6\" (UniqueName: \"kubernetes.io/projected/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-kube-api-access-jsmv6\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.784504 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-utilities\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.784566 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-catalog-content\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.885431 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-catalog-content\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.885556 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsmv6\" (UniqueName: \"kubernetes.io/projected/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-kube-api-access-jsmv6\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.885613 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-utilities\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.886273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-catalog-content\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.886428 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-utilities\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:04 crc kubenswrapper[4771]: I0219 22:10:04.913803 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsmv6\" (UniqueName: \"kubernetes.io/projected/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-kube-api-access-jsmv6\") pod \"certified-operators-d64z8\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") " pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:05 crc kubenswrapper[4771]: I0219 22:10:05.017101 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:05 crc kubenswrapper[4771]: I0219 22:10:05.488195 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l964" event={"ID":"0d18648e-d8db-4fa4-bbf6-48c139f28fbd","Type":"ContainerStarted","Data":"0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc"}
Feb 19 22:10:05 crc kubenswrapper[4771]: I0219 22:10:05.513886 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5l964" podStartSLOduration=2.11146964 podStartE2EDuration="5.513865685s" podCreationTimestamp="2026-02-19 22:10:00 +0000 UTC" firstStartedPulling="2026-02-19 22:10:01.4595965 +0000 UTC m=+2501.731039010" lastFinishedPulling="2026-02-19 22:10:04.861992575 +0000 UTC m=+2505.133435055" observedRunningTime="2026-02-19 22:10:05.509916441 +0000 UTC m=+2505.781358931" watchObservedRunningTime="2026-02-19 22:10:05.513865685 +0000 UTC m=+2505.785308165"
Feb 19 22:10:05 crc kubenswrapper[4771]: I0219 22:10:05.530484 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d64z8"]
Feb 19 22:10:06 crc kubenswrapper[4771]: I0219 22:10:06.500977 4771 generic.go:334] "Generic (PLEG): container finished" podID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerID="162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563" exitCode=0
Feb 19 22:10:06 crc kubenswrapper[4771]: I0219 22:10:06.501202 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d64z8" event={"ID":"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3","Type":"ContainerDied","Data":"162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563"}
Feb 19 22:10:06 crc kubenswrapper[4771]: I0219 22:10:06.501489 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d64z8" event={"ID":"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3","Type":"ContainerStarted","Data":"2a9eb21bc05884a0047c415fd511469affa36c55059e359e23e785793208fa88"}
Feb 19 22:10:07 crc kubenswrapper[4771]: I0219 22:10:07.515735 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d64z8" event={"ID":"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3","Type":"ContainerStarted","Data":"eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d"}
Feb 19 22:10:08 crc kubenswrapper[4771]: I0219 22:10:08.528627 4771 generic.go:334] "Generic (PLEG): container finished" podID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerID="eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d" exitCode=0
Feb 19 22:10:08 crc kubenswrapper[4771]: I0219 22:10:08.528692 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d64z8" event={"ID":"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3","Type":"ContainerDied","Data":"eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d"}
Feb 19 22:10:09 crc kubenswrapper[4771]: I0219 22:10:09.551817 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d64z8" event={"ID":"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3","Type":"ContainerStarted","Data":"860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89"}
Feb 19 22:10:09 crc kubenswrapper[4771]: I0219 22:10:09.573393 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d64z8" podStartSLOduration=3.097660957 podStartE2EDuration="5.573373899s" podCreationTimestamp="2026-02-19 22:10:04 +0000 UTC" firstStartedPulling="2026-02-19 22:10:06.503171456 +0000 UTC m=+2506.774613936" lastFinishedPulling="2026-02-19 22:10:08.978884368 +0000 UTC m=+2509.250326878" observedRunningTime="2026-02-19 22:10:09.572180327 +0000 UTC m=+2509.843622877" watchObservedRunningTime="2026-02-19 22:10:09.573373899 +0000 UTC m=+2509.844816379"
Feb 19 22:10:10 crc kubenswrapper[4771]: I0219 22:10:10.445227 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:10:10 crc kubenswrapper[4771]: E0219 22:10:10.445602 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:10:10 crc kubenswrapper[4771]: I0219 22:10:10.647293 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5l964"
Feb 19 22:10:10 crc kubenswrapper[4771]: I0219 22:10:10.647374 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5l964"
Feb 19 22:10:10 crc kubenswrapper[4771]: I0219 22:10:10.714932 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5l964"
Feb 19 22:10:11 crc kubenswrapper[4771]: I0219 22:10:11.654681 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5l964"
Feb 19 22:10:12 crc kubenswrapper[4771]: I0219 22:10:12.862375 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l964"]
Feb 19 22:10:13 crc kubenswrapper[4771]: I0219 22:10:13.588935 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5l964" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerName="registry-server" containerID="cri-o://0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc" gracePeriod=2
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.070208 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l964"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.233477 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-utilities\") pod \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") "
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.233646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-catalog-content\") pod \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") "
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.233751 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crkn4\" (UniqueName: \"kubernetes.io/projected/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-kube-api-access-crkn4\") pod \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\" (UID: \"0d18648e-d8db-4fa4-bbf6-48c139f28fbd\") "
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.235710 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-utilities" (OuterVolumeSpecName: "utilities") pod "0d18648e-d8db-4fa4-bbf6-48c139f28fbd" (UID: "0d18648e-d8db-4fa4-bbf6-48c139f28fbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.240873 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-kube-api-access-crkn4" (OuterVolumeSpecName: "kube-api-access-crkn4") pod "0d18648e-d8db-4fa4-bbf6-48c139f28fbd" (UID: "0d18648e-d8db-4fa4-bbf6-48c139f28fbd"). InnerVolumeSpecName "kube-api-access-crkn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.268670 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d18648e-d8db-4fa4-bbf6-48c139f28fbd" (UID: "0d18648e-d8db-4fa4-bbf6-48c139f28fbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.335514 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.335570 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.335600 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crkn4\" (UniqueName: \"kubernetes.io/projected/0d18648e-d8db-4fa4-bbf6-48c139f28fbd-kube-api-access-crkn4\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.599882 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerID="0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc" exitCode=0
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.599988 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l964"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.600060 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l964" event={"ID":"0d18648e-d8db-4fa4-bbf6-48c139f28fbd","Type":"ContainerDied","Data":"0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc"}
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.600550 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l964" event={"ID":"0d18648e-d8db-4fa4-bbf6-48c139f28fbd","Type":"ContainerDied","Data":"258f5124daadbdfdb86457f8e35f11becde6457efade662eb0c66ad106f649c4"}
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.600982 4771 scope.go:117] "RemoveContainer" containerID="0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.627924 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l964"]
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.634754 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l964"]
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.636438 4771 scope.go:117] "RemoveContainer" containerID="fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.661077 4771 scope.go:117] "RemoveContainer" containerID="872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.688730 4771 scope.go:117] "RemoveContainer" containerID="0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc"
Feb 19 22:10:14 crc kubenswrapper[4771]: E0219 22:10:14.689517 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc\": container with ID starting with 0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc not found: ID does not exist" containerID="0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.689579 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc"} err="failed to get container status \"0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc\": rpc error: code = NotFound desc = could not find container \"0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc\": container with ID starting with 0ab308eace0f9150d03108d211cf588df10e00cf737c9b9b006cf26eab25f5bc not found: ID does not exist"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.689621 4771 scope.go:117] "RemoveContainer" containerID="fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7"
Feb 19 22:10:14 crc kubenswrapper[4771]: E0219 22:10:14.690586 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7\": container with ID starting with fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7 not found: ID does not exist" containerID="fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.690629 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7"} err="failed to get container status \"fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7\": rpc error: code = NotFound desc = could not find container \"fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7\": container with ID starting with fba4fbe438ef23b6f43c5a23d59ec6011a129fc458a7de62221986e729d579f7 not found: ID does not exist"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.690656 4771 scope.go:117] "RemoveContainer" containerID="872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20"
Feb 19 22:10:14 crc kubenswrapper[4771]: E0219 22:10:14.691106 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20\": container with ID starting with 872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20 not found: ID does not exist" containerID="872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20"
Feb 19 22:10:14 crc kubenswrapper[4771]: I0219 22:10:14.691152 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20"} err="failed to get container status \"872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20\": rpc error: code = NotFound desc = could not find container \"872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20\": container with ID starting with 872f4f84cc4c00be390377e3f758374545750ae6a79eb5e775e02ed619d93e20 not found: ID does not exist"
Feb 19 22:10:15 crc kubenswrapper[4771]: I0219 22:10:15.018058 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:15 crc kubenswrapper[4771]: I0219 22:10:15.018134 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:15 crc kubenswrapper[4771]: I0219 22:10:15.094526 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:15 crc kubenswrapper[4771]: I0219 22:10:15.684858 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:16 crc kubenswrapper[4771]: I0219 22:10:16.456165 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" path="/var/lib/kubelet/pods/0d18648e-d8db-4fa4-bbf6-48c139f28fbd/volumes"
Feb 19 22:10:17 crc kubenswrapper[4771]: I0219 22:10:17.263504 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d64z8"]
Feb 19 22:10:17 crc kubenswrapper[4771]: I0219 22:10:17.626028 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d64z8" podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerName="registry-server" containerID="cri-o://860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89" gracePeriod=2
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.188833 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.299977 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsmv6\" (UniqueName: \"kubernetes.io/projected/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-kube-api-access-jsmv6\") pod \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") "
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.300155 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-utilities\") pod \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") "
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.300209 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-catalog-content\") pod \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\" (UID: \"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3\") "
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.301800 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-utilities" (OuterVolumeSpecName: "utilities") pod "b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" (UID: "b896dfba-2d2f-44f9-84a2-14f8ecf7aab3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.311585 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-kube-api-access-jsmv6" (OuterVolumeSpecName: "kube-api-access-jsmv6") pod "b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" (UID: "b896dfba-2d2f-44f9-84a2-14f8ecf7aab3"). InnerVolumeSpecName "kube-api-access-jsmv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.389549 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" (UID: "b896dfba-2d2f-44f9-84a2-14f8ecf7aab3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.402433 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.402503 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsmv6\" (UniqueName: \"kubernetes.io/projected/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-kube-api-access-jsmv6\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.402529 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.639470 4771 generic.go:334] "Generic (PLEG): container finished" podID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerID="860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89" exitCode=0
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.639863 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d64z8" event={"ID":"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3","Type":"ContainerDied","Data":"860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89"}
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.639907 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d64z8" event={"ID":"b896dfba-2d2f-44f9-84a2-14f8ecf7aab3","Type":"ContainerDied","Data":"2a9eb21bc05884a0047c415fd511469affa36c55059e359e23e785793208fa88"}
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.639939 4771 scope.go:117] "RemoveContainer" containerID="860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.640241 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d64z8"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.682088 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d64z8"]
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.685803 4771 scope.go:117] "RemoveContainer" containerID="eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.692126 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d64z8"]
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.713352 4771 scope.go:117] "RemoveContainer" containerID="162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.772061 4771 scope.go:117] "RemoveContainer" containerID="860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89"
Feb 19 22:10:18 crc kubenswrapper[4771]: E0219 22:10:18.772747 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89\": container with ID starting with 860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89 not found: ID does not exist" containerID="860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.772800 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89"} err="failed to get container status \"860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89\": rpc error: code = NotFound desc = could not find container \"860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89\": container with ID starting with 860be6f5e06a01495966efe5d8dc3445fefc0698c803139115bed7cf4ae9ae89 not found: ID does not exist"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.772836 4771 scope.go:117] "RemoveContainer" containerID="eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d"
Feb 19 22:10:18 crc kubenswrapper[4771]: E0219 22:10:18.773634 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d\": container with ID starting with eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d not found: ID does not exist" containerID="eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.773682 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d"} err="failed to get container status \"eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d\": rpc error: code = NotFound desc = could not find container \"eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d\": container with ID starting with eecee33bdf69a8f55c22bba4c4d0dfd7af1fc5b5a40324fbfb11d61ee75e4f5d not found: ID does not exist"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.773711 4771 scope.go:117] "RemoveContainer" containerID="162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563"
Feb 19 22:10:18 crc kubenswrapper[4771]: E0219 22:10:18.774226 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563\": container with ID starting with 162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563 not found: ID does not exist" containerID="162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563"
Feb 19 22:10:18 crc kubenswrapper[4771]: I0219 22:10:18.774311 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563"} err="failed to get container status \"162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563\": rpc error: code = NotFound desc = could not find container \"162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563\": container with ID starting with 162cdcf4bc9eda761497acf0381cd7de15eff59c6dae9b1635d48472d45cc563 not found: ID does not exist"
Feb 19 22:10:20 crc kubenswrapper[4771]: I0219 22:10:20.460952 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" path="/var/lib/kubelet/pods/b896dfba-2d2f-44f9-84a2-14f8ecf7aab3/volumes"
Feb 19 22:10:21 crc kubenswrapper[4771]: I0219 22:10:21.437568 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:10:21 crc kubenswrapper[4771]: E0219 22:10:21.437970 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:10:36 crc kubenswrapper[4771]: I0219 22:10:36.437931 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:10:36 crc kubenswrapper[4771]: E0219 22:10:36.439096 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:10:50 crc kubenswrapper[4771]: I0219 22:10:50.442630 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:10:50 crc kubenswrapper[4771]: E0219 22:10:50.443304 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:11:04 crc kubenswrapper[4771]: I0219 22:11:04.438261 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:11:04 crc kubenswrapper[4771]: E0219 22:11:04.442464 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:11:16 crc kubenswrapper[4771]: I0219 22:11:16.440533 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:11:16 crc kubenswrapper[4771]: E0219 22:11:16.443972 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:11:27 crc kubenswrapper[4771]: I0219 22:11:27.437196 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:11:27 crc kubenswrapper[4771]: E0219 22:11:27.438172 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:11:39 crc kubenswrapper[4771]: I0219 22:11:39.437749 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:11:39 crc kubenswrapper[4771]: E0219 22:11:39.438708 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:11:54 crc kubenswrapper[4771]: I0219 22:11:54.437781 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:11:54 crc kubenswrapper[4771]: E0219 22:11:54.438744 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:12:09 crc kubenswrapper[4771]: I0219 22:12:09.437687 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:12:09 crc kubenswrapper[4771]: E0219 22:12:09.438592 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:12:23 crc kubenswrapper[4771]: I0219 22:12:23.438213 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:12:23 crc kubenswrapper[4771]: E0219 22:12:23.439524 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\""
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:12:34 crc kubenswrapper[4771]: I0219 22:12:34.447195 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:12:34 crc kubenswrapper[4771]: E0219 22:12:34.448608 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:12:47 crc kubenswrapper[4771]: I0219 22:12:47.440368 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:12:47 crc kubenswrapper[4771]: E0219 22:12:47.441811 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:12:59 crc kubenswrapper[4771]: I0219 22:12:59.438384 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:12:59 crc kubenswrapper[4771]: E0219 22:12:59.439367 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.447364 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:13:10 crc kubenswrapper[4771]: E0219 22:13:10.448407 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.510308 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9zmpc"] Feb 19 22:13:10 crc kubenswrapper[4771]: E0219 22:13:10.510792 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerName="extract-utilities" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.510831 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerName="extract-utilities" Feb 19 22:13:10 crc kubenswrapper[4771]: E0219 22:13:10.510865 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerName="registry-server" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.510882 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerName="registry-server" Feb 19 22:13:10 crc kubenswrapper[4771]: E0219 22:13:10.510909 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerName="extract-utilities" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.510927 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerName="extract-utilities" Feb 19 22:13:10 crc kubenswrapper[4771]: E0219 22:13:10.510959 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerName="extract-content" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.510972 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerName="extract-content" Feb 19 22:13:10 crc kubenswrapper[4771]: E0219 22:13:10.511004 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerName="registry-server" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.511059 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerName="registry-server" Feb 19 22:13:10 crc kubenswrapper[4771]: E0219 22:13:10.511080 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerName="extract-content" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.511098 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerName="extract-content" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.511448 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d18648e-d8db-4fa4-bbf6-48c139f28fbd" containerName="registry-server" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.511492 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b896dfba-2d2f-44f9-84a2-14f8ecf7aab3" containerName="registry-server" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.513329 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.545199 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zmpc"] Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.685128 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jkdv\" (UniqueName: \"kubernetes.io/projected/b9be75f5-8845-4ea6-a97d-c0dc83363e84-kube-api-access-9jkdv\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.685546 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-utilities\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.685577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-catalog-content\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.787529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-utilities\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.787580 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-catalog-content\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.787634 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jkdv\" (UniqueName: \"kubernetes.io/projected/b9be75f5-8845-4ea6-a97d-c0dc83363e84-kube-api-access-9jkdv\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.788098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-utilities\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.788176 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-catalog-content\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.805292 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jkdv\" (UniqueName: \"kubernetes.io/projected/b9be75f5-8845-4ea6-a97d-c0dc83363e84-kube-api-access-9jkdv\") pod \"community-operators-9zmpc\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:10 crc kubenswrapper[4771]: I0219 22:13:10.854299 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:11 crc kubenswrapper[4771]: I0219 22:13:11.367946 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zmpc"] Feb 19 22:13:12 crc kubenswrapper[4771]: I0219 22:13:12.202097 4771 generic.go:334] "Generic (PLEG): container finished" podID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerID="07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46" exitCode=0 Feb 19 22:13:12 crc kubenswrapper[4771]: I0219 22:13:12.202171 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmpc" event={"ID":"b9be75f5-8845-4ea6-a97d-c0dc83363e84","Type":"ContainerDied","Data":"07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46"} Feb 19 22:13:12 crc kubenswrapper[4771]: I0219 22:13:12.202411 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmpc" event={"ID":"b9be75f5-8845-4ea6-a97d-c0dc83363e84","Type":"ContainerStarted","Data":"2f1b1f76a54f84baf2c7aab53b94359a68f7f059178ca72ba2b11221a3262662"} Feb 19 22:13:12 crc kubenswrapper[4771]: I0219 22:13:12.205675 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:13:14 crc kubenswrapper[4771]: I0219 22:13:14.228393 4771 generic.go:334] "Generic (PLEG): container finished" podID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerID="a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142" exitCode=0 Feb 19 22:13:14 crc kubenswrapper[4771]: I0219 22:13:14.228485 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmpc" event={"ID":"b9be75f5-8845-4ea6-a97d-c0dc83363e84","Type":"ContainerDied","Data":"a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142"} Feb 19 22:13:15 crc kubenswrapper[4771]: I0219 22:13:15.254591 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-9zmpc" event={"ID":"b9be75f5-8845-4ea6-a97d-c0dc83363e84","Type":"ContainerStarted","Data":"29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76"} Feb 19 22:13:15 crc kubenswrapper[4771]: I0219 22:13:15.292463 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9zmpc" podStartSLOduration=2.835111757 podStartE2EDuration="5.292435202s" podCreationTimestamp="2026-02-19 22:13:10 +0000 UTC" firstStartedPulling="2026-02-19 22:13:12.20530598 +0000 UTC m=+2692.476748480" lastFinishedPulling="2026-02-19 22:13:14.662629415 +0000 UTC m=+2694.934071925" observedRunningTime="2026-02-19 22:13:15.282289722 +0000 UTC m=+2695.553732262" watchObservedRunningTime="2026-02-19 22:13:15.292435202 +0000 UTC m=+2695.563877712" Feb 19 22:13:20 crc kubenswrapper[4771]: I0219 22:13:20.854620 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:20 crc kubenswrapper[4771]: I0219 22:13:20.855303 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:20 crc kubenswrapper[4771]: I0219 22:13:20.922133 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:21 crc kubenswrapper[4771]: I0219 22:13:21.383944 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:21 crc kubenswrapper[4771]: I0219 22:13:21.446412 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zmpc"] Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.329697 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9zmpc" 
podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerName="registry-server" containerID="cri-o://29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76" gracePeriod=2 Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.438303 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:13:23 crc kubenswrapper[4771]: E0219 22:13:23.438677 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.812817 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.835476 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-utilities\") pod \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.835520 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jkdv\" (UniqueName: \"kubernetes.io/projected/b9be75f5-8845-4ea6-a97d-c0dc83363e84-kube-api-access-9jkdv\") pod \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.835833 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-catalog-content\") pod \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\" (UID: \"b9be75f5-8845-4ea6-a97d-c0dc83363e84\") " Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.847184 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9be75f5-8845-4ea6-a97d-c0dc83363e84-kube-api-access-9jkdv" (OuterVolumeSpecName: "kube-api-access-9jkdv") pod "b9be75f5-8845-4ea6-a97d-c0dc83363e84" (UID: "b9be75f5-8845-4ea6-a97d-c0dc83363e84"). InnerVolumeSpecName "kube-api-access-9jkdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.850123 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-utilities" (OuterVolumeSpecName: "utilities") pod "b9be75f5-8845-4ea6-a97d-c0dc83363e84" (UID: "b9be75f5-8845-4ea6-a97d-c0dc83363e84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.939141 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:13:23 crc kubenswrapper[4771]: I0219 22:13:23.939174 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jkdv\" (UniqueName: \"kubernetes.io/projected/b9be75f5-8845-4ea6-a97d-c0dc83363e84-kube-api-access-9jkdv\") on node \"crc\" DevicePath \"\"" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.211844 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9be75f5-8845-4ea6-a97d-c0dc83363e84" (UID: "b9be75f5-8845-4ea6-a97d-c0dc83363e84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.242860 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9be75f5-8845-4ea6-a97d-c0dc83363e84-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.340621 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zmpc" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.340623 4771 generic.go:334] "Generic (PLEG): container finished" podID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerID="29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76" exitCode=0 Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.341957 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmpc" event={"ID":"b9be75f5-8845-4ea6-a97d-c0dc83363e84","Type":"ContainerDied","Data":"29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76"} Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.342390 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmpc" event={"ID":"b9be75f5-8845-4ea6-a97d-c0dc83363e84","Type":"ContainerDied","Data":"2f1b1f76a54f84baf2c7aab53b94359a68f7f059178ca72ba2b11221a3262662"} Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.342518 4771 scope.go:117] "RemoveContainer" containerID="29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.368487 4771 scope.go:117] "RemoveContainer" containerID="a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.397492 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zmpc"] Feb 19 22:13:24 crc kubenswrapper[4771]: 
I0219 22:13:24.410117 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9zmpc"] Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.412841 4771 scope.go:117] "RemoveContainer" containerID="07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.432299 4771 scope.go:117] "RemoveContainer" containerID="29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76" Feb 19 22:13:24 crc kubenswrapper[4771]: E0219 22:13:24.432726 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76\": container with ID starting with 29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76 not found: ID does not exist" containerID="29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.432777 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76"} err="failed to get container status \"29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76\": rpc error: code = NotFound desc = could not find container \"29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76\": container with ID starting with 29c5fa5d10dd75bfe78bbe4c5162cd7e4db527ac1a768a4532e1e76d4b20ee76 not found: ID does not exist" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.432812 4771 scope.go:117] "RemoveContainer" containerID="a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142" Feb 19 22:13:24 crc kubenswrapper[4771]: E0219 22:13:24.433317 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142\": container 
with ID starting with a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142 not found: ID does not exist" containerID="a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.433356 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142"} err="failed to get container status \"a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142\": rpc error: code = NotFound desc = could not find container \"a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142\": container with ID starting with a1785df56360702c63fe0d8ee9cad3286229471094183f31037239e081d7c142 not found: ID does not exist" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.433384 4771 scope.go:117] "RemoveContainer" containerID="07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46" Feb 19 22:13:24 crc kubenswrapper[4771]: E0219 22:13:24.433723 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46\": container with ID starting with 07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46 not found: ID does not exist" containerID="07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.433794 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46"} err="failed to get container status \"07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46\": rpc error: code = NotFound desc = could not find container \"07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46\": container with ID starting with 07d9d966632c77745b070ab863c79ba507d1a70ef32298ce85406ddbef394f46 not 
found: ID does not exist" Feb 19 22:13:24 crc kubenswrapper[4771]: I0219 22:13:24.452905 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" path="/var/lib/kubelet/pods/b9be75f5-8845-4ea6-a97d-c0dc83363e84/volumes" Feb 19 22:13:34 crc kubenswrapper[4771]: I0219 22:13:34.437181 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:13:34 crc kubenswrapper[4771]: E0219 22:13:34.437910 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:13:45 crc kubenswrapper[4771]: I0219 22:13:45.437172 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b" Feb 19 22:13:46 crc kubenswrapper[4771]: I0219 22:13:46.569774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"9431670b9ee98d72e79c5c37b86bc52b6f3d09d634a6360c205ca6d5ec5d5526"} Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.155952 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"] Feb 19 22:15:00 crc kubenswrapper[4771]: E0219 22:15:00.156803 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerName="registry-server" Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.156817 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerName="registry-server"
Feb 19 22:15:00 crc kubenswrapper[4771]: E0219 22:15:00.156830 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerName="extract-content"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.156838 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerName="extract-content"
Feb 19 22:15:00 crc kubenswrapper[4771]: E0219 22:15:00.156846 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerName="extract-utilities"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.156853 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerName="extract-utilities"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.157010 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9be75f5-8845-4ea6-a97d-c0dc83363e84" containerName="registry-server"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.157529 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.161733 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.164932 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.167250 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"]
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.285125 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-config-volume\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.285436 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6nvz\" (UniqueName: \"kubernetes.io/projected/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-kube-api-access-c6nvz\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.285598 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-secret-volume\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.387846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6nvz\" (UniqueName: \"kubernetes.io/projected/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-kube-api-access-c6nvz\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.387944 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-secret-volume\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.388064 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-config-volume\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.389907 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-config-volume\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.398074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-secret-volume\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.411069 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6nvz\" (UniqueName: \"kubernetes.io/projected/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-kube-api-access-c6nvz\") pod \"collect-profiles-29525655-qsx9f\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.484656 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:00 crc kubenswrapper[4771]: I0219 22:15:00.805963 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"]
Feb 19 22:15:01 crc kubenswrapper[4771]: I0219 22:15:01.264457 4771 generic.go:334] "Generic (PLEG): container finished" podID="8e61c602-b6d1-42a1-a13d-19d61fcb9e12" containerID="a1aec04422339f27f4205ac8acf7008279f11cb4fb400e79e945f193d2a9be7c" exitCode=0
Feb 19 22:15:01 crc kubenswrapper[4771]: I0219 22:15:01.264530 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f" event={"ID":"8e61c602-b6d1-42a1-a13d-19d61fcb9e12","Type":"ContainerDied","Data":"a1aec04422339f27f4205ac8acf7008279f11cb4fb400e79e945f193d2a9be7c"}
Feb 19 22:15:01 crc kubenswrapper[4771]: I0219 22:15:01.264577 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f" event={"ID":"8e61c602-b6d1-42a1-a13d-19d61fcb9e12","Type":"ContainerStarted","Data":"bd7cfc8fdb8a3dde707a1bd117da94011ae04c14d9894362ce5b3e9b8a902187"}
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.627118 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.825950 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-config-volume\") pod \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") "
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.826012 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-secret-volume\") pod \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") "
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.826236 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6nvz\" (UniqueName: \"kubernetes.io/projected/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-kube-api-access-c6nvz\") pod \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\" (UID: \"8e61c602-b6d1-42a1-a13d-19d61fcb9e12\") "
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.826862 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-config-volume" (OuterVolumeSpecName: "config-volume") pod "8e61c602-b6d1-42a1-a13d-19d61fcb9e12" (UID: "8e61c602-b6d1-42a1-a13d-19d61fcb9e12"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.835131 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-kube-api-access-c6nvz" (OuterVolumeSpecName: "kube-api-access-c6nvz") pod "8e61c602-b6d1-42a1-a13d-19d61fcb9e12" (UID: "8e61c602-b6d1-42a1-a13d-19d61fcb9e12"). InnerVolumeSpecName "kube-api-access-c6nvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.838317 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8e61c602-b6d1-42a1-a13d-19d61fcb9e12" (UID: "8e61c602-b6d1-42a1-a13d-19d61fcb9e12"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.928376 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6nvz\" (UniqueName: \"kubernetes.io/projected/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-kube-api-access-c6nvz\") on node \"crc\" DevicePath \"\""
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.928427 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 22:15:02 crc kubenswrapper[4771]: I0219 22:15:02.928445 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8e61c602-b6d1-42a1-a13d-19d61fcb9e12-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 22:15:03 crc kubenswrapper[4771]: I0219 22:15:03.284474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f" event={"ID":"8e61c602-b6d1-42a1-a13d-19d61fcb9e12","Type":"ContainerDied","Data":"bd7cfc8fdb8a3dde707a1bd117da94011ae04c14d9894362ce5b3e9b8a902187"}
Feb 19 22:15:03 crc kubenswrapper[4771]: I0219 22:15:03.284537 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd7cfc8fdb8a3dde707a1bd117da94011ae04c14d9894362ce5b3e9b8a902187"
Feb 19 22:15:03 crc kubenswrapper[4771]: I0219 22:15:03.284544 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"
Feb 19 22:15:03 crc kubenswrapper[4771]: I0219 22:15:03.717656 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7"]
Feb 19 22:15:03 crc kubenswrapper[4771]: I0219 22:15:03.726961 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-kggq7"]
Feb 19 22:15:04 crc kubenswrapper[4771]: I0219 22:15:04.454405 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8aebf05-d80b-4528-bc5e-4c7e158cc7e0" path="/var/lib/kubelet/pods/a8aebf05-d80b-4528-bc5e-4c7e158cc7e0/volumes"
Feb 19 22:15:33 crc kubenswrapper[4771]: I0219 22:15:33.262461 4771 scope.go:117] "RemoveContainer" containerID="eb5c38b7ec2ab3f2b06b6d44c4775348682ced0b07893cd77c09bb92e55436e6"
Feb 19 22:16:12 crc kubenswrapper[4771]: I0219 22:16:12.957283 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:16:12 crc kubenswrapper[4771]: I0219 22:16:12.957901 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:16:42 crc kubenswrapper[4771]: I0219 22:16:42.957565 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:16:42 crc kubenswrapper[4771]: I0219 22:16:42.958122 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:17:12 crc kubenswrapper[4771]: I0219 22:17:12.957296 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:17:12 crc kubenswrapper[4771]: I0219 22:17:12.957968 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:17:12 crc kubenswrapper[4771]: I0219 22:17:12.958115 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb"
Feb 19 22:17:12 crc kubenswrapper[4771]: I0219 22:17:12.959239 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9431670b9ee98d72e79c5c37b86bc52b6f3d09d634a6360c205ca6d5ec5d5526"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 22:17:12 crc kubenswrapper[4771]: I0219 22:17:12.959372 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://9431670b9ee98d72e79c5c37b86bc52b6f3d09d634a6360c205ca6d5ec5d5526" gracePeriod=600
Feb 19 22:17:13 crc kubenswrapper[4771]: I0219 22:17:13.467333 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="9431670b9ee98d72e79c5c37b86bc52b6f3d09d634a6360c205ca6d5ec5d5526" exitCode=0
Feb 19 22:17:13 crc kubenswrapper[4771]: I0219 22:17:13.467418 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"9431670b9ee98d72e79c5c37b86bc52b6f3d09d634a6360c205ca6d5ec5d5526"}
Feb 19 22:17:13 crc kubenswrapper[4771]: I0219 22:17:13.467612 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690"}
Feb 19 22:17:13 crc kubenswrapper[4771]: I0219 22:17:13.467633 4771 scope.go:117] "RemoveContainer" containerID="f765ed3f82092bc4bd89a85a62400ada9782cfe482952c91ab98b0a697791b9b"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.702624 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tnnzn"]
Feb 19 22:17:20 crc kubenswrapper[4771]: E0219 22:17:20.703421 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e61c602-b6d1-42a1-a13d-19d61fcb9e12" containerName="collect-profiles"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.703432 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61c602-b6d1-42a1-a13d-19d61fcb9e12" containerName="collect-profiles"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.703574 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e61c602-b6d1-42a1-a13d-19d61fcb9e12" containerName="collect-profiles"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.704437 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.713619 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnnzn"]
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.820622 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-utilities\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.820895 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckdq\" (UniqueName: \"kubernetes.io/projected/b38edba5-fcbd-4471-a670-569edd645941-kube-api-access-gckdq\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.821124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-catalog-content\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.922228 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-utilities\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.922282 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckdq\" (UniqueName: \"kubernetes.io/projected/b38edba5-fcbd-4471-a670-569edd645941-kube-api-access-gckdq\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.922348 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-catalog-content\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.922839 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-utilities\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.922849 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-catalog-content\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:20 crc kubenswrapper[4771]: I0219 22:17:20.941475 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckdq\" (UniqueName: \"kubernetes.io/projected/b38edba5-fcbd-4471-a670-569edd645941-kube-api-access-gckdq\") pod \"redhat-operators-tnnzn\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") " pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:21 crc kubenswrapper[4771]: I0219 22:17:21.026737 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:21 crc kubenswrapper[4771]: I0219 22:17:21.301501 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnnzn"]
Feb 19 22:17:21 crc kubenswrapper[4771]: I0219 22:17:21.542898 4771 generic.go:334] "Generic (PLEG): container finished" podID="b38edba5-fcbd-4471-a670-569edd645941" containerID="8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f" exitCode=0
Feb 19 22:17:21 crc kubenswrapper[4771]: I0219 22:17:21.542959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnnzn" event={"ID":"b38edba5-fcbd-4471-a670-569edd645941","Type":"ContainerDied","Data":"8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f"}
Feb 19 22:17:21 crc kubenswrapper[4771]: I0219 22:17:21.542985 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnnzn" event={"ID":"b38edba5-fcbd-4471-a670-569edd645941","Type":"ContainerStarted","Data":"0a25a51b9508bebcf26ab952a70508c46a70aa1ea744ba9bc34ede1260572002"}
Feb 19 22:17:22 crc kubenswrapper[4771]: I0219 22:17:22.558297 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnnzn" event={"ID":"b38edba5-fcbd-4471-a670-569edd645941","Type":"ContainerStarted","Data":"cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173"}
Feb 19 22:17:23 crc kubenswrapper[4771]: I0219 22:17:23.568611 4771 generic.go:334] "Generic (PLEG): container finished" podID="b38edba5-fcbd-4471-a670-569edd645941" containerID="cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173" exitCode=0
Feb 19 22:17:23 crc kubenswrapper[4771]: I0219 22:17:23.568658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnnzn" event={"ID":"b38edba5-fcbd-4471-a670-569edd645941","Type":"ContainerDied","Data":"cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173"}
Feb 19 22:17:24 crc kubenswrapper[4771]: I0219 22:17:24.580255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnnzn" event={"ID":"b38edba5-fcbd-4471-a670-569edd645941","Type":"ContainerStarted","Data":"ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586"}
Feb 19 22:17:24 crc kubenswrapper[4771]: I0219 22:17:24.627951 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tnnzn" podStartSLOduration=2.20853278 podStartE2EDuration="4.62791516s" podCreationTimestamp="2026-02-19 22:17:20 +0000 UTC" firstStartedPulling="2026-02-19 22:17:21.544661163 +0000 UTC m=+2941.816103633" lastFinishedPulling="2026-02-19 22:17:23.964043533 +0000 UTC m=+2944.235486013" observedRunningTime="2026-02-19 22:17:24.612720535 +0000 UTC m=+2944.884163055" watchObservedRunningTime="2026-02-19 22:17:24.62791516 +0000 UTC m=+2944.899357670"
Feb 19 22:17:31 crc kubenswrapper[4771]: I0219 22:17:31.027475 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:31 crc kubenswrapper[4771]: I0219 22:17:31.027985 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:32 crc kubenswrapper[4771]: I0219 22:17:32.089112 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tnnzn" podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="registry-server" probeResult="failure" output=<
Feb 19 22:17:32 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 22:17:32 crc kubenswrapper[4771]: >
Feb 19 22:17:41 crc kubenswrapper[4771]: I0219 22:17:41.104766 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:41 crc kubenswrapper[4771]: I0219 22:17:41.181816 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:41 crc kubenswrapper[4771]: I0219 22:17:41.364187 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnnzn"]
Feb 19 22:17:42 crc kubenswrapper[4771]: I0219 22:17:42.735701 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tnnzn" podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="registry-server" containerID="cri-o://ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586" gracePeriod=2
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.238490 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.399502 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-utilities\") pod \"b38edba5-fcbd-4471-a670-569edd645941\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") "
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.399640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gckdq\" (UniqueName: \"kubernetes.io/projected/b38edba5-fcbd-4471-a670-569edd645941-kube-api-access-gckdq\") pod \"b38edba5-fcbd-4471-a670-569edd645941\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") "
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.399786 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-catalog-content\") pod \"b38edba5-fcbd-4471-a670-569edd645941\" (UID: \"b38edba5-fcbd-4471-a670-569edd645941\") "
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.401308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-utilities" (OuterVolumeSpecName: "utilities") pod "b38edba5-fcbd-4471-a670-569edd645941" (UID: "b38edba5-fcbd-4471-a670-569edd645941"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.407088 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b38edba5-fcbd-4471-a670-569edd645941-kube-api-access-gckdq" (OuterVolumeSpecName: "kube-api-access-gckdq") pod "b38edba5-fcbd-4471-a670-569edd645941" (UID: "b38edba5-fcbd-4471-a670-569edd645941"). InnerVolumeSpecName "kube-api-access-gckdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.501298 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.501342 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gckdq\" (UniqueName: \"kubernetes.io/projected/b38edba5-fcbd-4471-a670-569edd645941-kube-api-access-gckdq\") on node \"crc\" DevicePath \"\""
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.563539 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b38edba5-fcbd-4471-a670-569edd645941" (UID: "b38edba5-fcbd-4471-a670-569edd645941"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.602849 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38edba5-fcbd-4471-a670-569edd645941-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.746406 4771 generic.go:334] "Generic (PLEG): container finished" podID="b38edba5-fcbd-4471-a670-569edd645941" containerID="ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586" exitCode=0
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.746447 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnnzn"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.746467 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnnzn" event={"ID":"b38edba5-fcbd-4471-a670-569edd645941","Type":"ContainerDied","Data":"ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586"}
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.746513 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnnzn" event={"ID":"b38edba5-fcbd-4471-a670-569edd645941","Type":"ContainerDied","Data":"0a25a51b9508bebcf26ab952a70508c46a70aa1ea744ba9bc34ede1260572002"}
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.746542 4771 scope.go:117] "RemoveContainer" containerID="ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.789065 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnnzn"]
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.789454 4771 scope.go:117] "RemoveContainer" containerID="cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.791725 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tnnzn"]
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.819954 4771 scope.go:117] "RemoveContainer" containerID="8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.856737 4771 scope.go:117] "RemoveContainer" containerID="ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586"
Feb 19 22:17:43 crc kubenswrapper[4771]: E0219 22:17:43.857174 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586\": container with ID starting with ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586 not found: ID does not exist" containerID="ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.857218 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586"} err="failed to get container status \"ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586\": rpc error: code = NotFound desc = could not find container \"ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586\": container with ID starting with ebae82a176267a2757b4e74a10f82f59df198e54280aa5382489e07f8e40e586 not found: ID does not exist"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.857244 4771 scope.go:117] "RemoveContainer" containerID="cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173"
Feb 19 22:17:43 crc kubenswrapper[4771]: E0219 22:17:43.857659 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173\": container with ID starting with cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173 not found: ID does not exist" containerID="cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.857689 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173"} err="failed to get container status \"cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173\": rpc error: code = NotFound desc = could not find container \"cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173\": container with ID starting with cd7703e8292bcfd4f7935744c5728e71e8ad51c35852a9540bd888da5ac5c173 not found: ID does not exist"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.857705 4771 scope.go:117] "RemoveContainer" containerID="8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f"
Feb 19 22:17:43 crc kubenswrapper[4771]: E0219 22:17:43.857968 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f\": container with ID starting with 8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f not found: ID does not exist" containerID="8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f"
Feb 19 22:17:43 crc kubenswrapper[4771]: I0219 22:17:43.858002 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f"} err="failed to get container status \"8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f\": rpc error: code = NotFound desc = could not find container \"8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f\": container with ID starting with 8f6075582dc09885e4bd694b8b55c75f7a8d69c5c6ae52863d231c8eba1fd54f not found: ID does not exist"
Feb 19 22:17:44 crc kubenswrapper[4771]: I0219 22:17:44.452067 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b38edba5-fcbd-4471-a670-569edd645941" path="/var/lib/kubelet/pods/b38edba5-fcbd-4471-a670-569edd645941/volumes"
Feb 19 22:19:42 crc kubenswrapper[4771]: I0219 22:19:42.957216 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:19:42 crc kubenswrapper[4771]: I0219 22:19:42.957836 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:20:12 crc kubenswrapper[4771]: I0219 22:20:12.956659 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:20:12 crc kubenswrapper[4771]: I0219 22:20:12.957357 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:20:42 crc kubenswrapper[4771]: I0219 22:20:42.957316 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:20:42 crc kubenswrapper[4771]: I0219 22:20:42.957845 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:20:42 crc kubenswrapper[4771]: I0219 22:20:42.957892 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb"
Feb 19 22:20:42 crc kubenswrapper[4771]: I0219 22:20:42.958541 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 22:20:42 crc kubenswrapper[4771]: I0219 22:20:42.958596 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" gracePeriod=600
Feb 19 22:20:43 crc kubenswrapper[4771]: E0219 22:20:43.106754 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:20:43 crc kubenswrapper[4771]: I0219 22:20:43.397804 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" exitCode=0
Feb 19 22:20:43 crc kubenswrapper[4771]: I0219 22:20:43.397902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690"}
Feb 19 22:20:43 crc
kubenswrapper[4771]: I0219 22:20:43.398219 4771 scope.go:117] "RemoveContainer" containerID="9431670b9ee98d72e79c5c37b86bc52b6f3d09d634a6360c205ca6d5ec5d5526" Feb 19 22:20:43 crc kubenswrapper[4771]: I0219 22:20:43.398999 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:20:43 crc kubenswrapper[4771]: E0219 22:20:43.399604 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:20:57 crc kubenswrapper[4771]: I0219 22:20:57.437675 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:20:57 crc kubenswrapper[4771]: E0219 22:20:57.438756 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:21:06 crc kubenswrapper[4771]: I0219 22:21:06.949359 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-scw4n"] Feb 19 22:21:06 crc kubenswrapper[4771]: E0219 22:21:06.950230 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="registry-server" Feb 19 22:21:06 crc kubenswrapper[4771]: I0219 22:21:06.950243 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="registry-server" Feb 19 22:21:06 crc kubenswrapper[4771]: E0219 22:21:06.950255 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="extract-content" Feb 19 22:21:06 crc kubenswrapper[4771]: I0219 22:21:06.950260 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="extract-content" Feb 19 22:21:06 crc kubenswrapper[4771]: E0219 22:21:06.950273 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="extract-utilities" Feb 19 22:21:06 crc kubenswrapper[4771]: I0219 22:21:06.950280 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="extract-utilities" Feb 19 22:21:06 crc kubenswrapper[4771]: I0219 22:21:06.950415 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b38edba5-fcbd-4471-a670-569edd645941" containerName="registry-server" Feb 19 22:21:06 crc kubenswrapper[4771]: I0219 22:21:06.970098 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:06 crc kubenswrapper[4771]: I0219 22:21:06.980369 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scw4n"] Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.100978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-catalog-content\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.101520 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-utilities\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.101579 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdr6\" (UniqueName: \"kubernetes.io/projected/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-kube-api-access-qsdr6\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.202665 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-catalog-content\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.202744 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-utilities\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.202786 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsdr6\" (UniqueName: \"kubernetes.io/projected/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-kube-api-access-qsdr6\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.203276 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-catalog-content\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.203280 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-utilities\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.222685 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsdr6\" (UniqueName: \"kubernetes.io/projected/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-kube-api-access-qsdr6\") pod \"certified-operators-scw4n\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.295077 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.571851 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scw4n"] Feb 19 22:21:07 crc kubenswrapper[4771]: I0219 22:21:07.638268 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scw4n" event={"ID":"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0","Type":"ContainerStarted","Data":"2bdd479039ec5cf487ebb980e5c8a0bb4f4275101aad8ee7eeabd283de8c5024"} Feb 19 22:21:08 crc kubenswrapper[4771]: I0219 22:21:08.803265 4771 generic.go:334] "Generic (PLEG): container finished" podID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerID="09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8" exitCode=0 Feb 19 22:21:08 crc kubenswrapper[4771]: I0219 22:21:08.803392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scw4n" event={"ID":"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0","Type":"ContainerDied","Data":"09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8"} Feb 19 22:21:08 crc kubenswrapper[4771]: I0219 22:21:08.804758 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:21:09 crc kubenswrapper[4771]: I0219 22:21:09.438093 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:21:09 crc kubenswrapper[4771]: E0219 22:21:09.438575 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 
22:21:09 crc kubenswrapper[4771]: I0219 22:21:09.819451 4771 generic.go:334] "Generic (PLEG): container finished" podID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerID="4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1" exitCode=0 Feb 19 22:21:09 crc kubenswrapper[4771]: I0219 22:21:09.819494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scw4n" event={"ID":"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0","Type":"ContainerDied","Data":"4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1"} Feb 19 22:21:10 crc kubenswrapper[4771]: I0219 22:21:10.827494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scw4n" event={"ID":"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0","Type":"ContainerStarted","Data":"5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282"} Feb 19 22:21:10 crc kubenswrapper[4771]: I0219 22:21:10.857847 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-scw4n" podStartSLOduration=3.129712648 podStartE2EDuration="4.857823533s" podCreationTimestamp="2026-02-19 22:21:06 +0000 UTC" firstStartedPulling="2026-02-19 22:21:08.804566574 +0000 UTC m=+3169.076009044" lastFinishedPulling="2026-02-19 22:21:10.532677459 +0000 UTC m=+3170.804119929" observedRunningTime="2026-02-19 22:21:10.857239708 +0000 UTC m=+3171.128682198" watchObservedRunningTime="2026-02-19 22:21:10.857823533 +0000 UTC m=+3171.129266043" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.339524 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kw8kv"] Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.343159 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.367546 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw8kv"] Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.475223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-catalog-content\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.475385 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7h4j\" (UniqueName: \"kubernetes.io/projected/18a212d1-6c73-4fe5-8cf2-e486049c6b86-kube-api-access-l7h4j\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.475434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-utilities\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.576819 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-catalog-content\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.577268 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l7h4j\" (UniqueName: \"kubernetes.io/projected/18a212d1-6c73-4fe5-8cf2-e486049c6b86-kube-api-access-l7h4j\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.577468 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-utilities\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.577794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-catalog-content\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.578010 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-utilities\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.640257 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7h4j\" (UniqueName: \"kubernetes.io/projected/18a212d1-6c73-4fe5-8cf2-e486049c6b86-kube-api-access-l7h4j\") pod \"redhat-marketplace-kw8kv\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.671602 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:13 crc kubenswrapper[4771]: I0219 22:21:13.956527 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw8kv"] Feb 19 22:21:14 crc kubenswrapper[4771]: E0219 22:21:14.288979 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a212d1_6c73_4fe5_8cf2_e486049c6b86.slice/crio-conmon-2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a212d1_6c73_4fe5_8cf2_e486049c6b86.slice/crio-2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7.scope\": RecentStats: unable to find data in memory cache]" Feb 19 22:21:14 crc kubenswrapper[4771]: I0219 22:21:14.867114 4771 generic.go:334] "Generic (PLEG): container finished" podID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerID="2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7" exitCode=0 Feb 19 22:21:14 crc kubenswrapper[4771]: I0219 22:21:14.867272 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw8kv" event={"ID":"18a212d1-6c73-4fe5-8cf2-e486049c6b86","Type":"ContainerDied","Data":"2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7"} Feb 19 22:21:14 crc kubenswrapper[4771]: I0219 22:21:14.867568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw8kv" event={"ID":"18a212d1-6c73-4fe5-8cf2-e486049c6b86","Type":"ContainerStarted","Data":"66cc1c2ce3ae7964ba6d925a0de5d6ddc3edbb367a2895eeb2da6d5e8d5b83c7"} Feb 19 22:21:15 crc kubenswrapper[4771]: I0219 22:21:15.878699 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw8kv" 
event={"ID":"18a212d1-6c73-4fe5-8cf2-e486049c6b86","Type":"ContainerStarted","Data":"755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299"} Feb 19 22:21:16 crc kubenswrapper[4771]: I0219 22:21:16.891852 4771 generic.go:334] "Generic (PLEG): container finished" podID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerID="755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299" exitCode=0 Feb 19 22:21:16 crc kubenswrapper[4771]: I0219 22:21:16.891992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw8kv" event={"ID":"18a212d1-6c73-4fe5-8cf2-e486049c6b86","Type":"ContainerDied","Data":"755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299"} Feb 19 22:21:17 crc kubenswrapper[4771]: I0219 22:21:17.295415 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:17 crc kubenswrapper[4771]: I0219 22:21:17.295484 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:17 crc kubenswrapper[4771]: I0219 22:21:17.384567 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:17 crc kubenswrapper[4771]: I0219 22:21:17.903944 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw8kv" event={"ID":"18a212d1-6c73-4fe5-8cf2-e486049c6b86","Type":"ContainerStarted","Data":"a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7"} Feb 19 22:21:17 crc kubenswrapper[4771]: I0219 22:21:17.938413 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kw8kv" podStartSLOduration=2.5025450879999998 podStartE2EDuration="4.93835161s" podCreationTimestamp="2026-02-19 22:21:13 +0000 UTC" firstStartedPulling="2026-02-19 22:21:14.869100043 +0000 UTC 
m=+3175.140542513" lastFinishedPulling="2026-02-19 22:21:17.304906525 +0000 UTC m=+3177.576349035" observedRunningTime="2026-02-19 22:21:17.936056269 +0000 UTC m=+3178.207498749" watchObservedRunningTime="2026-02-19 22:21:17.93835161 +0000 UTC m=+3178.209794110" Feb 19 22:21:17 crc kubenswrapper[4771]: I0219 22:21:17.977140 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:19 crc kubenswrapper[4771]: I0219 22:21:19.727296 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scw4n"] Feb 19 22:21:19 crc kubenswrapper[4771]: I0219 22:21:19.918371 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-scw4n" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerName="registry-server" containerID="cri-o://5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282" gracePeriod=2 Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.899860 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.933913 4771 generic.go:334] "Generic (PLEG): container finished" podID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerID="5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282" exitCode=0 Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.933987 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scw4n" event={"ID":"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0","Type":"ContainerDied","Data":"5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282"} Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.934220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scw4n" event={"ID":"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0","Type":"ContainerDied","Data":"2bdd479039ec5cf487ebb980e5c8a0bb4f4275101aad8ee7eeabd283de8c5024"} Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.934012 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scw4n" Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.934242 4771 scope.go:117] "RemoveContainer" containerID="5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282" Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.964813 4771 scope.go:117] "RemoveContainer" containerID="4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1" Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.991938 4771 scope.go:117] "RemoveContainer" containerID="09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8" Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.996386 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-utilities\") pod \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.996674 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-catalog-content\") pod \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.996732 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsdr6\" (UniqueName: \"kubernetes.io/projected/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-kube-api-access-qsdr6\") pod \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\" (UID: \"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0\") " Feb 19 22:21:20 crc kubenswrapper[4771]: I0219 22:21:20.997799 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-utilities" (OuterVolumeSpecName: "utilities") pod "c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" (UID: 
"c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.004960 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-kube-api-access-qsdr6" (OuterVolumeSpecName: "kube-api-access-qsdr6") pod "c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" (UID: "c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0"). InnerVolumeSpecName "kube-api-access-qsdr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.060383 4771 scope.go:117] "RemoveContainer" containerID="5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282" Feb 19 22:21:21 crc kubenswrapper[4771]: E0219 22:21:21.061552 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282\": container with ID starting with 5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282 not found: ID does not exist" containerID="5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.061617 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282"} err="failed to get container status \"5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282\": rpc error: code = NotFound desc = could not find container \"5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282\": container with ID starting with 5fae3db7ff7b246856d0d9569142ec9b1ec3e5e923081a20c36028e7a248f282 not found: ID does not exist" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.061665 4771 scope.go:117] "RemoveContainer" 
containerID="4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1" Feb 19 22:21:21 crc kubenswrapper[4771]: E0219 22:21:21.062245 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1\": container with ID starting with 4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1 not found: ID does not exist" containerID="4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.062298 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1"} err="failed to get container status \"4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1\": rpc error: code = NotFound desc = could not find container \"4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1\": container with ID starting with 4a86451e69fb9aae6caadeb94f84b47053a89ce377fe45325a28198c0c1720b1 not found: ID does not exist" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.062332 4771 scope.go:117] "RemoveContainer" containerID="09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8" Feb 19 22:21:21 crc kubenswrapper[4771]: E0219 22:21:21.062850 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8\": container with ID starting with 09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8 not found: ID does not exist" containerID="09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.062902 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8"} err="failed to get container status \"09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8\": rpc error: code = NotFound desc = could not find container \"09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8\": container with ID starting with 09c044f4b6a0a082bc8c88e7f2e0557db4937c0e3026d622513e7ee08f60cbb8 not found: ID does not exist" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.071140 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" (UID: "c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.098898 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.098931 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsdr6\" (UniqueName: \"kubernetes.io/projected/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-kube-api-access-qsdr6\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.098944 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.278718 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scw4n"] Feb 19 22:21:21 crc kubenswrapper[4771]: I0219 22:21:21.284558 4771 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/certified-operators-scw4n"] Feb 19 22:21:22 crc kubenswrapper[4771]: I0219 22:21:22.437763 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:21:22 crc kubenswrapper[4771]: E0219 22:21:22.438499 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:21:22 crc kubenswrapper[4771]: I0219 22:21:22.452348 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" path="/var/lib/kubelet/pods/c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0/volumes" Feb 19 22:21:23 crc kubenswrapper[4771]: I0219 22:21:23.672421 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:23 crc kubenswrapper[4771]: I0219 22:21:23.672789 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:23 crc kubenswrapper[4771]: I0219 22:21:23.753985 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:24 crc kubenswrapper[4771]: I0219 22:21:24.034441 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:24 crc kubenswrapper[4771]: I0219 22:21:24.727985 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw8kv"] Feb 19 22:21:25 crc kubenswrapper[4771]: I0219 22:21:25.976245 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kw8kv" podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerName="registry-server" containerID="cri-o://a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7" gracePeriod=2 Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.424091 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.581446 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-catalog-content\") pod \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.581525 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-utilities\") pod \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.581731 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7h4j\" (UniqueName: \"kubernetes.io/projected/18a212d1-6c73-4fe5-8cf2-e486049c6b86-kube-api-access-l7h4j\") pod \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\" (UID: \"18a212d1-6c73-4fe5-8cf2-e486049c6b86\") " Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.582966 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-utilities" (OuterVolumeSpecName: "utilities") pod "18a212d1-6c73-4fe5-8cf2-e486049c6b86" (UID: "18a212d1-6c73-4fe5-8cf2-e486049c6b86"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.584327 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.588962 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a212d1-6c73-4fe5-8cf2-e486049c6b86-kube-api-access-l7h4j" (OuterVolumeSpecName: "kube-api-access-l7h4j") pod "18a212d1-6c73-4fe5-8cf2-e486049c6b86" (UID: "18a212d1-6c73-4fe5-8cf2-e486049c6b86"). InnerVolumeSpecName "kube-api-access-l7h4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.626724 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18a212d1-6c73-4fe5-8cf2-e486049c6b86" (UID: "18a212d1-6c73-4fe5-8cf2-e486049c6b86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.686196 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18a212d1-6c73-4fe5-8cf2-e486049c6b86-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.686239 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7h4j\" (UniqueName: \"kubernetes.io/projected/18a212d1-6c73-4fe5-8cf2-e486049c6b86-kube-api-access-l7h4j\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.985504 4771 generic.go:334] "Generic (PLEG): container finished" podID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerID="a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7" exitCode=0 Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.985585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw8kv" event={"ID":"18a212d1-6c73-4fe5-8cf2-e486049c6b86","Type":"ContainerDied","Data":"a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7"} Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.985620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw8kv" event={"ID":"18a212d1-6c73-4fe5-8cf2-e486049c6b86","Type":"ContainerDied","Data":"66cc1c2ce3ae7964ba6d925a0de5d6ddc3edbb367a2895eeb2da6d5e8d5b83c7"} Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.985677 4771 scope.go:117] "RemoveContainer" containerID="a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7" Feb 19 22:21:26 crc kubenswrapper[4771]: I0219 22:21:26.985882 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kw8kv" Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.011630 4771 scope.go:117] "RemoveContainer" containerID="755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299" Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.025858 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw8kv"] Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.030600 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw8kv"] Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.057296 4771 scope.go:117] "RemoveContainer" containerID="2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7" Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.082892 4771 scope.go:117] "RemoveContainer" containerID="a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7" Feb 19 22:21:27 crc kubenswrapper[4771]: E0219 22:21:27.083553 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7\": container with ID starting with a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7 not found: ID does not exist" containerID="a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7" Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.083623 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7"} err="failed to get container status \"a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7\": rpc error: code = NotFound desc = could not find container \"a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7\": container with ID starting with a7ab88a32c30d56981c1d9ba826af8329adacc9cee4bb377ad4ab0cafb5137b7 not found: 
ID does not exist" Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.083666 4771 scope.go:117] "RemoveContainer" containerID="755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299" Feb 19 22:21:27 crc kubenswrapper[4771]: E0219 22:21:27.084086 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299\": container with ID starting with 755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299 not found: ID does not exist" containerID="755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299" Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.084135 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299"} err="failed to get container status \"755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299\": rpc error: code = NotFound desc = could not find container \"755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299\": container with ID starting with 755c9c71a7d83fc661ef14b3ee867cf43d2e6b43c12558335d5457233a3eb299 not found: ID does not exist" Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.084198 4771 scope.go:117] "RemoveContainer" containerID="2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7" Feb 19 22:21:27 crc kubenswrapper[4771]: E0219 22:21:27.084755 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7\": container with ID starting with 2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7 not found: ID does not exist" containerID="2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7" Feb 19 22:21:27 crc kubenswrapper[4771]: I0219 22:21:27.084802 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7"} err="failed to get container status \"2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7\": rpc error: code = NotFound desc = could not find container \"2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7\": container with ID starting with 2ba63e6f2b20da5e69d18c0b4e032abd383dc45217dfdd1de518679002f16fb7 not found: ID does not exist" Feb 19 22:21:28 crc kubenswrapper[4771]: I0219 22:21:28.451760 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" path="/var/lib/kubelet/pods/18a212d1-6c73-4fe5-8cf2-e486049c6b86/volumes" Feb 19 22:21:37 crc kubenswrapper[4771]: I0219 22:21:37.436883 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:21:37 crc kubenswrapper[4771]: E0219 22:21:37.437712 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:21:48 crc kubenswrapper[4771]: I0219 22:21:48.437496 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:21:48 crc kubenswrapper[4771]: E0219 22:21:48.438641 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:22:00 crc kubenswrapper[4771]: I0219 22:22:00.444647 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:22:00 crc kubenswrapper[4771]: E0219 22:22:00.445606 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:22:15 crc kubenswrapper[4771]: I0219 22:22:15.437244 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:22:15 crc kubenswrapper[4771]: E0219 22:22:15.438048 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:22:28 crc kubenswrapper[4771]: I0219 22:22:28.437939 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:22:28 crc kubenswrapper[4771]: E0219 22:22:28.438908 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:22:39 crc kubenswrapper[4771]: I0219 22:22:39.437114 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:22:39 crc kubenswrapper[4771]: E0219 22:22:39.437952 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:22:53 crc kubenswrapper[4771]: I0219 22:22:53.437501 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:22:53 crc kubenswrapper[4771]: E0219 22:22:53.438460 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:23:08 crc kubenswrapper[4771]: I0219 22:23:08.438096 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:23:08 crc kubenswrapper[4771]: E0219 22:23:08.438998 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:23:19 crc kubenswrapper[4771]: I0219 22:23:19.437878 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:23:19 crc kubenswrapper[4771]: E0219 22:23:19.438849 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:23:31 crc kubenswrapper[4771]: I0219 22:23:31.437147 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:23:31 crc kubenswrapper[4771]: E0219 22:23:31.438009 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:23:42 crc kubenswrapper[4771]: I0219 22:23:42.437761 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:23:42 crc kubenswrapper[4771]: E0219 22:23:42.438816 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:23:54 crc kubenswrapper[4771]: I0219 22:23:54.438227 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:23:54 crc kubenswrapper[4771]: E0219 22:23:54.439674 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:24:05 crc kubenswrapper[4771]: I0219 22:24:05.438262 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:24:05 crc kubenswrapper[4771]: E0219 22:24:05.438986 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:24:18 crc kubenswrapper[4771]: I0219 22:24:18.437930 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:24:18 crc kubenswrapper[4771]: E0219 22:24:18.439128 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.475462 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xbg4s"] Feb 19 22:24:29 crc kubenswrapper[4771]: E0219 22:24:29.478212 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerName="registry-server" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.478242 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerName="registry-server" Feb 19 22:24:29 crc kubenswrapper[4771]: E0219 22:24:29.478265 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerName="extract-utilities" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.478278 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerName="extract-utilities" Feb 19 22:24:29 crc kubenswrapper[4771]: E0219 22:24:29.478316 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerName="extract-content" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.478332 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerName="extract-content" Feb 19 22:24:29 crc kubenswrapper[4771]: E0219 22:24:29.478364 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerName="extract-content" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.478380 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerName="extract-content" Feb 19 22:24:29 crc kubenswrapper[4771]: E0219 22:24:29.478402 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerName="extract-utilities" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.478416 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerName="extract-utilities" Feb 19 22:24:29 crc kubenswrapper[4771]: E0219 22:24:29.478439 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerName="registry-server" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.478452 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerName="registry-server" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.478729 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a212d1-6c73-4fe5-8cf2-e486049c6b86" containerName="registry-server" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.478758 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8eb1c25-f55a-491b-a9dd-c5e9aae8d1d0" containerName="registry-server" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.480830 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.550868 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xbg4s"] Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.574930 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-catalog-content\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.575003 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5jpb\" (UniqueName: \"kubernetes.io/projected/339b7efa-3646-4a81-ac0e-cd40e7298536-kube-api-access-s5jpb\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.575059 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-utilities\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.676173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-catalog-content\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.676226 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s5jpb\" (UniqueName: \"kubernetes.io/projected/339b7efa-3646-4a81-ac0e-cd40e7298536-kube-api-access-s5jpb\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.676256 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-utilities\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.676743 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-catalog-content\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.676808 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-utilities\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.713885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5jpb\" (UniqueName: \"kubernetes.io/projected/339b7efa-3646-4a81-ac0e-cd40e7298536-kube-api-access-s5jpb\") pod \"community-operators-xbg4s\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:29 crc kubenswrapper[4771]: I0219 22:24:29.847364 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:30 crc kubenswrapper[4771]: I0219 22:24:30.159380 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xbg4s"] Feb 19 22:24:30 crc kubenswrapper[4771]: I0219 22:24:30.691877 4771 generic.go:334] "Generic (PLEG): container finished" podID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerID="2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc" exitCode=0 Feb 19 22:24:30 crc kubenswrapper[4771]: I0219 22:24:30.691968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbg4s" event={"ID":"339b7efa-3646-4a81-ac0e-cd40e7298536","Type":"ContainerDied","Data":"2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc"} Feb 19 22:24:30 crc kubenswrapper[4771]: I0219 22:24:30.692077 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbg4s" event={"ID":"339b7efa-3646-4a81-ac0e-cd40e7298536","Type":"ContainerStarted","Data":"b86500552e182887aa4b3749f997f4c62d15ba5310a3eae37f9d775732c26174"} Feb 19 22:24:31 crc kubenswrapper[4771]: I0219 22:24:31.704850 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbg4s" event={"ID":"339b7efa-3646-4a81-ac0e-cd40e7298536","Type":"ContainerStarted","Data":"5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465"} Feb 19 22:24:32 crc kubenswrapper[4771]: I0219 22:24:32.437572 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:24:32 crc kubenswrapper[4771]: E0219 22:24:32.437865 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:24:32 crc kubenswrapper[4771]: I0219 22:24:32.716804 4771 generic.go:334] "Generic (PLEG): container finished" podID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerID="5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465" exitCode=0 Feb 19 22:24:32 crc kubenswrapper[4771]: I0219 22:24:32.716846 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbg4s" event={"ID":"339b7efa-3646-4a81-ac0e-cd40e7298536","Type":"ContainerDied","Data":"5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465"} Feb 19 22:24:33 crc kubenswrapper[4771]: I0219 22:24:33.730264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbg4s" event={"ID":"339b7efa-3646-4a81-ac0e-cd40e7298536","Type":"ContainerStarted","Data":"6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44"} Feb 19 22:24:33 crc kubenswrapper[4771]: I0219 22:24:33.762276 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xbg4s" podStartSLOduration=2.315080052 podStartE2EDuration="4.762260092s" podCreationTimestamp="2026-02-19 22:24:29 +0000 UTC" firstStartedPulling="2026-02-19 22:24:30.693751487 +0000 UTC m=+3370.965193997" lastFinishedPulling="2026-02-19 22:24:33.140931527 +0000 UTC m=+3373.412374037" observedRunningTime="2026-02-19 22:24:33.758622725 +0000 UTC m=+3374.030065265" watchObservedRunningTime="2026-02-19 22:24:33.762260092 +0000 UTC m=+3374.033702562" Feb 19 22:24:39 crc kubenswrapper[4771]: I0219 22:24:39.848164 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:39 crc kubenswrapper[4771]: I0219 
22:24:39.849073 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:39 crc kubenswrapper[4771]: I0219 22:24:39.923253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:40 crc kubenswrapper[4771]: I0219 22:24:40.867206 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:40 crc kubenswrapper[4771]: I0219 22:24:40.928110 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xbg4s"] Feb 19 22:24:42 crc kubenswrapper[4771]: I0219 22:24:42.816924 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xbg4s" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerName="registry-server" containerID="cri-o://6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44" gracePeriod=2 Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.272129 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.340629 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-catalog-content\") pod \"339b7efa-3646-4a81-ac0e-cd40e7298536\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.340774 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5jpb\" (UniqueName: \"kubernetes.io/projected/339b7efa-3646-4a81-ac0e-cd40e7298536-kube-api-access-s5jpb\") pod \"339b7efa-3646-4a81-ac0e-cd40e7298536\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.340871 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-utilities\") pod \"339b7efa-3646-4a81-ac0e-cd40e7298536\" (UID: \"339b7efa-3646-4a81-ac0e-cd40e7298536\") " Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.350983 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-utilities" (OuterVolumeSpecName: "utilities") pod "339b7efa-3646-4a81-ac0e-cd40e7298536" (UID: "339b7efa-3646-4a81-ac0e-cd40e7298536"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.351215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339b7efa-3646-4a81-ac0e-cd40e7298536-kube-api-access-s5jpb" (OuterVolumeSpecName: "kube-api-access-s5jpb") pod "339b7efa-3646-4a81-ac0e-cd40e7298536" (UID: "339b7efa-3646-4a81-ac0e-cd40e7298536"). InnerVolumeSpecName "kube-api-access-s5jpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.415082 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "339b7efa-3646-4a81-ac0e-cd40e7298536" (UID: "339b7efa-3646-4a81-ac0e-cd40e7298536"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.443688 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.443734 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5jpb\" (UniqueName: \"kubernetes.io/projected/339b7efa-3646-4a81-ac0e-cd40e7298536-kube-api-access-s5jpb\") on node \"crc\" DevicePath \"\"" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.443755 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/339b7efa-3646-4a81-ac0e-cd40e7298536-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.831820 4771 generic.go:334] "Generic (PLEG): container finished" podID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerID="6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44" exitCode=0 Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.831896 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbg4s" event={"ID":"339b7efa-3646-4a81-ac0e-cd40e7298536","Type":"ContainerDied","Data":"6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44"} Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.831908 4771 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-xbg4s" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.831937 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbg4s" event={"ID":"339b7efa-3646-4a81-ac0e-cd40e7298536","Type":"ContainerDied","Data":"b86500552e182887aa4b3749f997f4c62d15ba5310a3eae37f9d775732c26174"} Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.831976 4771 scope.go:117] "RemoveContainer" containerID="6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.861079 4771 scope.go:117] "RemoveContainer" containerID="5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.888409 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xbg4s"] Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.900356 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xbg4s"] Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.909149 4771 scope.go:117] "RemoveContainer" containerID="2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.937314 4771 scope.go:117] "RemoveContainer" containerID="6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44" Feb 19 22:24:43 crc kubenswrapper[4771]: E0219 22:24:43.938497 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44\": container with ID starting with 6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44 not found: ID does not exist" containerID="6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.938595 
4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44"} err="failed to get container status \"6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44\": rpc error: code = NotFound desc = could not find container \"6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44\": container with ID starting with 6e518e6f2de08d5e46476a944af46b5d183abdd6a009c25f99a5f964810c9e44 not found: ID does not exist" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.938644 4771 scope.go:117] "RemoveContainer" containerID="5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465" Feb 19 22:24:43 crc kubenswrapper[4771]: E0219 22:24:43.939484 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465\": container with ID starting with 5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465 not found: ID does not exist" containerID="5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.939623 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465"} err="failed to get container status \"5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465\": rpc error: code = NotFound desc = could not find container \"5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465\": container with ID starting with 5e7e464553cc576695694c8b075d8523a1f58397a30b072db01ddc8b440f2465 not found: ID does not exist" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.939776 4771 scope.go:117] "RemoveContainer" containerID="2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc" Feb 19 22:24:43 crc kubenswrapper[4771]: E0219 
22:24:43.940589 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc\": container with ID starting with 2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc not found: ID does not exist" containerID="2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc" Feb 19 22:24:43 crc kubenswrapper[4771]: I0219 22:24:43.940707 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc"} err="failed to get container status \"2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc\": rpc error: code = NotFound desc = could not find container \"2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc\": container with ID starting with 2da32abc7fb6eb5d1d01c5a91ab732ae50808e52ceb559df792b0045eaee1bcc not found: ID does not exist" Feb 19 22:24:44 crc kubenswrapper[4771]: I0219 22:24:44.454519 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" path="/var/lib/kubelet/pods/339b7efa-3646-4a81-ac0e-cd40e7298536/volumes" Feb 19 22:24:46 crc kubenswrapper[4771]: I0219 22:24:46.437610 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:24:46 crc kubenswrapper[4771]: E0219 22:24:46.438514 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:25:00 crc kubenswrapper[4771]: I0219 22:25:00.445129 
4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:25:00 crc kubenswrapper[4771]: E0219 22:25:00.445927 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:25:12 crc kubenswrapper[4771]: I0219 22:25:12.438420 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:25:12 crc kubenswrapper[4771]: E0219 22:25:12.439500 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:25:24 crc kubenswrapper[4771]: I0219 22:25:24.452571 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:25:24 crc kubenswrapper[4771]: E0219 22:25:24.453712 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:25:39 crc kubenswrapper[4771]: I0219 
22:25:39.437373 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:25:39 crc kubenswrapper[4771]: E0219 22:25:39.438439 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:25:52 crc kubenswrapper[4771]: I0219 22:25:52.437151 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690" Feb 19 22:25:53 crc kubenswrapper[4771]: I0219 22:25:53.508308 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"43b05ff0a0829c15247628ed1ab5831518c6a36b49fe93c53bcb1679a87ee6dc"} Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.102500 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xrrjl"] Feb 19 22:28:03 crc kubenswrapper[4771]: E0219 22:28:03.105525 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerName="extract-content" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.105550 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerName="extract-content" Feb 19 22:28:03 crc kubenswrapper[4771]: E0219 22:28:03.105580 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerName="extract-utilities" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.105592 4771 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerName="extract-utilities" Feb 19 22:28:03 crc kubenswrapper[4771]: E0219 22:28:03.105614 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerName="registry-server" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.105626 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerName="registry-server" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.105921 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="339b7efa-3646-4a81-ac0e-cd40e7298536" containerName="registry-server" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.107651 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.111328 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrrjl"] Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.189618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-catalog-content\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.189746 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-utilities\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.189767 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9gss\" (UniqueName: \"kubernetes.io/projected/640c617b-80be-4f7f-a790-b7ad689586fd-kube-api-access-w9gss\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.290710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-utilities\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.290756 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9gss\" (UniqueName: \"kubernetes.io/projected/640c617b-80be-4f7f-a790-b7ad689586fd-kube-api-access-w9gss\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.290787 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-catalog-content\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.291230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-utilities\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.291341 4771 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-catalog-content\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.312214 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9gss\" (UniqueName: \"kubernetes.io/projected/640c617b-80be-4f7f-a790-b7ad689586fd-kube-api-access-w9gss\") pod \"redhat-operators-xrrjl\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.442442 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.663584 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrrjl"] Feb 19 22:28:03 crc kubenswrapper[4771]: I0219 22:28:03.740546 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrrjl" event={"ID":"640c617b-80be-4f7f-a790-b7ad689586fd","Type":"ContainerStarted","Data":"ab63feb58d7ee02975a92f82aed1c31d5b3261afc5f32d1a952ae4a4a49bc15a"} Feb 19 22:28:04 crc kubenswrapper[4771]: I0219 22:28:04.748200 4771 generic.go:334] "Generic (PLEG): container finished" podID="640c617b-80be-4f7f-a790-b7ad689586fd" containerID="79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a" exitCode=0 Feb 19 22:28:04 crc kubenswrapper[4771]: I0219 22:28:04.748249 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrrjl" event={"ID":"640c617b-80be-4f7f-a790-b7ad689586fd","Type":"ContainerDied","Data":"79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a"} Feb 19 22:28:04 crc kubenswrapper[4771]: I0219 22:28:04.750136 4771 provider.go:102] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:28:05 crc kubenswrapper[4771]: I0219 22:28:05.757861 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrrjl" event={"ID":"640c617b-80be-4f7f-a790-b7ad689586fd","Type":"ContainerStarted","Data":"24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb"} Feb 19 22:28:06 crc kubenswrapper[4771]: I0219 22:28:06.771570 4771 generic.go:334] "Generic (PLEG): container finished" podID="640c617b-80be-4f7f-a790-b7ad689586fd" containerID="24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb" exitCode=0 Feb 19 22:28:06 crc kubenswrapper[4771]: I0219 22:28:06.771663 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrrjl" event={"ID":"640c617b-80be-4f7f-a790-b7ad689586fd","Type":"ContainerDied","Data":"24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb"} Feb 19 22:28:07 crc kubenswrapper[4771]: I0219 22:28:07.780578 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrrjl" event={"ID":"640c617b-80be-4f7f-a790-b7ad689586fd","Type":"ContainerStarted","Data":"be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375"} Feb 19 22:28:07 crc kubenswrapper[4771]: I0219 22:28:07.802465 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xrrjl" podStartSLOduration=2.371210267 podStartE2EDuration="4.802436279s" podCreationTimestamp="2026-02-19 22:28:03 +0000 UTC" firstStartedPulling="2026-02-19 22:28:04.749766054 +0000 UTC m=+3585.021208534" lastFinishedPulling="2026-02-19 22:28:07.180992036 +0000 UTC m=+3587.452434546" observedRunningTime="2026-02-19 22:28:07.80208508 +0000 UTC m=+3588.073527570" watchObservedRunningTime="2026-02-19 22:28:07.802436279 +0000 UTC m=+3588.073878779" Feb 19 22:28:12 crc kubenswrapper[4771]: I0219 22:28:12.957185 4771 
patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:28:12 crc kubenswrapper[4771]: I0219 22:28:12.957942 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:28:13 crc kubenswrapper[4771]: I0219 22:28:13.444428 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:13 crc kubenswrapper[4771]: I0219 22:28:13.444488 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:14 crc kubenswrapper[4771]: I0219 22:28:14.504458 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrrjl" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="registry-server" probeResult="failure" output=< Feb 19 22:28:14 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 22:28:14 crc kubenswrapper[4771]: > Feb 19 22:28:23 crc kubenswrapper[4771]: I0219 22:28:23.522774 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:23 crc kubenswrapper[4771]: I0219 22:28:23.612639 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:23 crc kubenswrapper[4771]: I0219 22:28:23.770894 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-xrrjl"] Feb 19 22:28:24 crc kubenswrapper[4771]: I0219 22:28:24.929447 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xrrjl" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="registry-server" containerID="cri-o://be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375" gracePeriod=2 Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.512514 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrrjl" Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.610391 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-utilities\") pod \"640c617b-80be-4f7f-a790-b7ad689586fd\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.610540 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-catalog-content\") pod \"640c617b-80be-4f7f-a790-b7ad689586fd\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.610616 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9gss\" (UniqueName: \"kubernetes.io/projected/640c617b-80be-4f7f-a790-b7ad689586fd-kube-api-access-w9gss\") pod \"640c617b-80be-4f7f-a790-b7ad689586fd\" (UID: \"640c617b-80be-4f7f-a790-b7ad689586fd\") " Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.612781 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-utilities" (OuterVolumeSpecName: "utilities") pod "640c617b-80be-4f7f-a790-b7ad689586fd" (UID: 
"640c617b-80be-4f7f-a790-b7ad689586fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.624148 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640c617b-80be-4f7f-a790-b7ad689586fd-kube-api-access-w9gss" (OuterVolumeSpecName: "kube-api-access-w9gss") pod "640c617b-80be-4f7f-a790-b7ad689586fd" (UID: "640c617b-80be-4f7f-a790-b7ad689586fd"). InnerVolumeSpecName "kube-api-access-w9gss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.712311 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9gss\" (UniqueName: \"kubernetes.io/projected/640c617b-80be-4f7f-a790-b7ad689586fd-kube-api-access-w9gss\") on node \"crc\" DevicePath \"\"" Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.712349 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.774433 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "640c617b-80be-4f7f-a790-b7ad689586fd" (UID: "640c617b-80be-4f7f-a790-b7ad689586fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.814549 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/640c617b-80be-4f7f-a790-b7ad689586fd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.944930 4771 generic.go:334] "Generic (PLEG): container finished" podID="640c617b-80be-4f7f-a790-b7ad689586fd" containerID="be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375" exitCode=0 Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.945009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrrjl" event={"ID":"640c617b-80be-4f7f-a790-b7ad689586fd","Type":"ContainerDied","Data":"be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375"} Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.945097 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrrjl" event={"ID":"640c617b-80be-4f7f-a790-b7ad689586fd","Type":"ContainerDied","Data":"ab63feb58d7ee02975a92f82aed1c31d5b3261afc5f32d1a952ae4a4a49bc15a"} Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.945134 4771 scope.go:117] "RemoveContainer" containerID="be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375" Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.945463 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrrjl"
Feb 19 22:28:25 crc kubenswrapper[4771]: I0219 22:28:25.980485 4771 scope.go:117] "RemoveContainer" containerID="24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb"
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.008396 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrrjl"]
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.019779 4771 scope.go:117] "RemoveContainer" containerID="79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a"
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.027046 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xrrjl"]
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.053890 4771 scope.go:117] "RemoveContainer" containerID="be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375"
Feb 19 22:28:26 crc kubenswrapper[4771]: E0219 22:28:26.056618 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375\": container with ID starting with be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375 not found: ID does not exist" containerID="be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375"
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.056658 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375"} err="failed to get container status \"be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375\": rpc error: code = NotFound desc = could not find container \"be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375\": container with ID starting with be06469d9686a423288a97a3f33426eef3ee754b3d486eb0afa27c2a1b488375 not found: ID does not exist"
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.056688 4771 scope.go:117] "RemoveContainer" containerID="24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb"
Feb 19 22:28:26 crc kubenswrapper[4771]: E0219 22:28:26.057373 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb\": container with ID starting with 24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb not found: ID does not exist" containerID="24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb"
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.057426 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb"} err="failed to get container status \"24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb\": rpc error: code = NotFound desc = could not find container \"24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb\": container with ID starting with 24e6ef9d376cc8e0204a293cee7b85004e5d139239c695df6be0cab61bd54dfb not found: ID does not exist"
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.057465 4771 scope.go:117] "RemoveContainer" containerID="79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a"
Feb 19 22:28:26 crc kubenswrapper[4771]: E0219 22:28:26.057977 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a\": container with ID starting with 79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a not found: ID does not exist" containerID="79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a"
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.058004 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a"} err="failed to get container status \"79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a\": rpc error: code = NotFound desc = could not find container \"79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a\": container with ID starting with 79a44446902cd1e98249ac3722892bfd82d3fde1099f5d5770e1dec7cf87369a not found: ID does not exist"
Feb 19 22:28:26 crc kubenswrapper[4771]: I0219 22:28:26.449894 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" path="/var/lib/kubelet/pods/640c617b-80be-4f7f-a790-b7ad689586fd/volumes"
Feb 19 22:28:42 crc kubenswrapper[4771]: I0219 22:28:42.956927 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:28:42 crc kubenswrapper[4771]: I0219 22:28:42.957744 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:29:12 crc kubenswrapper[4771]: I0219 22:29:12.956841 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:29:12 crc kubenswrapper[4771]: I0219 22:29:12.957563 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:29:12 crc kubenswrapper[4771]: I0219 22:29:12.957636 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb"
Feb 19 22:29:12 crc kubenswrapper[4771]: I0219 22:29:12.958548 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43b05ff0a0829c15247628ed1ab5831518c6a36b49fe93c53bcb1679a87ee6dc"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 22:29:12 crc kubenswrapper[4771]: I0219 22:29:12.958636 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://43b05ff0a0829c15247628ed1ab5831518c6a36b49fe93c53bcb1679a87ee6dc" gracePeriod=600
Feb 19 22:29:13 crc kubenswrapper[4771]: I0219 22:29:13.427494 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="43b05ff0a0829c15247628ed1ab5831518c6a36b49fe93c53bcb1679a87ee6dc" exitCode=0
Feb 19 22:29:13 crc kubenswrapper[4771]: I0219 22:29:13.427544 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"43b05ff0a0829c15247628ed1ab5831518c6a36b49fe93c53bcb1679a87ee6dc"}
Feb 19 22:29:13 crc kubenswrapper[4771]: I0219 22:29:13.428071 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806"}
Feb 19 22:29:13 crc kubenswrapper[4771]: I0219 22:29:13.428121 4771 scope.go:117] "RemoveContainer" containerID="02ed801c7304f2b838f008fe6ba6523dbe317df2ec9ab7b570bf614cf0d3f690"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.194333 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"]
Feb 19 22:30:00 crc kubenswrapper[4771]: E0219 22:30:00.195340 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="registry-server"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.195361 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="registry-server"
Feb 19 22:30:00 crc kubenswrapper[4771]: E0219 22:30:00.195392 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="extract-content"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.195403 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="extract-content"
Feb 19 22:30:00 crc kubenswrapper[4771]: E0219 22:30:00.195449 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="extract-utilities"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.195461 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="extract-utilities"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.195683 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="640c617b-80be-4f7f-a790-b7ad689586fd" containerName="registry-server"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.196404 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.199492 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.199769 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.208134 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"]
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.309261 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/feca65d4-e1dc-4ff9-9f72-870368c444da-config-volume\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.309371 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/feca65d4-e1dc-4ff9-9f72-870368c444da-secret-volume\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.309457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtx44\" (UniqueName: \"kubernetes.io/projected/feca65d4-e1dc-4ff9-9f72-870368c444da-kube-api-access-vtx44\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.411368 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/feca65d4-e1dc-4ff9-9f72-870368c444da-config-volume\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.411432 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/feca65d4-e1dc-4ff9-9f72-870368c444da-secret-volume\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.411493 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtx44\" (UniqueName: \"kubernetes.io/projected/feca65d4-e1dc-4ff9-9f72-870368c444da-kube-api-access-vtx44\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.413107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/feca65d4-e1dc-4ff9-9f72-870368c444da-config-volume\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.421740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/feca65d4-e1dc-4ff9-9f72-870368c444da-secret-volume\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.440576 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtx44\" (UniqueName: \"kubernetes.io/projected/feca65d4-e1dc-4ff9-9f72-870368c444da-kube-api-access-vtx44\") pod \"collect-profiles-29525670-628zc\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.523349 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.822263 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"]
Feb 19 22:30:00 crc kubenswrapper[4771]: I0219 22:30:00.858140 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc" event={"ID":"feca65d4-e1dc-4ff9-9f72-870368c444da","Type":"ContainerStarted","Data":"0b7116a9bd3feeb61ccf44b793e88750bbdfe642534c373ca16363e4e061042c"}
Feb 19 22:30:01 crc kubenswrapper[4771]: I0219 22:30:01.878369 4771 generic.go:334] "Generic (PLEG): container finished" podID="feca65d4-e1dc-4ff9-9f72-870368c444da" containerID="520046f48a74454766559a14f452ac54dba46f8c86207b4f39f6045aa65712f4" exitCode=0
Feb 19 22:30:01 crc kubenswrapper[4771]: I0219 22:30:01.878539 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc" event={"ID":"feca65d4-e1dc-4ff9-9f72-870368c444da","Type":"ContainerDied","Data":"520046f48a74454766559a14f452ac54dba46f8c86207b4f39f6045aa65712f4"}
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.265439 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.398976 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/feca65d4-e1dc-4ff9-9f72-870368c444da-secret-volume\") pod \"feca65d4-e1dc-4ff9-9f72-870368c444da\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") "
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.399229 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtx44\" (UniqueName: \"kubernetes.io/projected/feca65d4-e1dc-4ff9-9f72-870368c444da-kube-api-access-vtx44\") pod \"feca65d4-e1dc-4ff9-9f72-870368c444da\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") "
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.399297 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/feca65d4-e1dc-4ff9-9f72-870368c444da-config-volume\") pod \"feca65d4-e1dc-4ff9-9f72-870368c444da\" (UID: \"feca65d4-e1dc-4ff9-9f72-870368c444da\") "
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.400686 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feca65d4-e1dc-4ff9-9f72-870368c444da-config-volume" (OuterVolumeSpecName: "config-volume") pod "feca65d4-e1dc-4ff9-9f72-870368c444da" (UID: "feca65d4-e1dc-4ff9-9f72-870368c444da"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.405298 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feca65d4-e1dc-4ff9-9f72-870368c444da-kube-api-access-vtx44" (OuterVolumeSpecName: "kube-api-access-vtx44") pod "feca65d4-e1dc-4ff9-9f72-870368c444da" (UID: "feca65d4-e1dc-4ff9-9f72-870368c444da"). InnerVolumeSpecName "kube-api-access-vtx44". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.410319 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feca65d4-e1dc-4ff9-9f72-870368c444da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "feca65d4-e1dc-4ff9-9f72-870368c444da" (UID: "feca65d4-e1dc-4ff9-9f72-870368c444da"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.501759 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/feca65d4-e1dc-4ff9-9f72-870368c444da-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.501813 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtx44\" (UniqueName: \"kubernetes.io/projected/feca65d4-e1dc-4ff9-9f72-870368c444da-kube-api-access-vtx44\") on node \"crc\" DevicePath \"\""
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.501838 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/feca65d4-e1dc-4ff9-9f72-870368c444da-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.902453 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc" event={"ID":"feca65d4-e1dc-4ff9-9f72-870368c444da","Type":"ContainerDied","Data":"0b7116a9bd3feeb61ccf44b793e88750bbdfe642534c373ca16363e4e061042c"}
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.902509 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b7116a9bd3feeb61ccf44b793e88750bbdfe642534c373ca16363e4e061042c"
Feb 19 22:30:03 crc kubenswrapper[4771]: I0219 22:30:03.902516 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"
Feb 19 22:30:04 crc kubenswrapper[4771]: I0219 22:30:04.356749 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx"]
Feb 19 22:30:04 crc kubenswrapper[4771]: I0219 22:30:04.364890 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-pl4kx"]
Feb 19 22:30:04 crc kubenswrapper[4771]: I0219 22:30:04.472497 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d455c12-5554-439d-ac6e-61b738c8ea5c" path="/var/lib/kubelet/pods/7d455c12-5554-439d-ac6e-61b738c8ea5c/volumes"
Feb 19 22:30:33 crc kubenswrapper[4771]: I0219 22:30:33.662893 4771 scope.go:117] "RemoveContainer" containerID="d910dccaf47ede0010e7a4440a4c1aff300bfd58af602a4d9af3ad760c05385d"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.288746 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j9fh8"]
Feb 19 22:31:26 crc kubenswrapper[4771]: E0219 22:31:26.289396 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feca65d4-e1dc-4ff9-9f72-870368c444da" containerName="collect-profiles"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.289407 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="feca65d4-e1dc-4ff9-9f72-870368c444da" containerName="collect-profiles"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.289549 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="feca65d4-e1dc-4ff9-9f72-870368c444da" containerName="collect-profiles"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.290422 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.360440 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9fh8"]
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.402267 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r27q8\" (UniqueName: \"kubernetes.io/projected/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-kube-api-access-r27q8\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.402359 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-utilities\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.402383 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-catalog-content\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.503611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-utilities\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.503660 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-catalog-content\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.503710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r27q8\" (UniqueName: \"kubernetes.io/projected/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-kube-api-access-r27q8\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.504289 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-utilities\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.504388 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-catalog-content\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.547917 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r27q8\" (UniqueName: \"kubernetes.io/projected/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-kube-api-access-r27q8\") pod \"certified-operators-j9fh8\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") " pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:26 crc kubenswrapper[4771]: I0219 22:31:26.604886 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:27 crc kubenswrapper[4771]: I0219 22:31:27.024536 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9fh8"]
Feb 19 22:31:27 crc kubenswrapper[4771]: E0219 22:31:27.359341 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792b17f7_6a6c_4c5d_a4f6_e9932a27ff81.slice/crio-conmon-8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792b17f7_6a6c_4c5d_a4f6_e9932a27ff81.slice/crio-8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 22:31:27 crc kubenswrapper[4771]: I0219 22:31:27.670002 4771 generic.go:334] "Generic (PLEG): container finished" podID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerID="8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999" exitCode=0
Feb 19 22:31:27 crc kubenswrapper[4771]: I0219 22:31:27.670063 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9fh8" event={"ID":"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81","Type":"ContainerDied","Data":"8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999"}
Feb 19 22:31:27 crc kubenswrapper[4771]: I0219 22:31:27.670089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9fh8" event={"ID":"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81","Type":"ContainerStarted","Data":"730c44141a86d29ae6787253e4412c1daa12b53679126567aeb1e3273ea5afcc"}
Feb 19 22:31:29 crc kubenswrapper[4771]: I0219 22:31:29.689848 4771 generic.go:334] "Generic (PLEG): container finished" podID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerID="9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9" exitCode=0
Feb 19 22:31:29 crc kubenswrapper[4771]: I0219 22:31:29.689967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9fh8" event={"ID":"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81","Type":"ContainerDied","Data":"9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9"}
Feb 19 22:31:31 crc kubenswrapper[4771]: I0219 22:31:31.705754 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9fh8" event={"ID":"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81","Type":"ContainerStarted","Data":"49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7"}
Feb 19 22:31:31 crc kubenswrapper[4771]: I0219 22:31:31.734051 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j9fh8" podStartSLOduration=2.256630898 podStartE2EDuration="5.734003101s" podCreationTimestamp="2026-02-19 22:31:26 +0000 UTC" firstStartedPulling="2026-02-19 22:31:27.671707445 +0000 UTC m=+3787.943149915" lastFinishedPulling="2026-02-19 22:31:31.149079608 +0000 UTC m=+3791.420522118" observedRunningTime="2026-02-19 22:31:31.727922639 +0000 UTC m=+3791.999365139" watchObservedRunningTime="2026-02-19 22:31:31.734003101 +0000 UTC m=+3792.005445591"
Feb 19 22:31:35 crc kubenswrapper[4771]: I0219 22:31:35.810099 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-schf9"]
Feb 19 22:31:35 crc kubenswrapper[4771]: I0219 22:31:35.812744 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:35 crc kubenswrapper[4771]: I0219 22:31:35.828571 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-schf9"]
Feb 19 22:31:35 crc kubenswrapper[4771]: I0219 22:31:35.945997 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtx6\" (UniqueName: \"kubernetes.io/projected/e0687eda-261d-444e-9ebf-cfa620f50ba1-kube-api-access-7mtx6\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:35 crc kubenswrapper[4771]: I0219 22:31:35.946559 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-utilities\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:35 crc kubenswrapper[4771]: I0219 22:31:35.946668 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-catalog-content\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.047462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mtx6\" (UniqueName: \"kubernetes.io/projected/e0687eda-261d-444e-9ebf-cfa620f50ba1-kube-api-access-7mtx6\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.047517 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-utilities\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.047556 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-catalog-content\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.048120 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-catalog-content\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.048403 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-utilities\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.083414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtx6\" (UniqueName: \"kubernetes.io/projected/e0687eda-261d-444e-9ebf-cfa620f50ba1-kube-api-access-7mtx6\") pod \"redhat-marketplace-schf9\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.144067 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-schf9"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.605338 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.605766 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.622245 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-schf9"]
Feb 19 22:31:36 crc kubenswrapper[4771]: W0219 22:31:36.630140 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0687eda_261d_444e_9ebf_cfa620f50ba1.slice/crio-858bd318887655cb67d57981cefe0b03468cd3369b1bde70cdfd53e25705e031 WatchSource:0}: Error finding container 858bd318887655cb67d57981cefe0b03468cd3369b1bde70cdfd53e25705e031: Status 404 returned error can't find the container with id 858bd318887655cb67d57981cefe0b03468cd3369b1bde70cdfd53e25705e031
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.659175 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.746119 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-schf9" event={"ID":"e0687eda-261d-444e-9ebf-cfa620f50ba1","Type":"ContainerStarted","Data":"858bd318887655cb67d57981cefe0b03468cd3369b1bde70cdfd53e25705e031"}
Feb 19 22:31:36 crc kubenswrapper[4771]: I0219 22:31:36.799009 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:37 crc kubenswrapper[4771]: I0219 22:31:37.754452 4771 generic.go:334] "Generic (PLEG): container finished" podID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerID="293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89" exitCode=0
Feb 19 22:31:37 crc kubenswrapper[4771]: I0219 22:31:37.756823 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-schf9" event={"ID":"e0687eda-261d-444e-9ebf-cfa620f50ba1","Type":"ContainerDied","Data":"293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89"}
Feb 19 22:31:38 crc kubenswrapper[4771]: I0219 22:31:38.773400 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-schf9" event={"ID":"e0687eda-261d-444e-9ebf-cfa620f50ba1","Type":"ContainerStarted","Data":"c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9"}
Feb 19 22:31:38 crc kubenswrapper[4771]: I0219 22:31:38.979287 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9fh8"]
Feb 19 22:31:38 crc kubenswrapper[4771]: I0219 22:31:38.979655 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j9fh8" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerName="registry-server" containerID="cri-o://49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7" gracePeriod=2
Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.467079 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9fh8"
Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.612389 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-utilities\") pod \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") "
Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.612861 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r27q8\" (UniqueName: \"kubernetes.io/projected/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-kube-api-access-r27q8\") pod \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") "
Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.612912 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-catalog-content\") pod \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\" (UID: \"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81\") "
Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.613405 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-utilities" (OuterVolumeSpecName: "utilities") pod "792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" (UID: "792b17f7-6a6c-4c5d-a4f6-e9932a27ff81"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.613611 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.617703 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-kube-api-access-r27q8" (OuterVolumeSpecName: "kube-api-access-r27q8") pod "792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" (UID: "792b17f7-6a6c-4c5d-a4f6-e9932a27ff81"). InnerVolumeSpecName "kube-api-access-r27q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.715373 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r27q8\" (UniqueName: \"kubernetes.io/projected/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-kube-api-access-r27q8\") on node \"crc\" DevicePath \"\"" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.787576 4771 generic.go:334] "Generic (PLEG): container finished" podID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerID="c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9" exitCode=0 Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.787651 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-schf9" event={"ID":"e0687eda-261d-444e-9ebf-cfa620f50ba1","Type":"ContainerDied","Data":"c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9"} Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.795412 4771 generic.go:334] "Generic (PLEG): container finished" podID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerID="49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7" exitCode=0 Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.795485 4771 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-j9fh8" event={"ID":"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81","Type":"ContainerDied","Data":"49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7"} Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.795536 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9fh8" event={"ID":"792b17f7-6a6c-4c5d-a4f6-e9932a27ff81","Type":"ContainerDied","Data":"730c44141a86d29ae6787253e4412c1daa12b53679126567aeb1e3273ea5afcc"} Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.795573 4771 scope.go:117] "RemoveContainer" containerID="49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.795781 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9fh8" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.842992 4771 scope.go:117] "RemoveContainer" containerID="9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.843273 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" (UID: "792b17f7-6a6c-4c5d-a4f6-e9932a27ff81"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.867131 4771 scope.go:117] "RemoveContainer" containerID="8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.916535 4771 scope.go:117] "RemoveContainer" containerID="49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7" Feb 19 22:31:39 crc kubenswrapper[4771]: E0219 22:31:39.917955 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7\": container with ID starting with 49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7 not found: ID does not exist" containerID="49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.918097 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7"} err="failed to get container status \"49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7\": rpc error: code = NotFound desc = could not find container \"49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7\": container with ID starting with 49d7a1a22e8825b9cf90acacaf998778f8c3877b5eb59a74556e49d142373cf7 not found: ID does not exist" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.918145 4771 scope.go:117] "RemoveContainer" containerID="9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9" Feb 19 22:31:39 crc kubenswrapper[4771]: E0219 22:31:39.918659 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9\": container with ID starting with 
9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9 not found: ID does not exist" containerID="9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.918704 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9"} err="failed to get container status \"9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9\": rpc error: code = NotFound desc = could not find container \"9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9\": container with ID starting with 9823a9f4afa19d04c695b25a602decf31b976d41e5d9c199b14945555104d8d9 not found: ID does not exist" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.918740 4771 scope.go:117] "RemoveContainer" containerID="8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999" Feb 19 22:31:39 crc kubenswrapper[4771]: E0219 22:31:39.919419 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999\": container with ID starting with 8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999 not found: ID does not exist" containerID="8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.919465 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999"} err="failed to get container status \"8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999\": rpc error: code = NotFound desc = could not find container \"8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999\": container with ID starting with 8a8c045601669eecd537686af1e7666784a5148fc97033662b90de1b20448999 not found: ID does not 
exist" Feb 19 22:31:39 crc kubenswrapper[4771]: I0219 22:31:39.925074 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:31:40 crc kubenswrapper[4771]: I0219 22:31:40.150478 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9fh8"] Feb 19 22:31:40 crc kubenswrapper[4771]: I0219 22:31:40.161627 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j9fh8"] Feb 19 22:31:40 crc kubenswrapper[4771]: I0219 22:31:40.458575 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" path="/var/lib/kubelet/pods/792b17f7-6a6c-4c5d-a4f6-e9932a27ff81/volumes" Feb 19 22:31:40 crc kubenswrapper[4771]: I0219 22:31:40.811143 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-schf9" event={"ID":"e0687eda-261d-444e-9ebf-cfa620f50ba1","Type":"ContainerStarted","Data":"719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb"} Feb 19 22:31:40 crc kubenswrapper[4771]: I0219 22:31:40.845654 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-schf9" podStartSLOduration=3.369657102 podStartE2EDuration="5.845622021s" podCreationTimestamp="2026-02-19 22:31:35 +0000 UTC" firstStartedPulling="2026-02-19 22:31:37.757342332 +0000 UTC m=+3798.028784842" lastFinishedPulling="2026-02-19 22:31:40.233307261 +0000 UTC m=+3800.504749761" observedRunningTime="2026-02-19 22:31:40.84178669 +0000 UTC m=+3801.113229200" watchObservedRunningTime="2026-02-19 22:31:40.845622021 +0000 UTC m=+3801.117064531" Feb 19 22:31:42 crc kubenswrapper[4771]: I0219 22:31:42.957134 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:31:42 crc kubenswrapper[4771]: I0219 22:31:42.957222 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:31:46 crc kubenswrapper[4771]: I0219 22:31:46.144532 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-schf9" Feb 19 22:31:46 crc kubenswrapper[4771]: I0219 22:31:46.144960 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-schf9" Feb 19 22:31:46 crc kubenswrapper[4771]: I0219 22:31:46.211486 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-schf9" Feb 19 22:31:46 crc kubenswrapper[4771]: I0219 22:31:46.938004 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-schf9" Feb 19 22:31:47 crc kubenswrapper[4771]: I0219 22:31:47.012354 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-schf9"] Feb 19 22:31:48 crc kubenswrapper[4771]: I0219 22:31:48.889422 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-schf9" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerName="registry-server" containerID="cri-o://719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb" gracePeriod=2 Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.359250 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-schf9" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.478428 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-utilities\") pod \"e0687eda-261d-444e-9ebf-cfa620f50ba1\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.478499 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mtx6\" (UniqueName: \"kubernetes.io/projected/e0687eda-261d-444e-9ebf-cfa620f50ba1-kube-api-access-7mtx6\") pod \"e0687eda-261d-444e-9ebf-cfa620f50ba1\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.478668 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-catalog-content\") pod \"e0687eda-261d-444e-9ebf-cfa620f50ba1\" (UID: \"e0687eda-261d-444e-9ebf-cfa620f50ba1\") " Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.479744 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-utilities" (OuterVolumeSpecName: "utilities") pod "e0687eda-261d-444e-9ebf-cfa620f50ba1" (UID: "e0687eda-261d-444e-9ebf-cfa620f50ba1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.485653 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0687eda-261d-444e-9ebf-cfa620f50ba1-kube-api-access-7mtx6" (OuterVolumeSpecName: "kube-api-access-7mtx6") pod "e0687eda-261d-444e-9ebf-cfa620f50ba1" (UID: "e0687eda-261d-444e-9ebf-cfa620f50ba1"). InnerVolumeSpecName "kube-api-access-7mtx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.502614 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0687eda-261d-444e-9ebf-cfa620f50ba1" (UID: "e0687eda-261d-444e-9ebf-cfa620f50ba1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.580973 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.581052 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0687eda-261d-444e-9ebf-cfa620f50ba1-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.581074 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mtx6\" (UniqueName: \"kubernetes.io/projected/e0687eda-261d-444e-9ebf-cfa620f50ba1-kube-api-access-7mtx6\") on node \"crc\" DevicePath \"\"" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.901903 4771 generic.go:334] "Generic (PLEG): container finished" podID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerID="719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb" exitCode=0 Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.901958 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-schf9" event={"ID":"e0687eda-261d-444e-9ebf-cfa620f50ba1","Type":"ContainerDied","Data":"719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb"} Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.901996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-schf9" event={"ID":"e0687eda-261d-444e-9ebf-cfa620f50ba1","Type":"ContainerDied","Data":"858bd318887655cb67d57981cefe0b03468cd3369b1bde70cdfd53e25705e031"} Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.902041 4771 scope.go:117] "RemoveContainer" containerID="719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.902079 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-schf9" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.931651 4771 scope.go:117] "RemoveContainer" containerID="c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.964101 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-schf9"] Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.974466 4771 scope.go:117] "RemoveContainer" containerID="293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89" Feb 19 22:31:49 crc kubenswrapper[4771]: I0219 22:31:49.977882 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-schf9"] Feb 19 22:31:50 crc kubenswrapper[4771]: I0219 22:31:50.007697 4771 scope.go:117] "RemoveContainer" containerID="719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb" Feb 19 22:31:50 crc kubenswrapper[4771]: E0219 22:31:50.008820 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb\": container with ID starting with 719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb not found: ID does not exist" containerID="719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb" Feb 19 22:31:50 crc kubenswrapper[4771]: I0219 22:31:50.008901 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb"} err="failed to get container status \"719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb\": rpc error: code = NotFound desc = could not find container \"719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb\": container with ID starting with 719b589ed3b209beb843596c5e6b2a58615b34af86eb9ba219d132ae1e08d5cb not found: ID does not exist" Feb 19 22:31:50 crc kubenswrapper[4771]: I0219 22:31:50.008936 4771 scope.go:117] "RemoveContainer" containerID="c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9" Feb 19 22:31:50 crc kubenswrapper[4771]: E0219 22:31:50.009743 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9\": container with ID starting with c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9 not found: ID does not exist" containerID="c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9" Feb 19 22:31:50 crc kubenswrapper[4771]: I0219 22:31:50.009834 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9"} err="failed to get container status \"c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9\": rpc error: code = NotFound desc = could not find container \"c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9\": container with ID starting with c87ca15c9ba22a24eef704708ebd151d71fdc0edcec8db10d8deb575c27d84e9 not found: ID does not exist" Feb 19 22:31:50 crc kubenswrapper[4771]: I0219 22:31:50.009875 4771 scope.go:117] "RemoveContainer" containerID="293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89" Feb 19 22:31:50 crc kubenswrapper[4771]: E0219 
22:31:50.010454 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89\": container with ID starting with 293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89 not found: ID does not exist" containerID="293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89" Feb 19 22:31:50 crc kubenswrapper[4771]: I0219 22:31:50.010655 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89"} err="failed to get container status \"293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89\": rpc error: code = NotFound desc = could not find container \"293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89\": container with ID starting with 293f4679f93e5f478f5b3b637da38fa17a8097674cbd1319e8406c5e2c5f9c89 not found: ID does not exist" Feb 19 22:31:50 crc kubenswrapper[4771]: I0219 22:31:50.452122 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" path="/var/lib/kubelet/pods/e0687eda-261d-444e-9ebf-cfa620f50ba1/volumes" Feb 19 22:32:12 crc kubenswrapper[4771]: I0219 22:32:12.956963 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:32:12 crc kubenswrapper[4771]: I0219 22:32:12.959154 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 22:32:42 crc kubenswrapper[4771]: I0219 22:32:42.956921 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:32:42 crc kubenswrapper[4771]: I0219 22:32:42.957659 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:32:42 crc kubenswrapper[4771]: I0219 22:32:42.957726 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 22:32:42 crc kubenswrapper[4771]: I0219 22:32:42.958599 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:32:42 crc kubenswrapper[4771]: I0219 22:32:42.958698 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" gracePeriod=600 Feb 19 22:32:43 crc kubenswrapper[4771]: E0219 22:32:43.094816 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:32:43 crc kubenswrapper[4771]: I0219 22:32:43.446259 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" exitCode=0 Feb 19 22:32:43 crc kubenswrapper[4771]: I0219 22:32:43.446323 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806"} Feb 19 22:32:43 crc kubenswrapper[4771]: I0219 22:32:43.446368 4771 scope.go:117] "RemoveContainer" containerID="43b05ff0a0829c15247628ed1ab5831518c6a36b49fe93c53bcb1679a87ee6dc" Feb 19 22:32:43 crc kubenswrapper[4771]: I0219 22:32:43.447170 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:32:43 crc kubenswrapper[4771]: E0219 22:32:43.447684 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:32:56 crc kubenswrapper[4771]: I0219 22:32:56.437776 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:32:56 crc kubenswrapper[4771]: E0219 22:32:56.438939 4771 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:33:10 crc kubenswrapper[4771]: I0219 22:33:10.444611 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:33:10 crc kubenswrapper[4771]: E0219 22:33:10.445628 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:33:25 crc kubenswrapper[4771]: I0219 22:33:25.438594 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:33:25 crc kubenswrapper[4771]: E0219 22:33:25.440738 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:33:39 crc kubenswrapper[4771]: I0219 22:33:39.437391 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:33:39 crc kubenswrapper[4771]: E0219 22:33:39.438236 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:33:50 crc kubenswrapper[4771]: I0219 22:33:50.450711 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:33:50 crc kubenswrapper[4771]: E0219 22:33:50.451937 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:34:03 crc kubenswrapper[4771]: I0219 22:34:03.437254 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:34:03 crc kubenswrapper[4771]: E0219 22:34:03.437995 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:34:18 crc kubenswrapper[4771]: I0219 22:34:18.437852 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:34:18 crc kubenswrapper[4771]: E0219 
22:34:18.438758 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:34:30 crc kubenswrapper[4771]: I0219 22:34:30.443981 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:34:30 crc kubenswrapper[4771]: E0219 22:34:30.445006 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:34:45 crc kubenswrapper[4771]: I0219 22:34:45.438302 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:34:45 crc kubenswrapper[4771]: E0219 22:34:45.439258 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:35:00 crc kubenswrapper[4771]: I0219 22:35:00.446279 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:35:00 crc 
kubenswrapper[4771]: E0219 22:35:00.447296 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:35:12 crc kubenswrapper[4771]: I0219 22:35:12.437171 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:35:12 crc kubenswrapper[4771]: E0219 22:35:12.438525 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:35:26 crc kubenswrapper[4771]: I0219 22:35:26.437676 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:35:26 crc kubenswrapper[4771]: E0219 22:35:26.438758 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.733192 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c8nz4"] Feb 19 
22:35:31 crc kubenswrapper[4771]: E0219 22:35:31.734122 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerName="extract-utilities" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.734151 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerName="extract-utilities" Feb 19 22:35:31 crc kubenswrapper[4771]: E0219 22:35:31.737312 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerName="registry-server" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.737352 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerName="registry-server" Feb 19 22:35:31 crc kubenswrapper[4771]: E0219 22:35:31.737435 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerName="extract-content" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.737449 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerName="extract-content" Feb 19 22:35:31 crc kubenswrapper[4771]: E0219 22:35:31.737542 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerName="extract-utilities" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.737612 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerName="extract-utilities" Feb 19 22:35:31 crc kubenswrapper[4771]: E0219 22:35:31.737650 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerName="extract-content" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.737746 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerName="extract-content" Feb 19 
22:35:31 crc kubenswrapper[4771]: E0219 22:35:31.737817 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerName="registry-server" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.737836 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerName="registry-server" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.738174 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="792b17f7-6a6c-4c5d-a4f6-e9932a27ff81" containerName="registry-server" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.738194 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0687eda-261d-444e-9ebf-cfa620f50ba1" containerName="registry-server" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.741562 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.758686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c8nz4"] Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.920479 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-catalog-content\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.920543 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj4hm\" (UniqueName: \"kubernetes.io/projected/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-kube-api-access-qj4hm\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " 
pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:31 crc kubenswrapper[4771]: I0219 22:35:31.920586 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-utilities\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:32 crc kubenswrapper[4771]: I0219 22:35:32.023107 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-catalog-content\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:32 crc kubenswrapper[4771]: I0219 22:35:32.023155 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj4hm\" (UniqueName: \"kubernetes.io/projected/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-kube-api-access-qj4hm\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:32 crc kubenswrapper[4771]: I0219 22:35:32.023191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-utilities\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:32 crc kubenswrapper[4771]: I0219 22:35:32.023755 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-catalog-content\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " 
pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:32 crc kubenswrapper[4771]: I0219 22:35:32.023809 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-utilities\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:32 crc kubenswrapper[4771]: I0219 22:35:32.058690 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj4hm\" (UniqueName: \"kubernetes.io/projected/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-kube-api-access-qj4hm\") pod \"community-operators-c8nz4\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:32 crc kubenswrapper[4771]: I0219 22:35:32.087796 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:32 crc kubenswrapper[4771]: I0219 22:35:32.604636 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c8nz4"] Feb 19 22:35:33 crc kubenswrapper[4771]: I0219 22:35:33.011084 4771 generic.go:334] "Generic (PLEG): container finished" podID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerID="8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314" exitCode=0 Feb 19 22:35:33 crc kubenswrapper[4771]: I0219 22:35:33.011231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8nz4" event={"ID":"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5","Type":"ContainerDied","Data":"8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314"} Feb 19 22:35:33 crc kubenswrapper[4771]: I0219 22:35:33.011605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8nz4" 
event={"ID":"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5","Type":"ContainerStarted","Data":"3df3ad8143daab20542326c88fe199b0f6dce9752a93ee48fd1ca83561b9d3e6"} Feb 19 22:35:33 crc kubenswrapper[4771]: I0219 22:35:33.015590 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:35:35 crc kubenswrapper[4771]: I0219 22:35:35.033337 4771 generic.go:334] "Generic (PLEG): container finished" podID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerID="e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b" exitCode=0 Feb 19 22:35:35 crc kubenswrapper[4771]: I0219 22:35:35.033437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8nz4" event={"ID":"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5","Type":"ContainerDied","Data":"e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b"} Feb 19 22:35:36 crc kubenswrapper[4771]: I0219 22:35:36.045148 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8nz4" event={"ID":"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5","Type":"ContainerStarted","Data":"47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525"} Feb 19 22:35:36 crc kubenswrapper[4771]: I0219 22:35:36.081560 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c8nz4" podStartSLOduration=2.68446303 podStartE2EDuration="5.081537395s" podCreationTimestamp="2026-02-19 22:35:31 +0000 UTC" firstStartedPulling="2026-02-19 22:35:33.014692574 +0000 UTC m=+4033.286135084" lastFinishedPulling="2026-02-19 22:35:35.411766939 +0000 UTC m=+4035.683209449" observedRunningTime="2026-02-19 22:35:36.0707922 +0000 UTC m=+4036.342234700" watchObservedRunningTime="2026-02-19 22:35:36.081537395 +0000 UTC m=+4036.352979895" Feb 19 22:35:39 crc kubenswrapper[4771]: I0219 22:35:39.437909 4771 scope.go:117] "RemoveContainer" 
containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:35:39 crc kubenswrapper[4771]: E0219 22:35:39.438545 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:35:42 crc kubenswrapper[4771]: I0219 22:35:42.088868 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:42 crc kubenswrapper[4771]: I0219 22:35:42.092055 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:42 crc kubenswrapper[4771]: I0219 22:35:42.157429 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:43 crc kubenswrapper[4771]: I0219 22:35:43.183697 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:43 crc kubenswrapper[4771]: I0219 22:35:43.244246 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c8nz4"] Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.133676 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c8nz4" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerName="registry-server" containerID="cri-o://47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525" gracePeriod=2 Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.638473 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.740808 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-catalog-content\") pod \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.740940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj4hm\" (UniqueName: \"kubernetes.io/projected/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-kube-api-access-qj4hm\") pod \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.740990 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-utilities\") pod \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\" (UID: \"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5\") " Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.742013 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-utilities" (OuterVolumeSpecName: "utilities") pod "352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" (UID: "352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.764084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-kube-api-access-qj4hm" (OuterVolumeSpecName: "kube-api-access-qj4hm") pod "352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" (UID: "352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5"). InnerVolumeSpecName "kube-api-access-qj4hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.799641 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" (UID: "352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.842991 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj4hm\" (UniqueName: \"kubernetes.io/projected/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-kube-api-access-qj4hm\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.843044 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:45 crc kubenswrapper[4771]: I0219 22:35:45.843058 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.145229 4771 generic.go:334] "Generic (PLEG): container finished" podID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerID="47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525" exitCode=0 Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.145280 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8nz4" event={"ID":"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5","Type":"ContainerDied","Data":"47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525"} Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.145312 4771 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-c8nz4" event={"ID":"352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5","Type":"ContainerDied","Data":"3df3ad8143daab20542326c88fe199b0f6dce9752a93ee48fd1ca83561b9d3e6"} Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.145332 4771 scope.go:117] "RemoveContainer" containerID="47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.145358 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8nz4" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.169832 4771 scope.go:117] "RemoveContainer" containerID="e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.196808 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c8nz4"] Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.211444 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c8nz4"] Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.215715 4771 scope.go:117] "RemoveContainer" containerID="8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.253719 4771 scope.go:117] "RemoveContainer" containerID="47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525" Feb 19 22:35:46 crc kubenswrapper[4771]: E0219 22:35:46.254317 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525\": container with ID starting with 47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525 not found: ID does not exist" containerID="47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 
22:35:46.254371 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525"} err="failed to get container status \"47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525\": rpc error: code = NotFound desc = could not find container \"47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525\": container with ID starting with 47bbdc6b1e14bf85bf3709e2f795f4e0fb33763148cba7b09e3b34cea1ff2525 not found: ID does not exist" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.254403 4771 scope.go:117] "RemoveContainer" containerID="e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b" Feb 19 22:35:46 crc kubenswrapper[4771]: E0219 22:35:46.255117 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b\": container with ID starting with e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b not found: ID does not exist" containerID="e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.255150 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b"} err="failed to get container status \"e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b\": rpc error: code = NotFound desc = could not find container \"e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b\": container with ID starting with e71e716d113fec101ddfaf245d290a6bf7017767bc89d26379c24fdc3cb0647b not found: ID does not exist" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.255172 4771 scope.go:117] "RemoveContainer" containerID="8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314" Feb 19 22:35:46 crc 
kubenswrapper[4771]: E0219 22:35:46.255708 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314\": container with ID starting with 8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314 not found: ID does not exist" containerID="8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.255814 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314"} err="failed to get container status \"8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314\": rpc error: code = NotFound desc = could not find container \"8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314\": container with ID starting with 8dfa552bf15d91949648f4906a21f0d7856ab91ec5b0637a05c12e896e4aa314 not found: ID does not exist" Feb 19 22:35:46 crc kubenswrapper[4771]: I0219 22:35:46.451544 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" path="/var/lib/kubelet/pods/352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5/volumes" Feb 19 22:35:52 crc kubenswrapper[4771]: I0219 22:35:52.438638 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:35:52 crc kubenswrapper[4771]: E0219 22:35:52.439850 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:36:03 crc 
kubenswrapper[4771]: I0219 22:36:03.437567 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:36:03 crc kubenswrapper[4771]: E0219 22:36:03.438293 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:36:15 crc kubenswrapper[4771]: I0219 22:36:15.436875 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:36:15 crc kubenswrapper[4771]: E0219 22:36:15.437616 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:36:30 crc kubenswrapper[4771]: I0219 22:36:30.440627 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:36:30 crc kubenswrapper[4771]: E0219 22:36:30.441778 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 
19 22:36:43 crc kubenswrapper[4771]: I0219 22:36:43.437800 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:36:43 crc kubenswrapper[4771]: E0219 22:36:43.438698 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:36:55 crc kubenswrapper[4771]: I0219 22:36:55.437918 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:36:55 crc kubenswrapper[4771]: E0219 22:36:55.439068 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:37:09 crc kubenswrapper[4771]: I0219 22:37:09.437183 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:37:09 crc kubenswrapper[4771]: E0219 22:37:09.438370 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:37:21 crc kubenswrapper[4771]: I0219 22:37:21.437769 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:37:21 crc kubenswrapper[4771]: E0219 22:37:21.439056 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:37:33 crc kubenswrapper[4771]: I0219 22:37:33.437603 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:37:33 crc kubenswrapper[4771]: E0219 22:37:33.438659 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:37:45 crc kubenswrapper[4771]: I0219 22:37:45.437465 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:37:46 crc kubenswrapper[4771]: I0219 22:37:46.256537 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"839bc3fd4cf2e05190b0691c7243968475e6bf2d1d5d4130ec5b8261ba14b6a5"} Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.376732 4771 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j67lw"] Feb 19 22:39:23 crc kubenswrapper[4771]: E0219 22:39:23.378172 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerName="extract-content" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.378212 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerName="extract-content" Feb 19 22:39:23 crc kubenswrapper[4771]: E0219 22:39:23.378256 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerName="registry-server" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.378271 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerName="registry-server" Feb 19 22:39:23 crc kubenswrapper[4771]: E0219 22:39:23.378305 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerName="extract-utilities" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.378322 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerName="extract-utilities" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.378761 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="352c04e1-e2a0-4f0f-bcdc-a85ce1da20d5" containerName="registry-server" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.381012 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.386009 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j67lw"] Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.463013 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-utilities\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.463100 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-catalog-content\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.463184 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq9n8\" (UniqueName: \"kubernetes.io/projected/24e7e014-a74b-4faf-8107-e38458854f40-kube-api-access-kq9n8\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.564708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-utilities\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.564790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-catalog-content\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.564918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq9n8\" (UniqueName: \"kubernetes.io/projected/24e7e014-a74b-4faf-8107-e38458854f40-kube-api-access-kq9n8\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.565803 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-utilities\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.566754 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-catalog-content\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.589642 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq9n8\" (UniqueName: \"kubernetes.io/projected/24e7e014-a74b-4faf-8107-e38458854f40-kube-api-access-kq9n8\") pod \"redhat-operators-j67lw\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:23 crc kubenswrapper[4771]: I0219 22:39:23.705997 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:24 crc kubenswrapper[4771]: I0219 22:39:24.178966 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j67lw"] Feb 19 22:39:24 crc kubenswrapper[4771]: W0219 22:39:24.191173 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e7e014_a74b_4faf_8107_e38458854f40.slice/crio-dd760af07d068da7447ab798809a05e4f17a61c594ed144c47ce941eac818f86 WatchSource:0}: Error finding container dd760af07d068da7447ab798809a05e4f17a61c594ed144c47ce941eac818f86: Status 404 returned error can't find the container with id dd760af07d068da7447ab798809a05e4f17a61c594ed144c47ce941eac818f86 Feb 19 22:39:25 crc kubenswrapper[4771]: I0219 22:39:25.191369 4771 generic.go:334] "Generic (PLEG): container finished" podID="24e7e014-a74b-4faf-8107-e38458854f40" containerID="799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38" exitCode=0 Feb 19 22:39:25 crc kubenswrapper[4771]: I0219 22:39:25.191471 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j67lw" event={"ID":"24e7e014-a74b-4faf-8107-e38458854f40","Type":"ContainerDied","Data":"799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38"} Feb 19 22:39:25 crc kubenswrapper[4771]: I0219 22:39:25.191634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j67lw" event={"ID":"24e7e014-a74b-4faf-8107-e38458854f40","Type":"ContainerStarted","Data":"dd760af07d068da7447ab798809a05e4f17a61c594ed144c47ce941eac818f86"} Feb 19 22:39:26 crc kubenswrapper[4771]: I0219 22:39:26.201451 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j67lw" 
event={"ID":"24e7e014-a74b-4faf-8107-e38458854f40","Type":"ContainerStarted","Data":"91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93"} Feb 19 22:39:27 crc kubenswrapper[4771]: I0219 22:39:27.211797 4771 generic.go:334] "Generic (PLEG): container finished" podID="24e7e014-a74b-4faf-8107-e38458854f40" containerID="91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93" exitCode=0 Feb 19 22:39:27 crc kubenswrapper[4771]: I0219 22:39:27.211867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j67lw" event={"ID":"24e7e014-a74b-4faf-8107-e38458854f40","Type":"ContainerDied","Data":"91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93"} Feb 19 22:39:28 crc kubenswrapper[4771]: I0219 22:39:28.224799 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j67lw" event={"ID":"24e7e014-a74b-4faf-8107-e38458854f40","Type":"ContainerStarted","Data":"2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c"} Feb 19 22:39:28 crc kubenswrapper[4771]: I0219 22:39:28.266248 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j67lw" podStartSLOduration=2.822434078 podStartE2EDuration="5.266221783s" podCreationTimestamp="2026-02-19 22:39:23 +0000 UTC" firstStartedPulling="2026-02-19 22:39:25.193952608 +0000 UTC m=+4265.465395098" lastFinishedPulling="2026-02-19 22:39:27.637740293 +0000 UTC m=+4267.909182803" observedRunningTime="2026-02-19 22:39:28.255541959 +0000 UTC m=+4268.526984489" watchObservedRunningTime="2026-02-19 22:39:28.266221783 +0000 UTC m=+4268.537664293" Feb 19 22:39:33 crc kubenswrapper[4771]: I0219 22:39:33.706312 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:33 crc kubenswrapper[4771]: I0219 22:39:33.706623 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:34 crc kubenswrapper[4771]: I0219 22:39:34.793461 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j67lw" podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="registry-server" probeResult="failure" output=< Feb 19 22:39:34 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 22:39:34 crc kubenswrapper[4771]: > Feb 19 22:39:43 crc kubenswrapper[4771]: I0219 22:39:43.783872 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:43 crc kubenswrapper[4771]: I0219 22:39:43.854739 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:44 crc kubenswrapper[4771]: I0219 22:39:44.034148 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j67lw"] Feb 19 22:39:45 crc kubenswrapper[4771]: I0219 22:39:45.383294 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j67lw" podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="registry-server" containerID="cri-o://2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c" gracePeriod=2 Feb 19 22:39:45 crc kubenswrapper[4771]: I0219 22:39:45.894362 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.033094 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-utilities\") pod \"24e7e014-a74b-4faf-8107-e38458854f40\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.033151 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq9n8\" (UniqueName: \"kubernetes.io/projected/24e7e014-a74b-4faf-8107-e38458854f40-kube-api-access-kq9n8\") pod \"24e7e014-a74b-4faf-8107-e38458854f40\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.033181 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-catalog-content\") pod \"24e7e014-a74b-4faf-8107-e38458854f40\" (UID: \"24e7e014-a74b-4faf-8107-e38458854f40\") " Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.034648 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-utilities" (OuterVolumeSpecName: "utilities") pod "24e7e014-a74b-4faf-8107-e38458854f40" (UID: "24e7e014-a74b-4faf-8107-e38458854f40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.040772 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e7e014-a74b-4faf-8107-e38458854f40-kube-api-access-kq9n8" (OuterVolumeSpecName: "kube-api-access-kq9n8") pod "24e7e014-a74b-4faf-8107-e38458854f40" (UID: "24e7e014-a74b-4faf-8107-e38458854f40"). InnerVolumeSpecName "kube-api-access-kq9n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.135953 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.135999 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq9n8\" (UniqueName: \"kubernetes.io/projected/24e7e014-a74b-4faf-8107-e38458854f40-kube-api-access-kq9n8\") on node \"crc\" DevicePath \"\"" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.227522 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24e7e014-a74b-4faf-8107-e38458854f40" (UID: "24e7e014-a74b-4faf-8107-e38458854f40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.243761 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e7e014-a74b-4faf-8107-e38458854f40-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.396300 4771 generic.go:334] "Generic (PLEG): container finished" podID="24e7e014-a74b-4faf-8107-e38458854f40" containerID="2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c" exitCode=0 Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.396395 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j67lw" event={"ID":"24e7e014-a74b-4faf-8107-e38458854f40","Type":"ContainerDied","Data":"2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c"} Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.396437 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j67lw" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.396468 4771 scope.go:117] "RemoveContainer" containerID="2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.396446 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j67lw" event={"ID":"24e7e014-a74b-4faf-8107-e38458854f40","Type":"ContainerDied","Data":"dd760af07d068da7447ab798809a05e4f17a61c594ed144c47ce941eac818f86"} Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.432526 4771 scope.go:117] "RemoveContainer" containerID="91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.453170 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j67lw"] Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.453489 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j67lw"] Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.497403 4771 scope.go:117] "RemoveContainer" containerID="799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.523207 4771 scope.go:117] "RemoveContainer" containerID="2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c" Feb 19 22:39:46 crc kubenswrapper[4771]: E0219 22:39:46.523655 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c\": container with ID starting with 2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c not found: ID does not exist" containerID="2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.523712 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c"} err="failed to get container status \"2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c\": rpc error: code = NotFound desc = could not find container \"2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c\": container with ID starting with 2182b3a1d59658ec77852e3381d0af2bb10b195139d7448d0283abc43454996c not found: ID does not exist" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.523744 4771 scope.go:117] "RemoveContainer" containerID="91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93" Feb 19 22:39:46 crc kubenswrapper[4771]: E0219 22:39:46.524337 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93\": container with ID starting with 91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93 not found: ID does not exist" containerID="91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.524376 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93"} err="failed to get container status \"91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93\": rpc error: code = NotFound desc = could not find container \"91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93\": container with ID starting with 91f80a898cd5d84e79852fb82071a3e0cd7ca762a8f63e5b6a76e03dc840ab93 not found: ID does not exist" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.524403 4771 scope.go:117] "RemoveContainer" containerID="799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38" Feb 19 22:39:46 crc kubenswrapper[4771]: E0219 
22:39:46.524799 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38\": container with ID starting with 799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38 not found: ID does not exist" containerID="799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38" Feb 19 22:39:46 crc kubenswrapper[4771]: I0219 22:39:46.524897 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38"} err="failed to get container status \"799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38\": rpc error: code = NotFound desc = could not find container \"799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38\": container with ID starting with 799399f04b48d6e8aeb5e4fa8f3b1de4c6389f73d8396d48bb6796acd9f89d38 not found: ID does not exist" Feb 19 22:39:48 crc kubenswrapper[4771]: I0219 22:39:48.448856 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24e7e014-a74b-4faf-8107-e38458854f40" path="/var/lib/kubelet/pods/24e7e014-a74b-4faf-8107-e38458854f40/volumes" Feb 19 22:40:12 crc kubenswrapper[4771]: I0219 22:40:12.957619 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:40:12 crc kubenswrapper[4771]: I0219 22:40:12.958449 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 22:40:42 crc kubenswrapper[4771]: I0219 22:40:42.956530 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:40:42 crc kubenswrapper[4771]: I0219 22:40:42.957245 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:41:12 crc kubenswrapper[4771]: I0219 22:41:12.956801 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:41:12 crc kubenswrapper[4771]: I0219 22:41:12.957674 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:41:12 crc kubenswrapper[4771]: I0219 22:41:12.957757 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 22:41:12 crc kubenswrapper[4771]: I0219 22:41:12.958801 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"839bc3fd4cf2e05190b0691c7243968475e6bf2d1d5d4130ec5b8261ba14b6a5"} 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:41:12 crc kubenswrapper[4771]: I0219 22:41:12.958905 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://839bc3fd4cf2e05190b0691c7243968475e6bf2d1d5d4130ec5b8261ba14b6a5" gracePeriod=600 Feb 19 22:41:13 crc kubenswrapper[4771]: I0219 22:41:13.629993 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="839bc3fd4cf2e05190b0691c7243968475e6bf2d1d5d4130ec5b8261ba14b6a5" exitCode=0 Feb 19 22:41:13 crc kubenswrapper[4771]: I0219 22:41:13.630081 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"839bc3fd4cf2e05190b0691c7243968475e6bf2d1d5d4130ec5b8261ba14b6a5"} Feb 19 22:41:13 crc kubenswrapper[4771]: I0219 22:41:13.630442 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc"} Feb 19 22:41:13 crc kubenswrapper[4771]: I0219 22:41:13.630478 4771 scope.go:117] "RemoveContainer" containerID="39dc77b4de7ef8d9d8f77d5580edab69cf1cceede006138a4c6405a2eb450806" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.851473 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4z2fz"] Feb 19 22:42:14 crc kubenswrapper[4771]: E0219 22:42:14.852715 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="extract-content" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.853926 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="extract-content" Feb 19 22:42:14 crc kubenswrapper[4771]: E0219 22:42:14.853950 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="registry-server" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.853962 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="registry-server" Feb 19 22:42:14 crc kubenswrapper[4771]: E0219 22:42:14.854003 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="extract-utilities" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.854014 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="extract-utilities" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.854531 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e7e014-a74b-4faf-8107-e38458854f40" containerName="registry-server" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.858411 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.870967 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z2fz"] Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.980863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-utilities\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.981270 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4q98\" (UniqueName: \"kubernetes.io/projected/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-kube-api-access-g4q98\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:14 crc kubenswrapper[4771]: I0219 22:42:14.981322 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-catalog-content\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:15 crc kubenswrapper[4771]: I0219 22:42:15.082897 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-utilities\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:15 crc kubenswrapper[4771]: I0219 22:42:15.082987 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-g4q98\" (UniqueName: \"kubernetes.io/projected/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-kube-api-access-g4q98\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:15 crc kubenswrapper[4771]: I0219 22:42:15.083061 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-catalog-content\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:15 crc kubenswrapper[4771]: I0219 22:42:15.083661 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-catalog-content\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:15 crc kubenswrapper[4771]: I0219 22:42:15.083843 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-utilities\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:15 crc kubenswrapper[4771]: I0219 22:42:15.111755 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4q98\" (UniqueName: \"kubernetes.io/projected/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-kube-api-access-g4q98\") pod \"redhat-marketplace-4z2fz\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:15 crc kubenswrapper[4771]: I0219 22:42:15.194968 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:15 crc kubenswrapper[4771]: I0219 22:42:15.725526 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z2fz"] Feb 19 22:42:15 crc kubenswrapper[4771]: W0219 22:42:15.741595 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod436a6698_4e8a_4376_a5eb_9f9e496f1bd2.slice/crio-737977cea75ad1d6aff9888e121debed9e5e9131f453cfc719c73e0e5d195e88 WatchSource:0}: Error finding container 737977cea75ad1d6aff9888e121debed9e5e9131f453cfc719c73e0e5d195e88: Status 404 returned error can't find the container with id 737977cea75ad1d6aff9888e121debed9e5e9131f453cfc719c73e0e5d195e88 Feb 19 22:42:16 crc kubenswrapper[4771]: I0219 22:42:16.188626 4771 generic.go:334] "Generic (PLEG): container finished" podID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerID="847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557" exitCode=0 Feb 19 22:42:16 crc kubenswrapper[4771]: I0219 22:42:16.188739 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z2fz" event={"ID":"436a6698-4e8a-4376-a5eb-9f9e496f1bd2","Type":"ContainerDied","Data":"847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557"} Feb 19 22:42:16 crc kubenswrapper[4771]: I0219 22:42:16.189067 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z2fz" event={"ID":"436a6698-4e8a-4376-a5eb-9f9e496f1bd2","Type":"ContainerStarted","Data":"737977cea75ad1d6aff9888e121debed9e5e9131f453cfc719c73e0e5d195e88"} Feb 19 22:42:16 crc kubenswrapper[4771]: I0219 22:42:16.191759 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:42:17 crc kubenswrapper[4771]: I0219 22:42:17.201652 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-4z2fz" event={"ID":"436a6698-4e8a-4376-a5eb-9f9e496f1bd2","Type":"ContainerStarted","Data":"e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a"} Feb 19 22:42:18 crc kubenswrapper[4771]: I0219 22:42:18.213791 4771 generic.go:334] "Generic (PLEG): container finished" podID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerID="e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a" exitCode=0 Feb 19 22:42:18 crc kubenswrapper[4771]: I0219 22:42:18.214054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z2fz" event={"ID":"436a6698-4e8a-4376-a5eb-9f9e496f1bd2","Type":"ContainerDied","Data":"e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a"} Feb 19 22:42:19 crc kubenswrapper[4771]: I0219 22:42:19.228223 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z2fz" event={"ID":"436a6698-4e8a-4376-a5eb-9f9e496f1bd2","Type":"ContainerStarted","Data":"e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b"} Feb 19 22:42:19 crc kubenswrapper[4771]: I0219 22:42:19.259244 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4z2fz" podStartSLOduration=2.8488641550000002 podStartE2EDuration="5.259216711s" podCreationTimestamp="2026-02-19 22:42:14 +0000 UTC" firstStartedPulling="2026-02-19 22:42:16.191070996 +0000 UTC m=+4436.462513506" lastFinishedPulling="2026-02-19 22:42:18.601423592 +0000 UTC m=+4438.872866062" observedRunningTime="2026-02-19 22:42:19.253805567 +0000 UTC m=+4439.525248107" watchObservedRunningTime="2026-02-19 22:42:19.259216711 +0000 UTC m=+4439.530659221" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.633099 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wjv5m"] Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.635465 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.639823 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjv5m"] Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.688730 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-catalog-content\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.688783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lml7l\" (UniqueName: \"kubernetes.io/projected/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-kube-api-access-lml7l\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.688803 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-utilities\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.792573 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-catalog-content\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.792675 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lml7l\" (UniqueName: \"kubernetes.io/projected/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-kube-api-access-lml7l\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.792702 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-utilities\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.793433 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-utilities\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.793430 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-catalog-content\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.825464 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lml7l\" (UniqueName: \"kubernetes.io/projected/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-kube-api-access-lml7l\") pod \"certified-operators-wjv5m\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:21 crc kubenswrapper[4771]: I0219 22:42:21.969065 4771 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:22 crc kubenswrapper[4771]: I0219 22:42:22.237341 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjv5m"] Feb 19 22:42:22 crc kubenswrapper[4771]: W0219 22:42:22.242119 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5044c6f5_3379_43c7_ba4b_98bf3d7630a4.slice/crio-8b21b919f2d2f221dd57ff51897bcb47435859c496afbd1d6419678935476140 WatchSource:0}: Error finding container 8b21b919f2d2f221dd57ff51897bcb47435859c496afbd1d6419678935476140: Status 404 returned error can't find the container with id 8b21b919f2d2f221dd57ff51897bcb47435859c496afbd1d6419678935476140 Feb 19 22:42:22 crc kubenswrapper[4771]: I0219 22:42:22.256416 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjv5m" event={"ID":"5044c6f5-3379-43c7-ba4b-98bf3d7630a4","Type":"ContainerStarted","Data":"8b21b919f2d2f221dd57ff51897bcb47435859c496afbd1d6419678935476140"} Feb 19 22:42:23 crc kubenswrapper[4771]: I0219 22:42:23.267827 4771 generic.go:334] "Generic (PLEG): container finished" podID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerID="57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0" exitCode=0 Feb 19 22:42:23 crc kubenswrapper[4771]: I0219 22:42:23.267944 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjv5m" event={"ID":"5044c6f5-3379-43c7-ba4b-98bf3d7630a4","Type":"ContainerDied","Data":"57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0"} Feb 19 22:42:24 crc kubenswrapper[4771]: I0219 22:42:24.281677 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjv5m" 
event={"ID":"5044c6f5-3379-43c7-ba4b-98bf3d7630a4","Type":"ContainerStarted","Data":"db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb"} Feb 19 22:42:25 crc kubenswrapper[4771]: I0219 22:42:25.195530 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:25 crc kubenswrapper[4771]: I0219 22:42:25.195930 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:25 crc kubenswrapper[4771]: I0219 22:42:25.262604 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:25 crc kubenswrapper[4771]: I0219 22:42:25.293896 4771 generic.go:334] "Generic (PLEG): container finished" podID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerID="db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb" exitCode=0 Feb 19 22:42:25 crc kubenswrapper[4771]: I0219 22:42:25.294049 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjv5m" event={"ID":"5044c6f5-3379-43c7-ba4b-98bf3d7630a4","Type":"ContainerDied","Data":"db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb"} Feb 19 22:42:25 crc kubenswrapper[4771]: I0219 22:42:25.362627 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:26 crc kubenswrapper[4771]: I0219 22:42:26.305568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjv5m" event={"ID":"5044c6f5-3379-43c7-ba4b-98bf3d7630a4","Type":"ContainerStarted","Data":"91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980"} Feb 19 22:42:26 crc kubenswrapper[4771]: I0219 22:42:26.337428 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wjv5m" 
podStartSLOduration=2.89519907 podStartE2EDuration="5.337408923s" podCreationTimestamp="2026-02-19 22:42:21 +0000 UTC" firstStartedPulling="2026-02-19 22:42:23.270416669 +0000 UTC m=+4443.541859179" lastFinishedPulling="2026-02-19 22:42:25.712626532 +0000 UTC m=+4445.984069032" observedRunningTime="2026-02-19 22:42:26.328706022 +0000 UTC m=+4446.600148522" watchObservedRunningTime="2026-02-19 22:42:26.337408923 +0000 UTC m=+4446.608851403" Feb 19 22:42:27 crc kubenswrapper[4771]: I0219 22:42:27.615789 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z2fz"] Feb 19 22:42:28 crc kubenswrapper[4771]: I0219 22:42:28.326481 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4z2fz" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerName="registry-server" containerID="cri-o://e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b" gracePeriod=2 Feb 19 22:42:28 crc kubenswrapper[4771]: I0219 22:42:28.792967 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:28 crc kubenswrapper[4771]: I0219 22:42:28.904884 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4q98\" (UniqueName: \"kubernetes.io/projected/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-kube-api-access-g4q98\") pod \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " Feb 19 22:42:28 crc kubenswrapper[4771]: I0219 22:42:28.904952 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-utilities\") pod \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " Feb 19 22:42:28 crc kubenswrapper[4771]: I0219 22:42:28.905157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-catalog-content\") pod \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\" (UID: \"436a6698-4e8a-4376-a5eb-9f9e496f1bd2\") " Feb 19 22:42:28 crc kubenswrapper[4771]: I0219 22:42:28.907079 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-utilities" (OuterVolumeSpecName: "utilities") pod "436a6698-4e8a-4376-a5eb-9f9e496f1bd2" (UID: "436a6698-4e8a-4376-a5eb-9f9e496f1bd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:42:28 crc kubenswrapper[4771]: I0219 22:42:28.929366 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-kube-api-access-g4q98" (OuterVolumeSpecName: "kube-api-access-g4q98") pod "436a6698-4e8a-4376-a5eb-9f9e496f1bd2" (UID: "436a6698-4e8a-4376-a5eb-9f9e496f1bd2"). InnerVolumeSpecName "kube-api-access-g4q98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:42:28 crc kubenswrapper[4771]: I0219 22:42:28.946115 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "436a6698-4e8a-4376-a5eb-9f9e496f1bd2" (UID: "436a6698-4e8a-4376-a5eb-9f9e496f1bd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.007703 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.007785 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4q98\" (UniqueName: \"kubernetes.io/projected/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-kube-api-access-g4q98\") on node \"crc\" DevicePath \"\"" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.007837 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/436a6698-4e8a-4376-a5eb-9f9e496f1bd2-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.337220 4771 generic.go:334] "Generic (PLEG): container finished" podID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerID="e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b" exitCode=0 Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.337257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z2fz" event={"ID":"436a6698-4e8a-4376-a5eb-9f9e496f1bd2","Type":"ContainerDied","Data":"e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b"} Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.337283 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-4z2fz" event={"ID":"436a6698-4e8a-4376-a5eb-9f9e496f1bd2","Type":"ContainerDied","Data":"737977cea75ad1d6aff9888e121debed9e5e9131f453cfc719c73e0e5d195e88"} Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.337301 4771 scope.go:117] "RemoveContainer" containerID="e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.337364 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z2fz" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.360221 4771 scope.go:117] "RemoveContainer" containerID="e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.391609 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z2fz"] Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.393125 4771 scope.go:117] "RemoveContainer" containerID="847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.402239 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z2fz"] Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.434488 4771 scope.go:117] "RemoveContainer" containerID="e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b" Feb 19 22:42:29 crc kubenswrapper[4771]: E0219 22:42:29.435069 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b\": container with ID starting with e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b not found: ID does not exist" containerID="e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.435119 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b"} err="failed to get container status \"e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b\": rpc error: code = NotFound desc = could not find container \"e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b\": container with ID starting with e7714f7e293c0d74f9d6587e8ccb4e56a33e248f0e6b4f5e2c78caa5c2a6ff6b not found: ID does not exist" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.435152 4771 scope.go:117] "RemoveContainer" containerID="e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a" Feb 19 22:42:29 crc kubenswrapper[4771]: E0219 22:42:29.435643 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a\": container with ID starting with e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a not found: ID does not exist" containerID="e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.435700 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a"} err="failed to get container status \"e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a\": rpc error: code = NotFound desc = could not find container \"e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a\": container with ID starting with e2e107d907cb1d64fbb0b0a1c2e03481bea0d5b36e221866fa630e648c48cf0a not found: ID does not exist" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.435728 4771 scope.go:117] "RemoveContainer" containerID="847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557" Feb 19 22:42:29 crc kubenswrapper[4771]: E0219 
22:42:29.436166 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557\": container with ID starting with 847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557 not found: ID does not exist" containerID="847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557" Feb 19 22:42:29 crc kubenswrapper[4771]: I0219 22:42:29.436236 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557"} err="failed to get container status \"847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557\": rpc error: code = NotFound desc = could not find container \"847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557\": container with ID starting with 847abe5086d313efcdcf4604f5580c03486771aab7a33be880b079beef498557 not found: ID does not exist" Feb 19 22:42:30 crc kubenswrapper[4771]: I0219 22:42:30.452961 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" path="/var/lib/kubelet/pods/436a6698-4e8a-4376-a5eb-9f9e496f1bd2/volumes" Feb 19 22:42:31 crc kubenswrapper[4771]: I0219 22:42:31.970150 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:31 crc kubenswrapper[4771]: I0219 22:42:31.970527 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:32 crc kubenswrapper[4771]: I0219 22:42:32.040563 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:32 crc kubenswrapper[4771]: I0219 22:42:32.424578 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:32 crc kubenswrapper[4771]: I0219 22:42:32.612919 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjv5m"] Feb 19 22:42:34 crc kubenswrapper[4771]: I0219 22:42:34.392698 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wjv5m" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerName="registry-server" containerID="cri-o://91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980" gracePeriod=2 Feb 19 22:42:34 crc kubenswrapper[4771]: I0219 22:42:34.893649 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:34 crc kubenswrapper[4771]: I0219 22:42:34.996513 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-utilities\") pod \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " Feb 19 22:42:34 crc kubenswrapper[4771]: I0219 22:42:34.996692 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-catalog-content\") pod \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " Feb 19 22:42:34 crc kubenswrapper[4771]: I0219 22:42:34.996750 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lml7l\" (UniqueName: \"kubernetes.io/projected/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-kube-api-access-lml7l\") pod \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\" (UID: \"5044c6f5-3379-43c7-ba4b-98bf3d7630a4\") " Feb 19 22:42:34 crc kubenswrapper[4771]: I0219 22:42:34.998528 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-utilities" (OuterVolumeSpecName: "utilities") pod "5044c6f5-3379-43c7-ba4b-98bf3d7630a4" (UID: "5044c6f5-3379-43c7-ba4b-98bf3d7630a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.005303 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-kube-api-access-lml7l" (OuterVolumeSpecName: "kube-api-access-lml7l") pod "5044c6f5-3379-43c7-ba4b-98bf3d7630a4" (UID: "5044c6f5-3379-43c7-ba4b-98bf3d7630a4"). InnerVolumeSpecName "kube-api-access-lml7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.099529 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.099583 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lml7l\" (UniqueName: \"kubernetes.io/projected/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-kube-api-access-lml7l\") on node \"crc\" DevicePath \"\"" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.147155 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5044c6f5-3379-43c7-ba4b-98bf3d7630a4" (UID: "5044c6f5-3379-43c7-ba4b-98bf3d7630a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.200987 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5044c6f5-3379-43c7-ba4b-98bf3d7630a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.406794 4771 generic.go:334] "Generic (PLEG): container finished" podID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerID="91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980" exitCode=0 Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.406854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjv5m" event={"ID":"5044c6f5-3379-43c7-ba4b-98bf3d7630a4","Type":"ContainerDied","Data":"91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980"} Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.406893 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjv5m" event={"ID":"5044c6f5-3379-43c7-ba4b-98bf3d7630a4","Type":"ContainerDied","Data":"8b21b919f2d2f221dd57ff51897bcb47435859c496afbd1d6419678935476140"} Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.406943 4771 scope.go:117] "RemoveContainer" containerID="91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.407137 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjv5m" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.440814 4771 scope.go:117] "RemoveContainer" containerID="db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.473562 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjv5m"] Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.484930 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wjv5m"] Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.486280 4771 scope.go:117] "RemoveContainer" containerID="57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.521046 4771 scope.go:117] "RemoveContainer" containerID="91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980" Feb 19 22:42:35 crc kubenswrapper[4771]: E0219 22:42:35.521679 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980\": container with ID starting with 91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980 not found: ID does not exist" containerID="91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.521822 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980"} err="failed to get container status \"91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980\": rpc error: code = NotFound desc = could not find container \"91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980\": container with ID starting with 91f6887ba2bdb067090c691cfdd9a622797f73d4d9df618bb0625bac86756980 not 
found: ID does not exist" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.521911 4771 scope.go:117] "RemoveContainer" containerID="db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb" Feb 19 22:42:35 crc kubenswrapper[4771]: E0219 22:42:35.522410 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb\": container with ID starting with db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb not found: ID does not exist" containerID="db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.522447 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb"} err="failed to get container status \"db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb\": rpc error: code = NotFound desc = could not find container \"db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb\": container with ID starting with db4a7d2f319889c46b670f8324783cd8374e8740795d1f5e42d455ed507a6feb not found: ID does not exist" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.522478 4771 scope.go:117] "RemoveContainer" containerID="57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0" Feb 19 22:42:35 crc kubenswrapper[4771]: E0219 22:42:35.522897 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0\": container with ID starting with 57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0 not found: ID does not exist" containerID="57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0" Feb 19 22:42:35 crc kubenswrapper[4771]: I0219 22:42:35.522963 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0"} err="failed to get container status \"57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0\": rpc error: code = NotFound desc = could not find container \"57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0\": container with ID starting with 57c44a0a574ee6f540eebd697bc2fa21733fc29b957b10c3b6afb88cfb15d8c0 not found: ID does not exist" Feb 19 22:42:36 crc kubenswrapper[4771]: I0219 22:42:36.458204 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" path="/var/lib/kubelet/pods/5044c6f5-3379-43c7-ba4b-98bf3d7630a4/volumes" Feb 19 22:43:42 crc kubenswrapper[4771]: I0219 22:43:42.957126 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:43:42 crc kubenswrapper[4771]: I0219 22:43:42.960167 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:44:12 crc kubenswrapper[4771]: I0219 22:44:12.957587 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:44:12 crc kubenswrapper[4771]: I0219 22:44:12.958366 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:44:42 crc kubenswrapper[4771]: I0219 22:44:42.957555 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:44:42 crc kubenswrapper[4771]: I0219 22:44:42.958388 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:44:42 crc kubenswrapper[4771]: I0219 22:44:42.958457 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 22:44:42 crc kubenswrapper[4771]: I0219 22:44:42.959420 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:44:42 crc kubenswrapper[4771]: I0219 22:44:42.959511 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" 
containerID="cri-o://88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" gracePeriod=600 Feb 19 22:44:43 crc kubenswrapper[4771]: E0219 22:44:43.105003 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:44:43 crc kubenswrapper[4771]: I0219 22:44:43.565746 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" exitCode=0 Feb 19 22:44:43 crc kubenswrapper[4771]: I0219 22:44:43.565809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc"} Feb 19 22:44:43 crc kubenswrapper[4771]: I0219 22:44:43.565858 4771 scope.go:117] "RemoveContainer" containerID="839bc3fd4cf2e05190b0691c7243968475e6bf2d1d5d4130ec5b8261ba14b6a5" Feb 19 22:44:43 crc kubenswrapper[4771]: I0219 22:44:43.566580 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:44:43 crc kubenswrapper[4771]: E0219 22:44:43.566977 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:44:55 crc kubenswrapper[4771]: I0219 22:44:55.437161 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:44:55 crc kubenswrapper[4771]: E0219 22:44:55.438079 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.173806 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd"] Feb 19 22:45:00 crc kubenswrapper[4771]: E0219 22:45:00.174868 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerName="extract-content" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.174892 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerName="extract-content" Feb 19 22:45:00 crc kubenswrapper[4771]: E0219 22:45:00.174918 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerName="extract-content" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.174928 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerName="extract-content" Feb 19 22:45:00 crc kubenswrapper[4771]: E0219 22:45:00.174946 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerName="extract-utilities" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.174957 4771 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerName="extract-utilities" Feb 19 22:45:00 crc kubenswrapper[4771]: E0219 22:45:00.174982 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerName="registry-server" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.174992 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerName="registry-server" Feb 19 22:45:00 crc kubenswrapper[4771]: E0219 22:45:00.175012 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerName="registry-server" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.175046 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerName="registry-server" Feb 19 22:45:00 crc kubenswrapper[4771]: E0219 22:45:00.175064 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerName="extract-utilities" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.175074 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerName="extract-utilities" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.175304 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5044c6f5-3379-43c7-ba4b-98bf3d7630a4" containerName="registry-server" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.175347 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="436a6698-4e8a-4376-a5eb-9f9e496f1bd2" containerName="registry-server" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.175982 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.179054 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.179279 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.190071 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd"] Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.349916 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvrpr\" (UniqueName: \"kubernetes.io/projected/e61d94fa-d9a8-45a9-9a72-6225b9a56324-kube-api-access-nvrpr\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.350271 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e61d94fa-d9a8-45a9-9a72-6225b9a56324-config-volume\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.350409 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e61d94fa-d9a8-45a9-9a72-6225b9a56324-secret-volume\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.451538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e61d94fa-d9a8-45a9-9a72-6225b9a56324-config-volume\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.451683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e61d94fa-d9a8-45a9-9a72-6225b9a56324-secret-volume\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.451741 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvrpr\" (UniqueName: \"kubernetes.io/projected/e61d94fa-d9a8-45a9-9a72-6225b9a56324-kube-api-access-nvrpr\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.453158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e61d94fa-d9a8-45a9-9a72-6225b9a56324-config-volume\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.460598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e61d94fa-d9a8-45a9-9a72-6225b9a56324-secret-volume\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.474144 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvrpr\" (UniqueName: \"kubernetes.io/projected/e61d94fa-d9a8-45a9-9a72-6225b9a56324-kube-api-access-nvrpr\") pod \"collect-profiles-29525685-rt6sd\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.562806 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:00 crc kubenswrapper[4771]: I0219 22:45:00.861568 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd"] Feb 19 22:45:00 crc kubenswrapper[4771]: W0219 22:45:00.870663 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61d94fa_d9a8_45a9_9a72_6225b9a56324.slice/crio-c0eec9615742d9e8c01b499038420fee1c155ccc96a28e48c264ea5d3d4a58bd WatchSource:0}: Error finding container c0eec9615742d9e8c01b499038420fee1c155ccc96a28e48c264ea5d3d4a58bd: Status 404 returned error can't find the container with id c0eec9615742d9e8c01b499038420fee1c155ccc96a28e48c264ea5d3d4a58bd Feb 19 22:45:01 crc kubenswrapper[4771]: I0219 22:45:01.727564 4771 generic.go:334] "Generic (PLEG): container finished" podID="e61d94fa-d9a8-45a9-9a72-6225b9a56324" containerID="bda9220221576eac491a463fa6600a90a1595da25e8dab5fb874998da453224e" exitCode=0 Feb 19 22:45:01 crc kubenswrapper[4771]: I0219 22:45:01.727654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" event={"ID":"e61d94fa-d9a8-45a9-9a72-6225b9a56324","Type":"ContainerDied","Data":"bda9220221576eac491a463fa6600a90a1595da25e8dab5fb874998da453224e"} Feb 19 22:45:01 crc kubenswrapper[4771]: I0219 22:45:01.728056 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" event={"ID":"e61d94fa-d9a8-45a9-9a72-6225b9a56324","Type":"ContainerStarted","Data":"c0eec9615742d9e8c01b499038420fee1c155ccc96a28e48c264ea5d3d4a58bd"} Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.122111 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.294664 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e61d94fa-d9a8-45a9-9a72-6225b9a56324-config-volume\") pod \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.294866 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e61d94fa-d9a8-45a9-9a72-6225b9a56324-secret-volume\") pod \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.295002 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvrpr\" (UniqueName: \"kubernetes.io/projected/e61d94fa-d9a8-45a9-9a72-6225b9a56324-kube-api-access-nvrpr\") pod \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\" (UID: \"e61d94fa-d9a8-45a9-9a72-6225b9a56324\") " Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.295383 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e61d94fa-d9a8-45a9-9a72-6225b9a56324-config-volume" (OuterVolumeSpecName: "config-volume") pod "e61d94fa-d9a8-45a9-9a72-6225b9a56324" (UID: "e61d94fa-d9a8-45a9-9a72-6225b9a56324"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.305090 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61d94fa-d9a8-45a9-9a72-6225b9a56324-kube-api-access-nvrpr" (OuterVolumeSpecName: "kube-api-access-nvrpr") pod "e61d94fa-d9a8-45a9-9a72-6225b9a56324" (UID: "e61d94fa-d9a8-45a9-9a72-6225b9a56324"). InnerVolumeSpecName "kube-api-access-nvrpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.305844 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61d94fa-d9a8-45a9-9a72-6225b9a56324-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e61d94fa-d9a8-45a9-9a72-6225b9a56324" (UID: "e61d94fa-d9a8-45a9-9a72-6225b9a56324"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.397190 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvrpr\" (UniqueName: \"kubernetes.io/projected/e61d94fa-d9a8-45a9-9a72-6225b9a56324-kube-api-access-nvrpr\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.397248 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e61d94fa-d9a8-45a9-9a72-6225b9a56324-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.397268 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e61d94fa-d9a8-45a9-9a72-6225b9a56324-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.749841 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" event={"ID":"e61d94fa-d9a8-45a9-9a72-6225b9a56324","Type":"ContainerDied","Data":"c0eec9615742d9e8c01b499038420fee1c155ccc96a28e48c264ea5d3d4a58bd"} Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.749901 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0eec9615742d9e8c01b499038420fee1c155ccc96a28e48c264ea5d3d4a58bd" Feb 19 22:45:03 crc kubenswrapper[4771]: I0219 22:45:03.750432 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd" Feb 19 22:45:04 crc kubenswrapper[4771]: I0219 22:45:04.192317 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"] Feb 19 22:45:04 crc kubenswrapper[4771]: I0219 22:45:04.196561 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-5wwgm"] Feb 19 22:45:04 crc kubenswrapper[4771]: I0219 22:45:04.447398 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68f6324-2e48-46b5-9bdb-66f1bd733a07" path="/var/lib/kubelet/pods/f68f6324-2e48-46b5-9bdb-66f1bd733a07/volumes" Feb 19 22:45:06 crc kubenswrapper[4771]: I0219 22:45:06.442314 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:45:06 crc kubenswrapper[4771]: E0219 22:45:06.443080 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:45:18 crc kubenswrapper[4771]: I0219 22:45:18.438142 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:45:18 crc kubenswrapper[4771]: E0219 22:45:18.439579 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:45:32 crc kubenswrapper[4771]: I0219 22:45:32.438107 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:45:32 crc kubenswrapper[4771]: E0219 22:45:32.439151 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:45:34 crc kubenswrapper[4771]: I0219 22:45:34.116715 4771 scope.go:117] "RemoveContainer" containerID="be2c97b0a9c68c4a56643d637f40257dbf699d21d472caf754e5fd57419152c7" Feb 19 22:45:44 crc kubenswrapper[4771]: I0219 22:45:44.437928 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:45:44 crc kubenswrapper[4771]: E0219 22:45:44.439168 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:45:55 crc kubenswrapper[4771]: I0219 22:45:55.437344 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:45:55 crc kubenswrapper[4771]: E0219 22:45:55.439088 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:46:07 crc kubenswrapper[4771]: I0219 22:46:07.437495 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:46:07 crc kubenswrapper[4771]: E0219 22:46:07.440328 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:46:19 crc kubenswrapper[4771]: I0219 22:46:19.437972 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:46:19 crc kubenswrapper[4771]: E0219 22:46:19.438986 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:46:30 crc kubenswrapper[4771]: I0219 22:46:30.445102 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:46:30 crc kubenswrapper[4771]: E0219 22:46:30.446305 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:46:45 crc kubenswrapper[4771]: I0219 22:46:45.438567 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:46:45 crc kubenswrapper[4771]: E0219 22:46:45.439518 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:46:58 crc kubenswrapper[4771]: I0219 22:46:58.438963 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:46:58 crc kubenswrapper[4771]: E0219 22:46:58.440365 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:47:09 crc kubenswrapper[4771]: I0219 22:47:09.437516 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:47:09 crc kubenswrapper[4771]: E0219 22:47:09.438316 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:47:20 crc kubenswrapper[4771]: I0219 22:47:20.446994 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:47:20 crc kubenswrapper[4771]: E0219 22:47:20.448150 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:47:33 crc kubenswrapper[4771]: I0219 22:47:33.437346 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:47:33 crc kubenswrapper[4771]: E0219 22:47:33.438237 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:47:44 crc kubenswrapper[4771]: I0219 22:47:44.438091 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:47:44 crc kubenswrapper[4771]: E0219 22:47:44.439399 4771 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:47:57 crc kubenswrapper[4771]: I0219 22:47:57.437857 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:47:57 crc kubenswrapper[4771]: E0219 22:47:57.438921 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:48:09 crc kubenswrapper[4771]: I0219 22:48:09.436843 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:48:09 crc kubenswrapper[4771]: E0219 22:48:09.437541 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:48:23 crc kubenswrapper[4771]: I0219 22:48:23.439102 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:48:23 crc kubenswrapper[4771]: E0219 22:48:23.440154 4771 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:48:35 crc kubenswrapper[4771]: I0219 22:48:35.437315 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:48:35 crc kubenswrapper[4771]: E0219 22:48:35.438408 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:48:48 crc kubenswrapper[4771]: I0219 22:48:48.438303 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:48:48 crc kubenswrapper[4771]: E0219 22:48:48.439406 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:49:03 crc kubenswrapper[4771]: I0219 22:49:03.437118 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:49:03 crc kubenswrapper[4771]: E0219 22:49:03.438098 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.461450 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-w4qz4"] Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.471823 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-w4qz4"] Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.603105 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-dbsfs"] Feb 19 22:49:09 crc kubenswrapper[4771]: E0219 22:49:09.603414 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61d94fa-d9a8-45a9-9a72-6225b9a56324" containerName="collect-profiles" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.603430 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61d94fa-d9a8-45a9-9a72-6225b9a56324" containerName="collect-profiles" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.603611 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61d94fa-d9a8-45a9-9a72-6225b9a56324" containerName="collect-profiles" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.604130 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.605983 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.606242 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.606316 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.607953 4771 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-whtfj" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.626998 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dbsfs"] Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.722046 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/51d80d4f-2270-405e-ad32-d73795bad0ff-node-mnt\") pod \"crc-storage-crc-dbsfs\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.722156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/51d80d4f-2270-405e-ad32-d73795bad0ff-crc-storage\") pod \"crc-storage-crc-dbsfs\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.722305 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7fz6\" (UniqueName: \"kubernetes.io/projected/51d80d4f-2270-405e-ad32-d73795bad0ff-kube-api-access-z7fz6\") pod \"crc-storage-crc-dbsfs\" (UID: 
\"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.823972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/51d80d4f-2270-405e-ad32-d73795bad0ff-node-mnt\") pod \"crc-storage-crc-dbsfs\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.824125 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/51d80d4f-2270-405e-ad32-d73795bad0ff-crc-storage\") pod \"crc-storage-crc-dbsfs\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.824299 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7fz6\" (UniqueName: \"kubernetes.io/projected/51d80d4f-2270-405e-ad32-d73795bad0ff-kube-api-access-z7fz6\") pod \"crc-storage-crc-dbsfs\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.824454 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/51d80d4f-2270-405e-ad32-d73795bad0ff-node-mnt\") pod \"crc-storage-crc-dbsfs\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.825525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/51d80d4f-2270-405e-ad32-d73795bad0ff-crc-storage\") pod \"crc-storage-crc-dbsfs\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.859794 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7fz6\" (UniqueName: \"kubernetes.io/projected/51d80d4f-2270-405e-ad32-d73795bad0ff-kube-api-access-z7fz6\") pod \"crc-storage-crc-dbsfs\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:09 crc kubenswrapper[4771]: I0219 22:49:09.931283 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:10 crc kubenswrapper[4771]: I0219 22:49:10.446533 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a344000-2cb1-4aa3-932e-d26a540f079f" path="/var/lib/kubelet/pods/2a344000-2cb1-4aa3-932e-d26a540f079f/volumes" Feb 19 22:49:10 crc kubenswrapper[4771]: I0219 22:49:10.477327 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dbsfs"] Feb 19 22:49:10 crc kubenswrapper[4771]: W0219 22:49:10.485295 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d80d4f_2270_405e_ad32_d73795bad0ff.slice/crio-74331740e542e005e81ce960ef1cde4815e15ebfea42d0f280ca9686ec3a89f4 WatchSource:0}: Error finding container 74331740e542e005e81ce960ef1cde4815e15ebfea42d0f280ca9686ec3a89f4: Status 404 returned error can't find the container with id 74331740e542e005e81ce960ef1cde4815e15ebfea42d0f280ca9686ec3a89f4 Feb 19 22:49:10 crc kubenswrapper[4771]: I0219 22:49:10.496431 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:49:11 crc kubenswrapper[4771]: I0219 22:49:11.454711 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dbsfs" event={"ID":"51d80d4f-2270-405e-ad32-d73795bad0ff","Type":"ContainerStarted","Data":"e73e2a891444595dc7bb40215c182f97a7ab94d43f8f8d22c7cc9703d2c3705d"} Feb 19 22:49:11 crc kubenswrapper[4771]: I0219 22:49:11.455641 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dbsfs" event={"ID":"51d80d4f-2270-405e-ad32-d73795bad0ff","Type":"ContainerStarted","Data":"74331740e542e005e81ce960ef1cde4815e15ebfea42d0f280ca9686ec3a89f4"} Feb 19 22:49:11 crc kubenswrapper[4771]: I0219 22:49:11.483584 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-dbsfs" podStartSLOduration=2.001135937 podStartE2EDuration="2.483550343s" podCreationTimestamp="2026-02-19 22:49:09 +0000 UTC" firstStartedPulling="2026-02-19 22:49:10.496121359 +0000 UTC m=+4850.767563839" lastFinishedPulling="2026-02-19 22:49:10.978535745 +0000 UTC m=+4851.249978245" observedRunningTime="2026-02-19 22:49:11.481681667 +0000 UTC m=+4851.753124217" watchObservedRunningTime="2026-02-19 22:49:11.483550343 +0000 UTC m=+4851.754992853" Feb 19 22:49:12 crc kubenswrapper[4771]: I0219 22:49:12.465412 4771 generic.go:334] "Generic (PLEG): container finished" podID="51d80d4f-2270-405e-ad32-d73795bad0ff" containerID="e73e2a891444595dc7bb40215c182f97a7ab94d43f8f8d22c7cc9703d2c3705d" exitCode=0 Feb 19 22:49:12 crc kubenswrapper[4771]: I0219 22:49:12.465656 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dbsfs" event={"ID":"51d80d4f-2270-405e-ad32-d73795bad0ff","Type":"ContainerDied","Data":"e73e2a891444595dc7bb40215c182f97a7ab94d43f8f8d22c7cc9703d2c3705d"} Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.823128 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.890893 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/51d80d4f-2270-405e-ad32-d73795bad0ff-crc-storage\") pod \"51d80d4f-2270-405e-ad32-d73795bad0ff\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.891010 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/51d80d4f-2270-405e-ad32-d73795bad0ff-node-mnt\") pod \"51d80d4f-2270-405e-ad32-d73795bad0ff\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.891084 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7fz6\" (UniqueName: \"kubernetes.io/projected/51d80d4f-2270-405e-ad32-d73795bad0ff-kube-api-access-z7fz6\") pod \"51d80d4f-2270-405e-ad32-d73795bad0ff\" (UID: \"51d80d4f-2270-405e-ad32-d73795bad0ff\") " Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.891228 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51d80d4f-2270-405e-ad32-d73795bad0ff-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "51d80d4f-2270-405e-ad32-d73795bad0ff" (UID: "51d80d4f-2270-405e-ad32-d73795bad0ff"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.891560 4771 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/51d80d4f-2270-405e-ad32-d73795bad0ff-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.900277 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d80d4f-2270-405e-ad32-d73795bad0ff-kube-api-access-z7fz6" (OuterVolumeSpecName: "kube-api-access-z7fz6") pod "51d80d4f-2270-405e-ad32-d73795bad0ff" (UID: "51d80d4f-2270-405e-ad32-d73795bad0ff"). InnerVolumeSpecName "kube-api-access-z7fz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.924477 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d80d4f-2270-405e-ad32-d73795bad0ff-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "51d80d4f-2270-405e-ad32-d73795bad0ff" (UID: "51d80d4f-2270-405e-ad32-d73795bad0ff"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.993502 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7fz6\" (UniqueName: \"kubernetes.io/projected/51d80d4f-2270-405e-ad32-d73795bad0ff-kube-api-access-z7fz6\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:13 crc kubenswrapper[4771]: I0219 22:49:13.993585 4771 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/51d80d4f-2270-405e-ad32-d73795bad0ff-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:14 crc kubenswrapper[4771]: I0219 22:49:14.486982 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dbsfs" event={"ID":"51d80d4f-2270-405e-ad32-d73795bad0ff","Type":"ContainerDied","Data":"74331740e542e005e81ce960ef1cde4815e15ebfea42d0f280ca9686ec3a89f4"} Feb 19 22:49:14 crc kubenswrapper[4771]: I0219 22:49:14.487091 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74331740e542e005e81ce960ef1cde4815e15ebfea42d0f280ca9686ec3a89f4" Feb 19 22:49:14 crc kubenswrapper[4771]: I0219 22:49:14.487430 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dbsfs" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.097495 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-dbsfs"] Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.106426 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-dbsfs"] Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.316959 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9mkn7"] Feb 19 22:49:16 crc kubenswrapper[4771]: E0219 22:49:16.317460 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d80d4f-2270-405e-ad32-d73795bad0ff" containerName="storage" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.317494 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d80d4f-2270-405e-ad32-d73795bad0ff" containerName="storage" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.317822 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d80d4f-2270-405e-ad32-d73795bad0ff" containerName="storage" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.318558 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.324739 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.325151 4771 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-whtfj" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.326169 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.326408 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.336558 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9mkn7"] Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.430053 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds9bw\" (UniqueName: \"kubernetes.io/projected/e0578305-ee7b-4288-a223-67e1a438bfb3-kube-api-access-ds9bw\") pod \"crc-storage-crc-9mkn7\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.430152 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0578305-ee7b-4288-a223-67e1a438bfb3-node-mnt\") pod \"crc-storage-crc-9mkn7\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.430275 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0578305-ee7b-4288-a223-67e1a438bfb3-crc-storage\") pod \"crc-storage-crc-9mkn7\" (UID: 
\"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.450977 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d80d4f-2270-405e-ad32-d73795bad0ff" path="/var/lib/kubelet/pods/51d80d4f-2270-405e-ad32-d73795bad0ff/volumes" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.531181 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0578305-ee7b-4288-a223-67e1a438bfb3-crc-storage\") pod \"crc-storage-crc-9mkn7\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.531293 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds9bw\" (UniqueName: \"kubernetes.io/projected/e0578305-ee7b-4288-a223-67e1a438bfb3-kube-api-access-ds9bw\") pod \"crc-storage-crc-9mkn7\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.531412 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0578305-ee7b-4288-a223-67e1a438bfb3-node-mnt\") pod \"crc-storage-crc-9mkn7\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.531781 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0578305-ee7b-4288-a223-67e1a438bfb3-node-mnt\") pod \"crc-storage-crc-9mkn7\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.532703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/e0578305-ee7b-4288-a223-67e1a438bfb3-crc-storage\") pod \"crc-storage-crc-9mkn7\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.569621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds9bw\" (UniqueName: \"kubernetes.io/projected/e0578305-ee7b-4288-a223-67e1a438bfb3-kube-api-access-ds9bw\") pod \"crc-storage-crc-9mkn7\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.655439 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:16 crc kubenswrapper[4771]: I0219 22:49:16.975499 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9mkn7"] Feb 19 22:49:17 crc kubenswrapper[4771]: I0219 22:49:17.515424 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9mkn7" event={"ID":"e0578305-ee7b-4288-a223-67e1a438bfb3","Type":"ContainerStarted","Data":"3f959b6d46b77d212c4b798d43ecca33a233511b7ac92dbc0722cff519345792"} Feb 19 22:49:18 crc kubenswrapper[4771]: I0219 22:49:18.438315 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:49:18 crc kubenswrapper[4771]: E0219 22:49:18.439006 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:49:18 crc kubenswrapper[4771]: I0219 22:49:18.530421 4771 generic.go:334] "Generic 
(PLEG): container finished" podID="e0578305-ee7b-4288-a223-67e1a438bfb3" containerID="62b21f802fdae24335e58fb1fa81c160fc44195c109fca8037b6bcb20fa5276d" exitCode=0 Feb 19 22:49:18 crc kubenswrapper[4771]: I0219 22:49:18.530490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9mkn7" event={"ID":"e0578305-ee7b-4288-a223-67e1a438bfb3","Type":"ContainerDied","Data":"62b21f802fdae24335e58fb1fa81c160fc44195c109fca8037b6bcb20fa5276d"} Feb 19 22:49:19 crc kubenswrapper[4771]: I0219 22:49:19.952388 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:19 crc kubenswrapper[4771]: I0219 22:49:19.985356 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0578305-ee7b-4288-a223-67e1a438bfb3-node-mnt\") pod \"e0578305-ee7b-4288-a223-67e1a438bfb3\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " Feb 19 22:49:19 crc kubenswrapper[4771]: I0219 22:49:19.985454 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0578305-ee7b-4288-a223-67e1a438bfb3-crc-storage\") pod \"e0578305-ee7b-4288-a223-67e1a438bfb3\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " Feb 19 22:49:19 crc kubenswrapper[4771]: I0219 22:49:19.985494 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds9bw\" (UniqueName: \"kubernetes.io/projected/e0578305-ee7b-4288-a223-67e1a438bfb3-kube-api-access-ds9bw\") pod \"e0578305-ee7b-4288-a223-67e1a438bfb3\" (UID: \"e0578305-ee7b-4288-a223-67e1a438bfb3\") " Feb 19 22:49:19 crc kubenswrapper[4771]: I0219 22:49:19.985488 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0578305-ee7b-4288-a223-67e1a438bfb3-node-mnt" (OuterVolumeSpecName: "node-mnt") pod 
"e0578305-ee7b-4288-a223-67e1a438bfb3" (UID: "e0578305-ee7b-4288-a223-67e1a438bfb3"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 22:49:19 crc kubenswrapper[4771]: I0219 22:49:19.985807 4771 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e0578305-ee7b-4288-a223-67e1a438bfb3-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:20 crc kubenswrapper[4771]: I0219 22:49:20.015129 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0578305-ee7b-4288-a223-67e1a438bfb3-kube-api-access-ds9bw" (OuterVolumeSpecName: "kube-api-access-ds9bw") pod "e0578305-ee7b-4288-a223-67e1a438bfb3" (UID: "e0578305-ee7b-4288-a223-67e1a438bfb3"). InnerVolumeSpecName "kube-api-access-ds9bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:49:20 crc kubenswrapper[4771]: I0219 22:49:20.021261 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0578305-ee7b-4288-a223-67e1a438bfb3-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e0578305-ee7b-4288-a223-67e1a438bfb3" (UID: "e0578305-ee7b-4288-a223-67e1a438bfb3"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:49:20 crc kubenswrapper[4771]: I0219 22:49:20.087262 4771 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e0578305-ee7b-4288-a223-67e1a438bfb3-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:20 crc kubenswrapper[4771]: I0219 22:49:20.087313 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds9bw\" (UniqueName: \"kubernetes.io/projected/e0578305-ee7b-4288-a223-67e1a438bfb3-kube-api-access-ds9bw\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:20 crc kubenswrapper[4771]: I0219 22:49:20.551488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9mkn7" event={"ID":"e0578305-ee7b-4288-a223-67e1a438bfb3","Type":"ContainerDied","Data":"3f959b6d46b77d212c4b798d43ecca33a233511b7ac92dbc0722cff519345792"} Feb 19 22:49:20 crc kubenswrapper[4771]: I0219 22:49:20.551560 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9mkn7" Feb 19 22:49:20 crc kubenswrapper[4771]: I0219 22:49:20.551592 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f959b6d46b77d212c4b798d43ecca33a233511b7ac92dbc0722cff519345792" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.248544 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bv8zf"] Feb 19 22:49:31 crc kubenswrapper[4771]: E0219 22:49:31.249675 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0578305-ee7b-4288-a223-67e1a438bfb3" containerName="storage" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.249698 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0578305-ee7b-4288-a223-67e1a438bfb3" containerName="storage" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.249940 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0578305-ee7b-4288-a223-67e1a438bfb3" containerName="storage" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.253587 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.316902 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bv8zf"] Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.396431 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5nk8\" (UniqueName: \"kubernetes.io/projected/53220fff-7678-45d4-bbec-b8bb71857727-kube-api-access-n5nk8\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.396512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-catalog-content\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.396560 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-utilities\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.437496 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 22:49:31 crc kubenswrapper[4771]: E0219 22:49:31.437970 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.498620 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5nk8\" (UniqueName: \"kubernetes.io/projected/53220fff-7678-45d4-bbec-b8bb71857727-kube-api-access-n5nk8\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.498967 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-catalog-content\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.499169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-utilities\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.499572 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-catalog-content\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.499856 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-utilities\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.523947 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5nk8\" (UniqueName: \"kubernetes.io/projected/53220fff-7678-45d4-bbec-b8bb71857727-kube-api-access-n5nk8\") pod \"community-operators-bv8zf\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:31 crc kubenswrapper[4771]: I0219 22:49:31.636188 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:32 crc kubenswrapper[4771]: I0219 22:49:32.103732 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bv8zf"] Feb 19 22:49:32 crc kubenswrapper[4771]: I0219 22:49:32.673319 4771 generic.go:334] "Generic (PLEG): container finished" podID="53220fff-7678-45d4-bbec-b8bb71857727" containerID="be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09" exitCode=0 Feb 19 22:49:32 crc kubenswrapper[4771]: I0219 22:49:32.673385 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8zf" event={"ID":"53220fff-7678-45d4-bbec-b8bb71857727","Type":"ContainerDied","Data":"be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09"} Feb 19 22:49:32 crc kubenswrapper[4771]: I0219 22:49:32.673424 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8zf" event={"ID":"53220fff-7678-45d4-bbec-b8bb71857727","Type":"ContainerStarted","Data":"2ac201d4e481e1a890b3d1f78c94568157546c25420020a408c8bf6b1151cc88"} Feb 19 22:49:33 crc kubenswrapper[4771]: I0219 22:49:33.696404 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bv8zf" event={"ID":"53220fff-7678-45d4-bbec-b8bb71857727","Type":"ContainerStarted","Data":"692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd"} Feb 19 22:49:34 crc kubenswrapper[4771]: I0219 22:49:34.275804 4771 scope.go:117] "RemoveContainer" containerID="9c1e890673a69091be9378ffde69f13c4d3aa011167a46b5ac75277d30b54801" Feb 19 22:49:34 crc kubenswrapper[4771]: I0219 22:49:34.709690 4771 generic.go:334] "Generic (PLEG): container finished" podID="53220fff-7678-45d4-bbec-b8bb71857727" containerID="692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd" exitCode=0 Feb 19 22:49:34 crc kubenswrapper[4771]: I0219 22:49:34.709761 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8zf" event={"ID":"53220fff-7678-45d4-bbec-b8bb71857727","Type":"ContainerDied","Data":"692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd"} Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.028823 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8lxp5"] Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.030716 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.056256 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lxp5"] Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.159848 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-catalog-content\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.159925 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-utilities\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.160037 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dfr5\" (UniqueName: \"kubernetes.io/projected/50210691-51b9-4ab8-9b2c-9f2510688093-kube-api-access-7dfr5\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.261498 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-catalog-content\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.261553 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-utilities\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.261604 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dfr5\" (UniqueName: \"kubernetes.io/projected/50210691-51b9-4ab8-9b2c-9f2510688093-kube-api-access-7dfr5\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.262151 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-catalog-content\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.262258 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-utilities\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.287230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dfr5\" (UniqueName: \"kubernetes.io/projected/50210691-51b9-4ab8-9b2c-9f2510688093-kube-api-access-7dfr5\") pod \"redhat-operators-8lxp5\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.360343 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.575220 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lxp5"] Feb 19 22:49:35 crc kubenswrapper[4771]: W0219 22:49:35.585561 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50210691_51b9_4ab8_9b2c_9f2510688093.slice/crio-69d552eecd79778bdad0c970c74c819474d6fd6b92704797f5f78a8bc203884d WatchSource:0}: Error finding container 69d552eecd79778bdad0c970c74c819474d6fd6b92704797f5f78a8bc203884d: Status 404 returned error can't find the container with id 69d552eecd79778bdad0c970c74c819474d6fd6b92704797f5f78a8bc203884d Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.722223 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8zf" event={"ID":"53220fff-7678-45d4-bbec-b8bb71857727","Type":"ContainerStarted","Data":"cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30"} Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.724482 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxp5" event={"ID":"50210691-51b9-4ab8-9b2c-9f2510688093","Type":"ContainerStarted","Data":"69d552eecd79778bdad0c970c74c819474d6fd6b92704797f5f78a8bc203884d"} Feb 19 22:49:35 crc kubenswrapper[4771]: I0219 22:49:35.747333 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bv8zf" podStartSLOduration=2.314418908 podStartE2EDuration="4.74732004s" podCreationTimestamp="2026-02-19 22:49:31 +0000 UTC" firstStartedPulling="2026-02-19 22:49:32.676652385 +0000 UTC m=+4872.948094865" lastFinishedPulling="2026-02-19 22:49:35.109553517 +0000 UTC m=+4875.380995997" observedRunningTime="2026-02-19 22:49:35.746078528 +0000 UTC m=+4876.017521008" 
watchObservedRunningTime="2026-02-19 22:49:35.74732004 +0000 UTC m=+4876.018762500" Feb 19 22:49:36 crc kubenswrapper[4771]: I0219 22:49:36.735886 4771 generic.go:334] "Generic (PLEG): container finished" podID="50210691-51b9-4ab8-9b2c-9f2510688093" containerID="a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832" exitCode=0 Feb 19 22:49:36 crc kubenswrapper[4771]: I0219 22:49:36.736006 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxp5" event={"ID":"50210691-51b9-4ab8-9b2c-9f2510688093","Type":"ContainerDied","Data":"a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832"} Feb 19 22:49:37 crc kubenswrapper[4771]: I0219 22:49:37.749916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxp5" event={"ID":"50210691-51b9-4ab8-9b2c-9f2510688093","Type":"ContainerStarted","Data":"6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131"} Feb 19 22:49:38 crc kubenswrapper[4771]: I0219 22:49:38.758425 4771 generic.go:334] "Generic (PLEG): container finished" podID="50210691-51b9-4ab8-9b2c-9f2510688093" containerID="6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131" exitCode=0 Feb 19 22:49:38 crc kubenswrapper[4771]: I0219 22:49:38.758476 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxp5" event={"ID":"50210691-51b9-4ab8-9b2c-9f2510688093","Type":"ContainerDied","Data":"6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131"} Feb 19 22:49:39 crc kubenswrapper[4771]: I0219 22:49:39.771732 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxp5" event={"ID":"50210691-51b9-4ab8-9b2c-9f2510688093","Type":"ContainerStarted","Data":"ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb"} Feb 19 22:49:39 crc kubenswrapper[4771]: I0219 22:49:39.797612 4771 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-8lxp5" podStartSLOduration=2.383654412 podStartE2EDuration="4.797586778s" podCreationTimestamp="2026-02-19 22:49:35 +0000 UTC" firstStartedPulling="2026-02-19 22:49:36.738229544 +0000 UTC m=+4877.009672014" lastFinishedPulling="2026-02-19 22:49:39.15216191 +0000 UTC m=+4879.423604380" observedRunningTime="2026-02-19 22:49:39.795727449 +0000 UTC m=+4880.067169959" watchObservedRunningTime="2026-02-19 22:49:39.797586778 +0000 UTC m=+4880.069029288" Feb 19 22:49:41 crc kubenswrapper[4771]: I0219 22:49:41.641667 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:41 crc kubenswrapper[4771]: I0219 22:49:41.642086 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:41 crc kubenswrapper[4771]: I0219 22:49:41.711407 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:41 crc kubenswrapper[4771]: I0219 22:49:41.863273 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:42 crc kubenswrapper[4771]: I0219 22:49:42.624338 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bv8zf"] Feb 19 22:49:43 crc kubenswrapper[4771]: I0219 22:49:43.824893 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bv8zf" podUID="53220fff-7678-45d4-bbec-b8bb71857727" containerName="registry-server" containerID="cri-o://cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30" gracePeriod=2 Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.437786 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc" Feb 19 
22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.805398 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.839214 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"b8d83d2f9b637a931ed9fb55168bd29dffbae091a62af39ff035a8fae4582da4"} Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.851737 4771 generic.go:334] "Generic (PLEG): container finished" podID="53220fff-7678-45d4-bbec-b8bb71857727" containerID="cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30" exitCode=0 Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.851805 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bv8zf" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.851807 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8zf" event={"ID":"53220fff-7678-45d4-bbec-b8bb71857727","Type":"ContainerDied","Data":"cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30"} Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.851899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8zf" event={"ID":"53220fff-7678-45d4-bbec-b8bb71857727","Type":"ContainerDied","Data":"2ac201d4e481e1a890b3d1f78c94568157546c25420020a408c8bf6b1151cc88"} Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.851930 4771 scope.go:117] "RemoveContainer" containerID="cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.876745 4771 scope.go:117] "RemoveContainer" containerID="692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd" Feb 19 
22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.905062 4771 scope.go:117] "RemoveContainer" containerID="be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.920096 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-catalog-content\") pod \"53220fff-7678-45d4-bbec-b8bb71857727\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.920153 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5nk8\" (UniqueName: \"kubernetes.io/projected/53220fff-7678-45d4-bbec-b8bb71857727-kube-api-access-n5nk8\") pod \"53220fff-7678-45d4-bbec-b8bb71857727\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.920237 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-utilities\") pod \"53220fff-7678-45d4-bbec-b8bb71857727\" (UID: \"53220fff-7678-45d4-bbec-b8bb71857727\") " Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.922843 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-utilities" (OuterVolumeSpecName: "utilities") pod "53220fff-7678-45d4-bbec-b8bb71857727" (UID: "53220fff-7678-45d4-bbec-b8bb71857727"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.927229 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53220fff-7678-45d4-bbec-b8bb71857727-kube-api-access-n5nk8" (OuterVolumeSpecName: "kube-api-access-n5nk8") pod "53220fff-7678-45d4-bbec-b8bb71857727" (UID: "53220fff-7678-45d4-bbec-b8bb71857727"). InnerVolumeSpecName "kube-api-access-n5nk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.933895 4771 scope.go:117] "RemoveContainer" containerID="cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30" Feb 19 22:49:44 crc kubenswrapper[4771]: E0219 22:49:44.934407 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30\": container with ID starting with cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30 not found: ID does not exist" containerID="cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.934442 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30"} err="failed to get container status \"cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30\": rpc error: code = NotFound desc = could not find container \"cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30\": container with ID starting with cc0c92e2c85bc4ba591fdaf9bcf88ae21ff9a09f1f60c706c1bafd6f46e6dd30 not found: ID does not exist" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.934465 4771 scope.go:117] "RemoveContainer" containerID="692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd" Feb 19 22:49:44 crc kubenswrapper[4771]: E0219 22:49:44.934765 
4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd\": container with ID starting with 692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd not found: ID does not exist" containerID="692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.934783 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd"} err="failed to get container status \"692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd\": rpc error: code = NotFound desc = could not find container \"692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd\": container with ID starting with 692714f90591fb42928fee8b2c6cc79a983cc4233ac880c72981717a5af9b7cd not found: ID does not exist" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.934795 4771 scope.go:117] "RemoveContainer" containerID="be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09" Feb 19 22:49:44 crc kubenswrapper[4771]: E0219 22:49:44.935884 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09\": container with ID starting with be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09 not found: ID does not exist" containerID="be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.935905 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09"} err="failed to get container status \"be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09\": rpc error: code = 
NotFound desc = could not find container \"be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09\": container with ID starting with be3771aaaa960a7e696932253c6ed385fbfdcd65491b57a8ed39c915ae0f0c09 not found: ID does not exist" Feb 19 22:49:44 crc kubenswrapper[4771]: I0219 22:49:44.992651 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53220fff-7678-45d4-bbec-b8bb71857727" (UID: "53220fff-7678-45d4-bbec-b8bb71857727"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:49:45 crc kubenswrapper[4771]: I0219 22:49:45.021477 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:45 crc kubenswrapper[4771]: I0219 22:49:45.021512 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5nk8\" (UniqueName: \"kubernetes.io/projected/53220fff-7678-45d4-bbec-b8bb71857727-kube-api-access-n5nk8\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:45 crc kubenswrapper[4771]: I0219 22:49:45.021525 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53220fff-7678-45d4-bbec-b8bb71857727-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:45 crc kubenswrapper[4771]: I0219 22:49:45.191716 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bv8zf"] Feb 19 22:49:45 crc kubenswrapper[4771]: I0219 22:49:45.196075 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bv8zf"] Feb 19 22:49:45 crc kubenswrapper[4771]: I0219 22:49:45.360667 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:45 crc kubenswrapper[4771]: I0219 22:49:45.360770 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:46 crc kubenswrapper[4771]: I0219 22:49:46.415765 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8lxp5" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" containerName="registry-server" probeResult="failure" output=< Feb 19 22:49:46 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 22:49:46 crc kubenswrapper[4771]: > Feb 19 22:49:46 crc kubenswrapper[4771]: I0219 22:49:46.447361 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53220fff-7678-45d4-bbec-b8bb71857727" path="/var/lib/kubelet/pods/53220fff-7678-45d4-bbec-b8bb71857727/volumes" Feb 19 22:49:55 crc kubenswrapper[4771]: I0219 22:49:55.437203 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:55 crc kubenswrapper[4771]: I0219 22:49:55.525841 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:55 crc kubenswrapper[4771]: I0219 22:49:55.686656 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lxp5"] Feb 19 22:49:56 crc kubenswrapper[4771]: I0219 22:49:56.979549 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8lxp5" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" containerName="registry-server" containerID="cri-o://ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb" gracePeriod=2 Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.508838 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.632794 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-utilities\") pod \"50210691-51b9-4ab8-9b2c-9f2510688093\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.632941 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-catalog-content\") pod \"50210691-51b9-4ab8-9b2c-9f2510688093\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.632990 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dfr5\" (UniqueName: \"kubernetes.io/projected/50210691-51b9-4ab8-9b2c-9f2510688093-kube-api-access-7dfr5\") pod \"50210691-51b9-4ab8-9b2c-9f2510688093\" (UID: \"50210691-51b9-4ab8-9b2c-9f2510688093\") " Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.634369 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-utilities" (OuterVolumeSpecName: "utilities") pod "50210691-51b9-4ab8-9b2c-9f2510688093" (UID: "50210691-51b9-4ab8-9b2c-9f2510688093"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.643145 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50210691-51b9-4ab8-9b2c-9f2510688093-kube-api-access-7dfr5" (OuterVolumeSpecName: "kube-api-access-7dfr5") pod "50210691-51b9-4ab8-9b2c-9f2510688093" (UID: "50210691-51b9-4ab8-9b2c-9f2510688093"). InnerVolumeSpecName "kube-api-access-7dfr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.734546 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.734908 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dfr5\" (UniqueName: \"kubernetes.io/projected/50210691-51b9-4ab8-9b2c-9f2510688093-kube-api-access-7dfr5\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.806635 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50210691-51b9-4ab8-9b2c-9f2510688093" (UID: "50210691-51b9-4ab8-9b2c-9f2510688093"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.836194 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50210691-51b9-4ab8-9b2c-9f2510688093-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.990994 4771 generic.go:334] "Generic (PLEG): container finished" podID="50210691-51b9-4ab8-9b2c-9f2510688093" containerID="ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb" exitCode=0 Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.991107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxp5" event={"ID":"50210691-51b9-4ab8-9b2c-9f2510688093","Type":"ContainerDied","Data":"ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb"} Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.991193 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-8lxp5" event={"ID":"50210691-51b9-4ab8-9b2c-9f2510688093","Type":"ContainerDied","Data":"69d552eecd79778bdad0c970c74c819474d6fd6b92704797f5f78a8bc203884d"} Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.991236 4771 scope.go:117] "RemoveContainer" containerID="ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb" Feb 19 22:49:57 crc kubenswrapper[4771]: I0219 22:49:57.991921 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lxp5" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.027160 4771 scope.go:117] "RemoveContainer" containerID="6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.051127 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lxp5"] Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.061800 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8lxp5"] Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.074532 4771 scope.go:117] "RemoveContainer" containerID="a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.100061 4771 scope.go:117] "RemoveContainer" containerID="ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb" Feb 19 22:49:58 crc kubenswrapper[4771]: E0219 22:49:58.100692 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb\": container with ID starting with ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb not found: ID does not exist" containerID="ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.100751 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb"} err="failed to get container status \"ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb\": rpc error: code = NotFound desc = could not find container \"ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb\": container with ID starting with ee9c43489a32849a71084f937c3509f8552a9bf87f69550a6bca6059a31028cb not found: ID does not exist" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.100790 4771 scope.go:117] "RemoveContainer" containerID="6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131" Feb 19 22:49:58 crc kubenswrapper[4771]: E0219 22:49:58.101233 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131\": container with ID starting with 6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131 not found: ID does not exist" containerID="6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.101308 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131"} err="failed to get container status \"6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131\": rpc error: code = NotFound desc = could not find container \"6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131\": container with ID starting with 6eacbf1ccd9bf278b2955abdc8d00f853877f46ff8b6f4d1336faae56a8ff131 not found: ID does not exist" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.101376 4771 scope.go:117] "RemoveContainer" containerID="a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832" Feb 19 22:49:58 crc kubenswrapper[4771]: E0219 
22:49:58.101810 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832\": container with ID starting with a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832 not found: ID does not exist" containerID="a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.101861 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832"} err="failed to get container status \"a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832\": rpc error: code = NotFound desc = could not find container \"a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832\": container with ID starting with a83d8253f798d52f62f7ef5b961e647eb3e5c3ce6616bbd6167212ba4ea72832 not found: ID does not exist" Feb 19 22:49:58 crc kubenswrapper[4771]: I0219 22:49:58.452396 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" path="/var/lib/kubelet/pods/50210691-51b9-4ab8-9b2c-9f2510688093/volumes" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.068342 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-2vg2t"] Feb 19 22:51:23 crc kubenswrapper[4771]: E0219 22:51:23.069263 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53220fff-7678-45d4-bbec-b8bb71857727" containerName="extract-content" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.069277 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="53220fff-7678-45d4-bbec-b8bb71857727" containerName="extract-content" Feb 19 22:51:23 crc kubenswrapper[4771]: E0219 22:51:23.069295 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" 
containerName="registry-server" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.069302 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" containerName="registry-server" Feb 19 22:51:23 crc kubenswrapper[4771]: E0219 22:51:23.069318 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53220fff-7678-45d4-bbec-b8bb71857727" containerName="extract-utilities" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.069326 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="53220fff-7678-45d4-bbec-b8bb71857727" containerName="extract-utilities" Feb 19 22:51:23 crc kubenswrapper[4771]: E0219 22:51:23.069343 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53220fff-7678-45d4-bbec-b8bb71857727" containerName="registry-server" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.069351 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="53220fff-7678-45d4-bbec-b8bb71857727" containerName="registry-server" Feb 19 22:51:23 crc kubenswrapper[4771]: E0219 22:51:23.069367 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" containerName="extract-utilities" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.069375 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" containerName="extract-utilities" Feb 19 22:51:23 crc kubenswrapper[4771]: E0219 22:51:23.069387 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" containerName="extract-content" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.069393 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" containerName="extract-content" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.069522 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="50210691-51b9-4ab8-9b2c-9f2510688093" 
containerName="registry-server" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.069537 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="53220fff-7678-45d4-bbec-b8bb71857727" containerName="registry-server" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.070253 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.072636 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rl4mp" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.073288 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.073758 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.074534 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.081225 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-2vg2t"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.087672 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-wrnjj"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.088743 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.090916 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.109147 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-wrnjj"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.170391 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxt9\" (UniqueName: \"kubernetes.io/projected/e7757420-685d-4ba0-953a-51e08ad9989e-kube-api-access-vgxt9\") pod \"dnsmasq-dns-6f98b88745-2vg2t\" (UID: \"e7757420-685d-4ba0-953a-51e08ad9989e\") " pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.170443 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-dns-svc\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.170476 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nndvb\" (UniqueName: \"kubernetes.io/projected/d6909792-9596-4ab1-a8f3-2597f966d3f3-kube-api-access-nndvb\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.170512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7757420-685d-4ba0-953a-51e08ad9989e-config\") pod \"dnsmasq-dns-6f98b88745-2vg2t\" (UID: \"e7757420-685d-4ba0-953a-51e08ad9989e\") " 
pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.170531 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-config\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.271372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-dns-svc\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.271479 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nndvb\" (UniqueName: \"kubernetes.io/projected/d6909792-9596-4ab1-a8f3-2597f966d3f3-kube-api-access-nndvb\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.271532 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7757420-685d-4ba0-953a-51e08ad9989e-config\") pod \"dnsmasq-dns-6f98b88745-2vg2t\" (UID: \"e7757420-685d-4ba0-953a-51e08ad9989e\") " pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.271557 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-config\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 
22:51:23.271672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxt9\" (UniqueName: \"kubernetes.io/projected/e7757420-685d-4ba0-953a-51e08ad9989e-kube-api-access-vgxt9\") pod \"dnsmasq-dns-6f98b88745-2vg2t\" (UID: \"e7757420-685d-4ba0-953a-51e08ad9989e\") " pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.272250 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-dns-svc\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.272428 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-config\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.272628 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7757420-685d-4ba0-953a-51e08ad9989e-config\") pod \"dnsmasq-dns-6f98b88745-2vg2t\" (UID: \"e7757420-685d-4ba0-953a-51e08ad9989e\") " pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.362425 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nndvb\" (UniqueName: \"kubernetes.io/projected/d6909792-9596-4ab1-a8f3-2597f966d3f3-kube-api-access-nndvb\") pod \"dnsmasq-dns-9d69655f7-wrnjj\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") " pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.362510 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vgxt9\" (UniqueName: \"kubernetes.io/projected/e7757420-685d-4ba0-953a-51e08ad9989e-kube-api-access-vgxt9\") pod \"dnsmasq-dns-6f98b88745-2vg2t\" (UID: \"e7757420-685d-4ba0-953a-51e08ad9989e\") " pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.388358 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.405450 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.484391 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-2vg2t"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.514636 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-wbs8c"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.515973 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.545431 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-wbs8c"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.590753 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-config\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.590877 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnm48\" (UniqueName: \"kubernetes.io/projected/c610bccd-c798-4fd9-9130-3b99b475b768-kube-api-access-tnm48\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.590956 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.691793 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-config\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.691868 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnm48\" (UniqueName: 
\"kubernetes.io/projected/c610bccd-c798-4fd9-9130-3b99b475b768-kube-api-access-tnm48\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.691938 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.692792 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.693274 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-config\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.727532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnm48\" (UniqueName: \"kubernetes.io/projected/c610bccd-c798-4fd9-9130-3b99b475b768-kube-api-access-tnm48\") pod \"dnsmasq-dns-7c4c8f55b5-wbs8c\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.790622 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-wbs8c"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.791983 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.826337 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-v75gp"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.832249 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.838686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-v75gp"] Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.897305 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-dns-svc\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.897339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdcvw\" (UniqueName: \"kubernetes.io/projected/06a90808-2706-47f6-8021-a02a43b62284-kube-api-access-qdcvw\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.897423 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-config\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.998933 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-config\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.999079 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-dns-svc\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:23 crc kubenswrapper[4771]: I0219 22:51:23.999102 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdcvw\" (UniqueName: \"kubernetes.io/projected/06a90808-2706-47f6-8021-a02a43b62284-kube-api-access-qdcvw\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.000143 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-config\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.000648 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-dns-svc\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:24 crc kubenswrapper[4771]: W0219 22:51:24.004436 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7757420_685d_4ba0_953a_51e08ad9989e.slice/crio-2643ef3b74dc42459113e3084b0cb0d65bcebb078833c6382fc480eb42987cd7 WatchSource:0}: Error finding container 2643ef3b74dc42459113e3084b0cb0d65bcebb078833c6382fc480eb42987cd7: Status 404 returned error can't find the container with id 2643ef3b74dc42459113e3084b0cb0d65bcebb078833c6382fc480eb42987cd7 Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.005069 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-2vg2t"] Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.017812 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdcvw\" (UniqueName: \"kubernetes.io/projected/06a90808-2706-47f6-8021-a02a43b62284-kube-api-access-qdcvw\") pod \"dnsmasq-dns-589cf688cc-v75gp\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:24 crc kubenswrapper[4771]: W0219 22:51:24.045772 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6909792_9596_4ab1_a8f3_2597f966d3f3.slice/crio-612676c00959b92ea363913706989430973cdac2447e7d64243fa5a3125c4277 WatchSource:0}: Error finding container 612676c00959b92ea363913706989430973cdac2447e7d64243fa5a3125c4277: Status 404 returned error can't find the container with id 612676c00959b92ea363913706989430973cdac2447e7d64243fa5a3125c4277 Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.049097 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-wrnjj"] Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.200792 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.302451 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-wbs8c"] Feb 19 22:51:24 crc kubenswrapper[4771]: W0219 22:51:24.326277 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc610bccd_c798_4fd9_9130_3b99b475b768.slice/crio-8f2fd29c72b66e2d185ac7a5fd4bc20a012e408a5f0684ebd4a08c8910e379e9 WatchSource:0}: Error finding container 8f2fd29c72b66e2d185ac7a5fd4bc20a012e408a5f0684ebd4a08c8910e379e9: Status 404 returned error can't find the container with id 8f2fd29c72b66e2d185ac7a5fd4bc20a012e408a5f0684ebd4a08c8910e379e9 Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.645813 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.651129 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-v75gp"] Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.651304 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.656078 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.656287 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.656404 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.656506 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.656661 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.656792 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.658942 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ffrtl" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.665608 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722737 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722803 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/dc52c95e-217f-4d63-9825-d6496a419b5d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722852 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722911 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc52c95e-217f-4d63-9825-d6496a419b5d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722944 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.722994 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.723014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnnwz\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-kube-api-access-qnnwz\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.723050 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-config-data\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.764508 4771 generic.go:334] "Generic (PLEG): container finished" podID="c610bccd-c798-4fd9-9130-3b99b475b768" 
containerID="edd7171182a809d6f1e7cd69ab6c67b33dcf081271147a45555645aea83fd385" exitCode=0 Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.764592 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" event={"ID":"c610bccd-c798-4fd9-9130-3b99b475b768","Type":"ContainerDied","Data":"edd7171182a809d6f1e7cd69ab6c67b33dcf081271147a45555645aea83fd385"} Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.764619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" event={"ID":"c610bccd-c798-4fd9-9130-3b99b475b768","Type":"ContainerStarted","Data":"8f2fd29c72b66e2d185ac7a5fd4bc20a012e408a5f0684ebd4a08c8910e379e9"} Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.767175 4771 generic.go:334] "Generic (PLEG): container finished" podID="e7757420-685d-4ba0-953a-51e08ad9989e" containerID="b434f448e496858024c0671c60d4aa9ab14b4aa4883813467c4adaa6e7a0b170" exitCode=0 Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.767379 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" event={"ID":"e7757420-685d-4ba0-953a-51e08ad9989e","Type":"ContainerDied","Data":"b434f448e496858024c0671c60d4aa9ab14b4aa4883813467c4adaa6e7a0b170"} Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.767558 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" event={"ID":"e7757420-685d-4ba0-953a-51e08ad9989e","Type":"ContainerStarted","Data":"2643ef3b74dc42459113e3084b0cb0d65bcebb078833c6382fc480eb42987cd7"} Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.769241 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" event={"ID":"06a90808-2706-47f6-8021-a02a43b62284","Type":"ContainerStarted","Data":"440f355c15608517ffc6736e4850f28d17e635d025c03f5815f497a700bd1eac"} Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.770687 4771 generic.go:334] 
"Generic (PLEG): container finished" podID="d6909792-9596-4ab1-a8f3-2597f966d3f3" containerID="a25e0c519348ea9bcc9827a62fbd23efb309305155f524e6f69fa8a20f72e44e" exitCode=0 Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.770727 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" event={"ID":"d6909792-9596-4ab1-a8f3-2597f966d3f3","Type":"ContainerDied","Data":"a25e0c519348ea9bcc9827a62fbd23efb309305155f524e6f69fa8a20f72e44e"} Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.770749 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" event={"ID":"d6909792-9596-4ab1-a8f3-2597f966d3f3","Type":"ContainerStarted","Data":"612676c00959b92ea363913706989430973cdac2447e7d64243fa5a3125c4277"} Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.824846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.824886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc52c95e-217f-4d63-9825-d6496a419b5d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.824939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.824978 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.825001 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.825036 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnnwz\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-kube-api-access-qnnwz\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.825068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-config-data\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.825088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.825118 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/dc52c95e-217f-4d63-9825-d6496a419b5d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.825161 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.825184 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.826055 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.826609 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-config-data\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.828312 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 
19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.829151 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.829361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.831322 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.831350 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f608a103ac0cb6b5cf94070de4d632eb9f754ab75e04d34660913feed56de4e/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.869543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.870344 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.870903 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc52c95e-217f-4d63-9825-d6496a419b5d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.871014 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc52c95e-217f-4d63-9825-d6496a419b5d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.942781 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.943925 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.949659 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.949952 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.950148 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.950198 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.950408 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.950520 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.950678 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4d55v" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.959755 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.965188 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnnwz\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-kube-api-access-qnnwz\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:24 crc kubenswrapper[4771]: I0219 22:51:24.983810 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"rabbitmq-server-0\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " pod="openstack/rabbitmq-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030052 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030125 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030166 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507e947c-a8b3-49e6-b84d-6a5df734f2b1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030206 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/507e947c-a8b3-49e6-b84d-6a5df734f2b1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030228 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030257 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030286 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030320 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvgb2\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-kube-api-access-mvgb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030344 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.030360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.132502 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133354 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133376 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvgb2\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-kube-api-access-mvgb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133405 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133420 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133455 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133528 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507e947c-a8b3-49e6-b84d-6a5df734f2b1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: 
I0219 22:51:25.133567 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507e947c-a8b3-49e6-b84d-6a5df734f2b1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133580 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.133599 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.134065 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.134711 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.135301 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.135792 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.138148 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.138961 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.138991 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d7b6b4c63064f416101e47cc765fa4b1ca43546ea2c8655906aabf5711abe664/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.148456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.148490 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507e947c-a8b3-49e6-b84d-6a5df734f2b1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.148614 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507e947c-a8b3-49e6-b84d-6a5df734f2b1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.150050 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvgb2\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-kube-api-access-mvgb2\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.151792 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.155789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.177627 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.187330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"rabbitmq-cell1-server-0\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.234801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-config\") pod \"c610bccd-c798-4fd9-9130-3b99b475b768\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.234866 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnm48\" (UniqueName: \"kubernetes.io/projected/c610bccd-c798-4fd9-9130-3b99b475b768-kube-api-access-tnm48\") pod \"c610bccd-c798-4fd9-9130-3b99b475b768\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " Feb 19 22:51:25 
crc kubenswrapper[4771]: I0219 22:51:25.234891 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7757420-685d-4ba0-953a-51e08ad9989e-config\") pod \"e7757420-685d-4ba0-953a-51e08ad9989e\" (UID: \"e7757420-685d-4ba0-953a-51e08ad9989e\") " Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.234913 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgxt9\" (UniqueName: \"kubernetes.io/projected/e7757420-685d-4ba0-953a-51e08ad9989e-kube-api-access-vgxt9\") pod \"e7757420-685d-4ba0-953a-51e08ad9989e\" (UID: \"e7757420-685d-4ba0-953a-51e08ad9989e\") " Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.234947 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-dns-svc\") pod \"c610bccd-c798-4fd9-9130-3b99b475b768\" (UID: \"c610bccd-c798-4fd9-9130-3b99b475b768\") " Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.259399 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c610bccd-c798-4fd9-9130-3b99b475b768" (UID: "c610bccd-c798-4fd9-9130-3b99b475b768"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.259557 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7757420-685d-4ba0-953a-51e08ad9989e-config" (OuterVolumeSpecName: "config") pod "e7757420-685d-4ba0-953a-51e08ad9989e" (UID: "e7757420-685d-4ba0-953a-51e08ad9989e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.259606 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7757420-685d-4ba0-953a-51e08ad9989e-kube-api-access-vgxt9" (OuterVolumeSpecName: "kube-api-access-vgxt9") pod "e7757420-685d-4ba0-953a-51e08ad9989e" (UID: "e7757420-685d-4ba0-953a-51e08ad9989e"). InnerVolumeSpecName "kube-api-access-vgxt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.263579 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-config" (OuterVolumeSpecName: "config") pod "c610bccd-c798-4fd9-9130-3b99b475b768" (UID: "c610bccd-c798-4fd9-9130-3b99b475b768"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.265604 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c610bccd-c798-4fd9-9130-3b99b475b768-kube-api-access-tnm48" (OuterVolumeSpecName: "kube-api-access-tnm48") pod "c610bccd-c798-4fd9-9130-3b99b475b768" (UID: "c610bccd-c798-4fd9-9130-3b99b475b768"). InnerVolumeSpecName "kube-api-access-tnm48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.337083 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.337370 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnm48\" (UniqueName: \"kubernetes.io/projected/c610bccd-c798-4fd9-9130-3b99b475b768-kube-api-access-tnm48\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.337382 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7757420-685d-4ba0-953a-51e08ad9989e-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.337392 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgxt9\" (UniqueName: \"kubernetes.io/projected/e7757420-685d-4ba0-953a-51e08ad9989e-kube-api-access-vgxt9\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.337400 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c610bccd-c798-4fd9-9130-3b99b475b768-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.450422 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.711035 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:51:25 crc kubenswrapper[4771]: E0219 22:51:25.711688 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7757420-685d-4ba0-953a-51e08ad9989e" containerName="init" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.711700 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7757420-685d-4ba0-953a-51e08ad9989e" containerName="init" Feb 19 22:51:25 crc kubenswrapper[4771]: E0219 22:51:25.711717 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c610bccd-c798-4fd9-9130-3b99b475b768" containerName="init" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.711723 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c610bccd-c798-4fd9-9130-3b99b475b768" containerName="init" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.711845 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7757420-685d-4ba0-953a-51e08ad9989e" containerName="init" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.711863 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c610bccd-c798-4fd9-9130-3b99b475b768" containerName="init" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.712647 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.716758 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.717507 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wvd6f" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.717791 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.723517 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.725824 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.727117 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.741302 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-config-data-default\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.741470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcrcj\" (UniqueName: \"kubernetes.io/projected/01b94015-d7c8-4ce0-ae61-aa68065a4c85-kube-api-access-vcrcj\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.741533 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-kolla-config\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.741610 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2246d976-e277-4c93-a478-b47041eea285\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2246d976-e277-4c93-a478-b47041eea285\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.741698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-operator-scripts\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.741746 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b94015-d7c8-4ce0-ae61-aa68065a4c85-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.741784 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01b94015-d7c8-4ce0-ae61-aa68065a4c85-config-data-generated\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.741851 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b94015-d7c8-4ce0-ae61-aa68065a4c85-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.788231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" event={"ID":"d6909792-9596-4ab1-a8f3-2597f966d3f3","Type":"ContainerStarted","Data":"ea84cb17ca1b1da48373d84601b08f82b6019d4559928bb3ca7958e4e7ea22e5"} Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.788409 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.791705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" event={"ID":"c610bccd-c798-4fd9-9130-3b99b475b768","Type":"ContainerDied","Data":"8f2fd29c72b66e2d185ac7a5fd4bc20a012e408a5f0684ebd4a08c8910e379e9"} Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.791761 4771 scope.go:117] "RemoveContainer" containerID="edd7171182a809d6f1e7cd69ab6c67b33dcf081271147a45555645aea83fd385" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.791884 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-wbs8c" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.799605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" event={"ID":"e7757420-685d-4ba0-953a-51e08ad9989e","Type":"ContainerDied","Data":"2643ef3b74dc42459113e3084b0cb0d65bcebb078833c6382fc480eb42987cd7"} Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.799614 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f98b88745-2vg2t" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.801247 4771 generic.go:334] "Generic (PLEG): container finished" podID="06a90808-2706-47f6-8021-a02a43b62284" containerID="8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f" exitCode=0 Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.801292 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" event={"ID":"06a90808-2706-47f6-8021-a02a43b62284","Type":"ContainerDied","Data":"8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f"} Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.820844 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" podStartSLOduration=2.820822905 podStartE2EDuration="2.820822905s" podCreationTimestamp="2026-02-19 22:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:25.818536185 +0000 UTC m=+4986.089978665" watchObservedRunningTime="2026-02-19 22:51:25.820822905 +0000 UTC m=+4986.092265385" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.824600 4771 scope.go:117] "RemoveContainer" containerID="b434f448e496858024c0671c60d4aa9ab14b4aa4883813467c4adaa6e7a0b170" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.846883 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-kolla-config\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.846945 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2246d976-e277-4c93-a478-b47041eea285\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2246d976-e277-4c93-a478-b47041eea285\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.846994 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-operator-scripts\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.847099 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b94015-d7c8-4ce0-ae61-aa68065a4c85-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.847119 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01b94015-d7c8-4ce0-ae61-aa68065a4c85-config-data-generated\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.847171 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b94015-d7c8-4ce0-ae61-aa68065a4c85-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.847191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.847257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcrcj\" (UniqueName: \"kubernetes.io/projected/01b94015-d7c8-4ce0-ae61-aa68065a4c85-kube-api-access-vcrcj\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.848618 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01b94015-d7c8-4ce0-ae61-aa68065a4c85-config-data-generated\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.849252 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-kolla-config\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.857378 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.857408 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2246d976-e277-4c93-a478-b47041eea285\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2246d976-e277-4c93-a478-b47041eea285\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7c6bc1989735ca99b352eabc09e75505fbb33db6c4f9d7e1c8953b4f0edc97f1/globalmount\"" pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.859456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-operator-scripts\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.860057 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01b94015-d7c8-4ce0-ae61-aa68065a4c85-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.862888 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01b94015-d7c8-4ce0-ae61-aa68065a4c85-config-data-default\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.867817 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b94015-d7c8-4ce0-ae61-aa68065a4c85-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.871288 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcrcj\" (UniqueName: \"kubernetes.io/projected/01b94015-d7c8-4ce0-ae61-aa68065a4c85-kube-api-access-vcrcj\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.896928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2246d976-e277-4c93-a478-b47041eea285\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2246d976-e277-4c93-a478-b47041eea285\") pod \"openstack-galera-0\" (UID: \"01b94015-d7c8-4ce0-ae61-aa68065a4c85\") " pod="openstack/openstack-galera-0" Feb 19 22:51:25 crc kubenswrapper[4771]: I0219 22:51:25.969806 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-wbs8c"] Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.022560 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-wbs8c"] Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.035157 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-2vg2t"] Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.038044 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-2vg2t"] Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.047247 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.233161 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.273875 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:51:26 crc kubenswrapper[4771]: W0219 22:51:26.295825 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod507e947c_a8b3_49e6_b84d_6a5df734f2b1.slice/crio-029c1b222ef8516403e8703d6649dfa41d9d46daa24031fd14bd1a991fba7892 WatchSource:0}: Error finding container 029c1b222ef8516403e8703d6649dfa41d9d46daa24031fd14bd1a991fba7892: Status 404 returned error can't find the container with id 029c1b222ef8516403e8703d6649dfa41d9d46daa24031fd14bd1a991fba7892 Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.445712 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c610bccd-c798-4fd9-9130-3b99b475b768" path="/var/lib/kubelet/pods/c610bccd-c798-4fd9-9130-3b99b475b768/volumes" Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.446845 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7757420-685d-4ba0-953a-51e08ad9989e" path="/var/lib/kubelet/pods/e7757420-685d-4ba0-953a-51e08ad9989e/volumes" Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.572490 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.817219 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc52c95e-217f-4d63-9825-d6496a419b5d","Type":"ContainerStarted","Data":"e3b43f6d1f9acdb9bd913cb40349e7b504a1118810c36bc263d9249dfde0eaf1"} Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.820772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-589cf688cc-v75gp" event={"ID":"06a90808-2706-47f6-8021-a02a43b62284","Type":"ContainerStarted","Data":"c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0"} Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.821001 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.822111 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507e947c-a8b3-49e6-b84d-6a5df734f2b1","Type":"ContainerStarted","Data":"029c1b222ef8516403e8703d6649dfa41d9d46daa24031fd14bd1a991fba7892"} Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.824802 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"01b94015-d7c8-4ce0-ae61-aa68065a4c85","Type":"ContainerStarted","Data":"2b3e161170cd6960037ffeb5fad242fb5b7e65567ceae63f91fd49d0c4ac318f"} Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.824857 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"01b94015-d7c8-4ce0-ae61-aa68065a4c85","Type":"ContainerStarted","Data":"6d7c2ccd4419f01b5aa9dfc7b4c2bbc714fd1a5f1015789e6cdbcfa7bdcc86f8"} Feb 19 22:51:26 crc kubenswrapper[4771]: I0219 22:51:26.851125 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" podStartSLOduration=3.8511030760000002 podStartE2EDuration="3.851103076s" podCreationTimestamp="2026-02-19 22:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:26.846557595 +0000 UTC m=+4987.118000105" watchObservedRunningTime="2026-02-19 22:51:26.851103076 +0000 UTC m=+4987.122545536" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.240740 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.242073 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.244342 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.245736 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.245760 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-flqdp" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.245819 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.258801 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.278199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/36dec848-d1c8-4567-be43-2e6f0e10db93-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.278249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.278281 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq9s2\" (UniqueName: \"kubernetes.io/projected/36dec848-d1c8-4567-be43-2e6f0e10db93-kube-api-access-kq9s2\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.278321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-20274456-5492-49bb-b35b-914603ec935c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20274456-5492-49bb-b35b-914603ec935c\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.278394 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.278444 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/36dec848-d1c8-4567-be43-2e6f0e10db93-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.278469 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 
22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.278498 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36dec848-d1c8-4567-be43-2e6f0e10db93-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.379968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36dec848-d1c8-4567-be43-2e6f0e10db93-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.380052 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/36dec848-d1c8-4567-be43-2e6f0e10db93-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.380095 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.380140 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq9s2\" (UniqueName: \"kubernetes.io/projected/36dec848-d1c8-4567-be43-2e6f0e10db93-kube-api-access-kq9s2\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc 
kubenswrapper[4771]: I0219 22:51:27.380200 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-20274456-5492-49bb-b35b-914603ec935c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20274456-5492-49bb-b35b-914603ec935c\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.380301 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.380374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/36dec848-d1c8-4567-be43-2e6f0e10db93-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.380417 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.380741 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/36dec848-d1c8-4567-be43-2e6f0e10db93-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.381472 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.383637 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.385047 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36dec848-d1c8-4567-be43-2e6f0e10db93-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.385617 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.385648 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-20274456-5492-49bb-b35b-914603ec935c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20274456-5492-49bb-b35b-914603ec935c\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c73eebfd3b18ffd6e801757c14b66227538e96defe9583a26a56f021322922bd/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.387249 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36dec848-d1c8-4567-be43-2e6f0e10db93-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.398612 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/36dec848-d1c8-4567-be43-2e6f0e10db93-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.401933 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq9s2\" (UniqueName: \"kubernetes.io/projected/36dec848-d1c8-4567-be43-2e6f0e10db93-kube-api-access-kq9s2\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.443283 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-20274456-5492-49bb-b35b-914603ec935c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20274456-5492-49bb-b35b-914603ec935c\") pod \"openstack-cell1-galera-0\" (UID: \"36dec848-d1c8-4567-be43-2e6f0e10db93\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.556609 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.787601 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.793901 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.799453 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.799671 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.799802 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gwkdd" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.802638 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.865238 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.872009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507e947c-a8b3-49e6-b84d-6a5df734f2b1","Type":"ContainerStarted","Data":"06f769e2ebb381ec5a698d65a39f9fa9e934f5a03e13cf16bd66810b5b0aff68"} Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.881837 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"dc52c95e-217f-4d63-9825-d6496a419b5d","Type":"ContainerStarted","Data":"1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627"} Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.991068 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-config-data\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.991694 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h88d\" (UniqueName: \"kubernetes.io/projected/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-kube-api-access-9h88d\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.991838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.991959 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:27 crc kubenswrapper[4771]: I0219 22:51:27.992074 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-kolla-config\") pod \"memcached-0\" (UID: 
\"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.093605 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.093688 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.093741 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-kolla-config\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.093784 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-config-data\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.093832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h88d\" (UniqueName: \"kubernetes.io/projected/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-kube-api-access-9h88d\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.094614 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-config-data\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.095235 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-kolla-config\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.100712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.100763 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.114978 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h88d\" (UniqueName: \"kubernetes.io/projected/ddeaff4d-ab09-4f42-ad29-beb21dd735e2-kube-api-access-9h88d\") pod \"memcached-0\" (UID: \"ddeaff4d-ab09-4f42-ad29-beb21dd735e2\") " pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.142469 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.607855 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.904374 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ddeaff4d-ab09-4f42-ad29-beb21dd735e2","Type":"ContainerStarted","Data":"766fffe61febba3fb23b86edea68bad72c4147f0b0c293723570fe31e7ee6a6f"} Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.904959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ddeaff4d-ab09-4f42-ad29-beb21dd735e2","Type":"ContainerStarted","Data":"4ee640074a156926e7d75d48ecbbb62b984328b236adae6c0dc609da45d1f12a"} Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.906526 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.919208 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"36dec848-d1c8-4567-be43-2e6f0e10db93","Type":"ContainerStarted","Data":"75122f654ca97f1fb8c50c6b46e2eb6ce09b6e13ad4033952e7b9597adfb3064"} Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.919267 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"36dec848-d1c8-4567-be43-2e6f0e10db93","Type":"ContainerStarted","Data":"75e62acbace1f5a7ebb5e9f2ff366015ead0fc80e30fec6ef91b70c5fae38875"} Feb 19 22:51:28 crc kubenswrapper[4771]: I0219 22:51:28.951947 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.95192765 podStartE2EDuration="1.95192765s" podCreationTimestamp="2026-02-19 22:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
22:51:28.945479938 +0000 UTC m=+4989.216922448" watchObservedRunningTime="2026-02-19 22:51:28.95192765 +0000 UTC m=+4989.223370120" Feb 19 22:51:30 crc kubenswrapper[4771]: I0219 22:51:30.936248 4771 generic.go:334] "Generic (PLEG): container finished" podID="01b94015-d7c8-4ce0-ae61-aa68065a4c85" containerID="2b3e161170cd6960037ffeb5fad242fb5b7e65567ceae63f91fd49d0c4ac318f" exitCode=0 Feb 19 22:51:30 crc kubenswrapper[4771]: I0219 22:51:30.936284 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"01b94015-d7c8-4ce0-ae61-aa68065a4c85","Type":"ContainerDied","Data":"2b3e161170cd6960037ffeb5fad242fb5b7e65567ceae63f91fd49d0c4ac318f"} Feb 19 22:51:31 crc kubenswrapper[4771]: I0219 22:51:31.949569 4771 generic.go:334] "Generic (PLEG): container finished" podID="36dec848-d1c8-4567-be43-2e6f0e10db93" containerID="75122f654ca97f1fb8c50c6b46e2eb6ce09b6e13ad4033952e7b9597adfb3064" exitCode=0 Feb 19 22:51:31 crc kubenswrapper[4771]: I0219 22:51:31.949680 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"36dec848-d1c8-4567-be43-2e6f0e10db93","Type":"ContainerDied","Data":"75122f654ca97f1fb8c50c6b46e2eb6ce09b6e13ad4033952e7b9597adfb3064"} Feb 19 22:51:31 crc kubenswrapper[4771]: I0219 22:51:31.954666 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"01b94015-d7c8-4ce0-ae61-aa68065a4c85","Type":"ContainerStarted","Data":"9e26e0c424a558c17817807016ae1715173a839ee66246dc4bff7dce9dc90791"} Feb 19 22:51:32 crc kubenswrapper[4771]: I0219 22:51:32.024222 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.024194838 podStartE2EDuration="8.024194838s" podCreationTimestamp="2026-02-19 22:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
22:51:32.01752459 +0000 UTC m=+4992.288967060" watchObservedRunningTime="2026-02-19 22:51:32.024194838 +0000 UTC m=+4992.295637318" Feb 19 22:51:32 crc kubenswrapper[4771]: I0219 22:51:32.965569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"36dec848-d1c8-4567-be43-2e6f0e10db93","Type":"ContainerStarted","Data":"c43bd7d3c52452ee11c875489efd7771a9a02564e8eaf89119eede5fdfb23844"} Feb 19 22:51:33 crc kubenswrapper[4771]: I0219 22:51:33.006164 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.006137072 podStartE2EDuration="7.006137072s" podCreationTimestamp="2026-02-19 22:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:32.99851718 +0000 UTC m=+4993.269959690" watchObservedRunningTime="2026-02-19 22:51:33.006137072 +0000 UTC m=+4993.277579552" Feb 19 22:51:33 crc kubenswrapper[4771]: I0219 22:51:33.144298 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 22:51:33 crc kubenswrapper[4771]: I0219 22:51:33.407305 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" Feb 19 22:51:34 crc kubenswrapper[4771]: I0219 22:51:34.202228 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:51:34 crc kubenswrapper[4771]: I0219 22:51:34.286764 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-wrnjj"] Feb 19 22:51:34 crc kubenswrapper[4771]: I0219 22:51:34.287250 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" podUID="d6909792-9596-4ab1-a8f3-2597f966d3f3" containerName="dnsmasq-dns" 
containerID="cri-o://ea84cb17ca1b1da48373d84601b08f82b6019d4559928bb3ca7958e4e7ea22e5" gracePeriod=10
Feb 19 22:51:34 crc kubenswrapper[4771]: I0219 22:51:34.987453 4771 generic.go:334] "Generic (PLEG): container finished" podID="d6909792-9596-4ab1-a8f3-2597f966d3f3" containerID="ea84cb17ca1b1da48373d84601b08f82b6019d4559928bb3ca7958e4e7ea22e5" exitCode=0
Feb 19 22:51:34 crc kubenswrapper[4771]: I0219 22:51:34.987544 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" event={"ID":"d6909792-9596-4ab1-a8f3-2597f966d3f3","Type":"ContainerDied","Data":"ea84cb17ca1b1da48373d84601b08f82b6019d4559928bb3ca7958e4e7ea22e5"}
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.239836 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj"
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.335295 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-config\") pod \"d6909792-9596-4ab1-a8f3-2597f966d3f3\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") "
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.335517 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-dns-svc\") pod \"d6909792-9596-4ab1-a8f3-2597f966d3f3\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") "
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.335611 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nndvb\" (UniqueName: \"kubernetes.io/projected/d6909792-9596-4ab1-a8f3-2597f966d3f3-kube-api-access-nndvb\") pod \"d6909792-9596-4ab1-a8f3-2597f966d3f3\" (UID: \"d6909792-9596-4ab1-a8f3-2597f966d3f3\") "
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.348476 4771 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6909792-9596-4ab1-a8f3-2597f966d3f3-kube-api-access-nndvb" (OuterVolumeSpecName: "kube-api-access-nndvb") pod "d6909792-9596-4ab1-a8f3-2597f966d3f3" (UID: "d6909792-9596-4ab1-a8f3-2597f966d3f3"). InnerVolumeSpecName "kube-api-access-nndvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.378596 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-config" (OuterVolumeSpecName: "config") pod "d6909792-9596-4ab1-a8f3-2597f966d3f3" (UID: "d6909792-9596-4ab1-a8f3-2597f966d3f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.393164 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6909792-9596-4ab1-a8f3-2597f966d3f3" (UID: "d6909792-9596-4ab1-a8f3-2597f966d3f3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.440049 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.440090 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6909792-9596-4ab1-a8f3-2597f966d3f3-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 22:51:35 crc kubenswrapper[4771]: I0219 22:51:35.440102 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nndvb\" (UniqueName: \"kubernetes.io/projected/d6909792-9596-4ab1-a8f3-2597f966d3f3-kube-api-access-nndvb\") on node \"crc\" DevicePath \"\""
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.002105 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj" event={"ID":"d6909792-9596-4ab1-a8f3-2597f966d3f3","Type":"ContainerDied","Data":"612676c00959b92ea363913706989430973cdac2447e7d64243fa5a3125c4277"}
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.002205 4771 scope.go:117] "RemoveContainer" containerID="ea84cb17ca1b1da48373d84601b08f82b6019d4559928bb3ca7958e4e7ea22e5"
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.002376 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d69655f7-wrnjj"
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.048976 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.050166 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.057792 4771 scope.go:117] "RemoveContainer" containerID="a25e0c519348ea9bcc9827a62fbd23efb309305155f524e6f69fa8a20f72e44e"
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.058084 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-wrnjj"]
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.071026 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-wrnjj"]
Feb 19 22:51:36 crc kubenswrapper[4771]: I0219 22:51:36.453800 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6909792-9596-4ab1-a8f3-2597f966d3f3" path="/var/lib/kubelet/pods/d6909792-9596-4ab1-a8f3-2597f966d3f3/volumes"
Feb 19 22:51:37 crc kubenswrapper[4771]: I0219 22:51:37.557408 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 19 22:51:37 crc kubenswrapper[4771]: I0219 22:51:37.557469 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 19 22:51:38 crc kubenswrapper[4771]: I0219 22:51:38.915294 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 19 22:51:39 crc kubenswrapper[4771]: I0219 22:51:39.003757 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 19 22:51:40 crc kubenswrapper[4771]: I0219 22:51:40.145604 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-cell1-galera-0"
Feb 19 22:51:40 crc kubenswrapper[4771]: I0219 22:51:40.271999 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.676785 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hhcsx"]
Feb 19 22:51:44 crc kubenswrapper[4771]: E0219 22:51:44.678186 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6909792-9596-4ab1-a8f3-2597f966d3f3" containerName="init"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.678216 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6909792-9596-4ab1-a8f3-2597f966d3f3" containerName="init"
Feb 19 22:51:44 crc kubenswrapper[4771]: E0219 22:51:44.678249 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6909792-9596-4ab1-a8f3-2597f966d3f3" containerName="dnsmasq-dns"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.678263 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6909792-9596-4ab1-a8f3-2597f966d3f3" containerName="dnsmasq-dns"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.678539 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6909792-9596-4ab1-a8f3-2597f966d3f3" containerName="dnsmasq-dns"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.679486 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.685820 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.688546 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hhcsx"]
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.708927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqgvm\" (UniqueName: \"kubernetes.io/projected/103ceab7-7046-4478-9896-6f7f220f9f5b-kube-api-access-nqgvm\") pod \"root-account-create-update-hhcsx\" (UID: \"103ceab7-7046-4478-9896-6f7f220f9f5b\") " pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.709147 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103ceab7-7046-4478-9896-6f7f220f9f5b-operator-scripts\") pod \"root-account-create-update-hhcsx\" (UID: \"103ceab7-7046-4478-9896-6f7f220f9f5b\") " pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.810384 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqgvm\" (UniqueName: \"kubernetes.io/projected/103ceab7-7046-4478-9896-6f7f220f9f5b-kube-api-access-nqgvm\") pod \"root-account-create-update-hhcsx\" (UID: \"103ceab7-7046-4478-9896-6f7f220f9f5b\") " pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.810623 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103ceab7-7046-4478-9896-6f7f220f9f5b-operator-scripts\") pod \"root-account-create-update-hhcsx\" (UID: 
\"103ceab7-7046-4478-9896-6f7f220f9f5b\") " pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.811856 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103ceab7-7046-4478-9896-6f7f220f9f5b-operator-scripts\") pod \"root-account-create-update-hhcsx\" (UID: \"103ceab7-7046-4478-9896-6f7f220f9f5b\") " pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:44 crc kubenswrapper[4771]: I0219 22:51:44.837141 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqgvm\" (UniqueName: \"kubernetes.io/projected/103ceab7-7046-4478-9896-6f7f220f9f5b-kube-api-access-nqgvm\") pod \"root-account-create-update-hhcsx\" (UID: \"103ceab7-7046-4478-9896-6f7f220f9f5b\") " pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:45 crc kubenswrapper[4771]: I0219 22:51:45.008622 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:45 crc kubenswrapper[4771]: I0219 22:51:45.528087 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hhcsx"]
Feb 19 22:51:46 crc kubenswrapper[4771]: I0219 22:51:46.090247 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hhcsx" event={"ID":"103ceab7-7046-4478-9896-6f7f220f9f5b","Type":"ContainerStarted","Data":"17d5150da9b598c5fc4589be02d4875c9b9e770ab3843cce3e194e5821a7e55d"}
Feb 19 22:51:47 crc kubenswrapper[4771]: I0219 22:51:47.101984 4771 generic.go:334] "Generic (PLEG): container finished" podID="103ceab7-7046-4478-9896-6f7f220f9f5b" containerID="6eeaa439c42cdaf5e51a297908c004b04a8444bb1ee9e325924a23252146631d" exitCode=0
Feb 19 22:51:47 crc kubenswrapper[4771]: I0219 22:51:47.102078 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hhcsx" event={"ID":"103ceab7-7046-4478-9896-6f7f220f9f5b","Type":"ContainerDied","Data":"6eeaa439c42cdaf5e51a297908c004b04a8444bb1ee9e325924a23252146631d"}
Feb 19 22:51:48 crc kubenswrapper[4771]: I0219 22:51:48.541536 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:48 crc kubenswrapper[4771]: I0219 22:51:48.581181 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103ceab7-7046-4478-9896-6f7f220f9f5b-operator-scripts\") pod \"103ceab7-7046-4478-9896-6f7f220f9f5b\" (UID: \"103ceab7-7046-4478-9896-6f7f220f9f5b\") "
Feb 19 22:51:48 crc kubenswrapper[4771]: I0219 22:51:48.581299 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqgvm\" (UniqueName: \"kubernetes.io/projected/103ceab7-7046-4478-9896-6f7f220f9f5b-kube-api-access-nqgvm\") pod \"103ceab7-7046-4478-9896-6f7f220f9f5b\" (UID: \"103ceab7-7046-4478-9896-6f7f220f9f5b\") "
Feb 19 22:51:48 crc kubenswrapper[4771]: I0219 22:51:48.582419 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/103ceab7-7046-4478-9896-6f7f220f9f5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "103ceab7-7046-4478-9896-6f7f220f9f5b" (UID: "103ceab7-7046-4478-9896-6f7f220f9f5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:51:48 crc kubenswrapper[4771]: I0219 22:51:48.659303 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103ceab7-7046-4478-9896-6f7f220f9f5b-kube-api-access-nqgvm" (OuterVolumeSpecName: "kube-api-access-nqgvm") pod "103ceab7-7046-4478-9896-6f7f220f9f5b" (UID: "103ceab7-7046-4478-9896-6f7f220f9f5b"). InnerVolumeSpecName "kube-api-access-nqgvm". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:51:48 crc kubenswrapper[4771]: I0219 22:51:48.683199 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqgvm\" (UniqueName: \"kubernetes.io/projected/103ceab7-7046-4478-9896-6f7f220f9f5b-kube-api-access-nqgvm\") on node \"crc\" DevicePath \"\""
Feb 19 22:51:48 crc kubenswrapper[4771]: I0219 22:51:48.683248 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/103ceab7-7046-4478-9896-6f7f220f9f5b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:51:49 crc kubenswrapper[4771]: I0219 22:51:49.130329 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hhcsx" event={"ID":"103ceab7-7046-4478-9896-6f7f220f9f5b","Type":"ContainerDied","Data":"17d5150da9b598c5fc4589be02d4875c9b9e770ab3843cce3e194e5821a7e55d"}
Feb 19 22:51:49 crc kubenswrapper[4771]: I0219 22:51:49.130403 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17d5150da9b598c5fc4589be02d4875c9b9e770ab3843cce3e194e5821a7e55d"
Feb 19 22:51:49 crc kubenswrapper[4771]: I0219 22:51:49.130412 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hhcsx"
Feb 19 22:51:51 crc kubenswrapper[4771]: I0219 22:51:51.244094 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hhcsx"]
Feb 19 22:51:51 crc kubenswrapper[4771]: I0219 22:51:51.254754 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hhcsx"]
Feb 19 22:51:52 crc kubenswrapper[4771]: I0219 22:51:52.454739 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103ceab7-7046-4478-9896-6f7f220f9f5b" path="/var/lib/kubelet/pods/103ceab7-7046-4478-9896-6f7f220f9f5b/volumes"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.238607 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ltm4c"]
Feb 19 22:51:56 crc kubenswrapper[4771]: E0219 22:51:56.239492 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103ceab7-7046-4478-9896-6f7f220f9f5b" containerName="mariadb-account-create-update"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.239512 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="103ceab7-7046-4478-9896-6f7f220f9f5b" containerName="mariadb-account-create-update"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.239721 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="103ceab7-7046-4478-9896-6f7f220f9f5b" containerName="mariadb-account-create-update"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.240299 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.242849 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.257725 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ltm4c"]
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.315557 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bacb27-3fc0-4e14-960f-ca5058407555-operator-scripts\") pod \"root-account-create-update-ltm4c\" (UID: \"d0bacb27-3fc0-4e14-960f-ca5058407555\") " pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.315864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqljx\" (UniqueName: \"kubernetes.io/projected/d0bacb27-3fc0-4e14-960f-ca5058407555-kube-api-access-bqljx\") pod \"root-account-create-update-ltm4c\" (UID: \"d0bacb27-3fc0-4e14-960f-ca5058407555\") " pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.423383 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bacb27-3fc0-4e14-960f-ca5058407555-operator-scripts\") pod \"root-account-create-update-ltm4c\" (UID: \"d0bacb27-3fc0-4e14-960f-ca5058407555\") " pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.423684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqljx\" (UniqueName: \"kubernetes.io/projected/d0bacb27-3fc0-4e14-960f-ca5058407555-kube-api-access-bqljx\") pod \"root-account-create-update-ltm4c\" (UID: 
\"d0bacb27-3fc0-4e14-960f-ca5058407555\") " pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.424370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bacb27-3fc0-4e14-960f-ca5058407555-operator-scripts\") pod \"root-account-create-update-ltm4c\" (UID: \"d0bacb27-3fc0-4e14-960f-ca5058407555\") " pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.454842 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqljx\" (UniqueName: \"kubernetes.io/projected/d0bacb27-3fc0-4e14-960f-ca5058407555-kube-api-access-bqljx\") pod \"root-account-create-update-ltm4c\" (UID: \"d0bacb27-3fc0-4e14-960f-ca5058407555\") " pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:56 crc kubenswrapper[4771]: I0219 22:51:56.590227 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:57 crc kubenswrapper[4771]: I0219 22:51:57.061663 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ltm4c"]
Feb 19 22:51:57 crc kubenswrapper[4771]: I0219 22:51:57.207748 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ltm4c" event={"ID":"d0bacb27-3fc0-4e14-960f-ca5058407555","Type":"ContainerStarted","Data":"8d4dbbe6ee23f569403ea1ad2fa8050bdbecbe488e29410519c4ccfabcc1316e"}
Feb 19 22:51:58 crc kubenswrapper[4771]: I0219 22:51:58.218008 4771 generic.go:334] "Generic (PLEG): container finished" podID="d0bacb27-3fc0-4e14-960f-ca5058407555" containerID="047d9d94cd4852842bbcc5eb0558e71cc4f58600037cf5514d944ecee4f134d1" exitCode=0
Feb 19 22:51:58 crc kubenswrapper[4771]: I0219 22:51:58.218131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ltm4c" event={"ID":"d0bacb27-3fc0-4e14-960f-ca5058407555","Type":"ContainerDied","Data":"047d9d94cd4852842bbcc5eb0558e71cc4f58600037cf5514d944ecee4f134d1"}
Feb 19 22:51:59 crc kubenswrapper[4771]: I0219 22:51:59.784518 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:51:59 crc kubenswrapper[4771]: I0219 22:51:59.885485 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bacb27-3fc0-4e14-960f-ca5058407555-operator-scripts\") pod \"d0bacb27-3fc0-4e14-960f-ca5058407555\" (UID: \"d0bacb27-3fc0-4e14-960f-ca5058407555\") "
Feb 19 22:51:59 crc kubenswrapper[4771]: I0219 22:51:59.885535 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqljx\" (UniqueName: \"kubernetes.io/projected/d0bacb27-3fc0-4e14-960f-ca5058407555-kube-api-access-bqljx\") pod \"d0bacb27-3fc0-4e14-960f-ca5058407555\" (UID: \"d0bacb27-3fc0-4e14-960f-ca5058407555\") "
Feb 19 22:51:59 crc kubenswrapper[4771]: I0219 22:51:59.886453 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bacb27-3fc0-4e14-960f-ca5058407555-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0bacb27-3fc0-4e14-960f-ca5058407555" (UID: "d0bacb27-3fc0-4e14-960f-ca5058407555"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:51:59 crc kubenswrapper[4771]: I0219 22:51:59.895111 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bacb27-3fc0-4e14-960f-ca5058407555-kube-api-access-bqljx" (OuterVolumeSpecName: "kube-api-access-bqljx") pod "d0bacb27-3fc0-4e14-960f-ca5058407555" (UID: "d0bacb27-3fc0-4e14-960f-ca5058407555"). InnerVolumeSpecName "kube-api-access-bqljx". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:51:59 crc kubenswrapper[4771]: I0219 22:51:59.987170 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bacb27-3fc0-4e14-960f-ca5058407555-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:51:59 crc kubenswrapper[4771]: I0219 22:51:59.987238 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqljx\" (UniqueName: \"kubernetes.io/projected/d0bacb27-3fc0-4e14-960f-ca5058407555-kube-api-access-bqljx\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:01 crc kubenswrapper[4771]: I0219 22:52:01.419441 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ltm4c"
Feb 19 22:52:01 crc kubenswrapper[4771]: I0219 22:52:01.426688 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc52c95e-217f-4d63-9825-d6496a419b5d" containerID="1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627" exitCode=0
Feb 19 22:52:01 crc kubenswrapper[4771]: I0219 22:52:01.429522 4771 generic.go:334] "Generic (PLEG): container finished" podID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" containerID="06f769e2ebb381ec5a698d65a39f9fa9e934f5a03e13cf16bd66810b5b0aff68" exitCode=0
Feb 19 22:52:01 crc kubenswrapper[4771]: I0219 22:52:01.427799 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ltm4c" event={"ID":"d0bacb27-3fc0-4e14-960f-ca5058407555","Type":"ContainerDied","Data":"8d4dbbe6ee23f569403ea1ad2fa8050bdbecbe488e29410519c4ccfabcc1316e"}
Feb 19 22:52:01 crc kubenswrapper[4771]: I0219 22:52:01.431880 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d4dbbe6ee23f569403ea1ad2fa8050bdbecbe488e29410519c4ccfabcc1316e"
Feb 19 22:52:01 crc kubenswrapper[4771]: I0219 22:52:01.431902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"dc52c95e-217f-4d63-9825-d6496a419b5d","Type":"ContainerDied","Data":"1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627"}
Feb 19 22:52:01 crc kubenswrapper[4771]: I0219 22:52:01.431918 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507e947c-a8b3-49e6-b84d-6a5df734f2b1","Type":"ContainerDied","Data":"06f769e2ebb381ec5a698d65a39f9fa9e934f5a03e13cf16bd66810b5b0aff68"}
Feb 19 22:52:02 crc kubenswrapper[4771]: I0219 22:52:02.435932 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc52c95e-217f-4d63-9825-d6496a419b5d","Type":"ContainerStarted","Data":"3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346"}
Feb 19 22:52:02 crc kubenswrapper[4771]: I0219 22:52:02.443812 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 19 22:52:02 crc kubenswrapper[4771]: I0219 22:52:02.443859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507e947c-a8b3-49e6-b84d-6a5df734f2b1","Type":"ContainerStarted","Data":"de231ad06875f40e2d9d2f6ffa8c0cdeea8ac1d5ec43485759774a86f98fd1b1"}
Feb 19 22:52:02 crc kubenswrapper[4771]: I0219 22:52:02.444053 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:52:02 crc kubenswrapper[4771]: I0219 22:52:02.469702 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.469686807 podStartE2EDuration="39.469686807s" podCreationTimestamp="2026-02-19 22:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:02.465431064 +0000 UTC m=+5022.736873544" watchObservedRunningTime="2026-02-19 22:52:02.469686807 +0000 UTC 
m=+5022.741129277"
Feb 19 22:52:02 crc kubenswrapper[4771]: I0219 22:52:02.496108 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.49609058 podStartE2EDuration="39.49609058s" podCreationTimestamp="2026-02-19 22:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:02.490276496 +0000 UTC m=+5022.761718976" watchObservedRunningTime="2026-02-19 22:52:02.49609058 +0000 UTC m=+5022.767533040"
Feb 19 22:52:12 crc kubenswrapper[4771]: I0219 22:52:12.956978 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:52:12 crc kubenswrapper[4771]: I0219 22:52:12.957581 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:52:15 crc kubenswrapper[4771]: I0219 22:52:15.136325 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 22:52:15 crc kubenswrapper[4771]: I0219 22:52:15.453149 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.575434 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-jnj2q"]
Feb 19 22:52:20 crc kubenswrapper[4771]: E0219 22:52:20.576801 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0bacb27-3fc0-4e14-960f-ca5058407555" containerName="mariadb-account-create-update"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.576824 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bacb27-3fc0-4e14-960f-ca5058407555" containerName="mariadb-account-create-update"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.577080 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bacb27-3fc0-4e14-960f-ca5058407555" containerName="mariadb-account-create-update"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.588496 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-jnj2q"]
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.588601 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.731573 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-config\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.731638 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4xp\" (UniqueName: \"kubernetes.io/projected/e8640339-a417-44c2-9fda-cfd1f6838baa-kube-api-access-gm4xp\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.731966 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: 
\"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.834051 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.834658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-config\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.834704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4xp\" (UniqueName: \"kubernetes.io/projected/e8640339-a417-44c2-9fda-cfd1f6838baa-kube-api-access-gm4xp\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.834951 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.835980 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-config\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q"
Feb 19 22:52:20 crc 
kubenswrapper[4771]: I0219 22:52:20.873227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4xp\" (UniqueName: \"kubernetes.io/projected/e8640339-a417-44c2-9fda-cfd1f6838baa-kube-api-access-gm4xp\") pod \"dnsmasq-dns-54dc9c94cc-jnj2q\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" Feb 19 22:52:20 crc kubenswrapper[4771]: I0219 22:52:20.916205 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" Feb 19 22:52:21 crc kubenswrapper[4771]: I0219 22:52:21.407954 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:52:21 crc kubenswrapper[4771]: I0219 22:52:21.564507 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-jnj2q"] Feb 19 22:52:21 crc kubenswrapper[4771]: W0219 22:52:21.567421 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8640339_a417_44c2_9fda_cfd1f6838baa.slice/crio-01c63eb87e6bf2585723f49e95058fd45d57be5c9350df0f4c506d3d38ba53c2 WatchSource:0}: Error finding container 01c63eb87e6bf2585723f49e95058fd45d57be5c9350df0f4c506d3d38ba53c2: Status 404 returned error can't find the container with id 01c63eb87e6bf2585723f49e95058fd45d57be5c9350df0f4c506d3d38ba53c2 Feb 19 22:52:21 crc kubenswrapper[4771]: I0219 22:52:21.608638 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" event={"ID":"e8640339-a417-44c2-9fda-cfd1f6838baa","Type":"ContainerStarted","Data":"01c63eb87e6bf2585723f49e95058fd45d57be5c9350df0f4c506d3d38ba53c2"} Feb 19 22:52:22 crc kubenswrapper[4771]: I0219 22:52:22.359301 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:52:22 crc kubenswrapper[4771]: I0219 22:52:22.618220 4771 generic.go:334] "Generic (PLEG): 
container finished" podID="e8640339-a417-44c2-9fda-cfd1f6838baa" containerID="1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d" exitCode=0 Feb 19 22:52:22 crc kubenswrapper[4771]: I0219 22:52:22.618276 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" event={"ID":"e8640339-a417-44c2-9fda-cfd1f6838baa","Type":"ContainerDied","Data":"1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d"} Feb 19 22:52:23 crc kubenswrapper[4771]: I0219 22:52:23.628342 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" event={"ID":"e8640339-a417-44c2-9fda-cfd1f6838baa","Type":"ContainerStarted","Data":"7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f"} Feb 19 22:52:23 crc kubenswrapper[4771]: I0219 22:52:23.628649 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" Feb 19 22:52:23 crc kubenswrapper[4771]: I0219 22:52:23.650263 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" podStartSLOduration=3.65024585 podStartE2EDuration="3.65024585s" podCreationTimestamp="2026-02-19 22:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:23.647091736 +0000 UTC m=+5043.918534216" watchObservedRunningTime="2026-02-19 22:52:23.65024585 +0000 UTC m=+5043.921688320" Feb 19 22:52:25 crc kubenswrapper[4771]: I0219 22:52:25.594807 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="dc52c95e-217f-4d63-9825-d6496a419b5d" containerName="rabbitmq" containerID="cri-o://3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346" gracePeriod=604796 Feb 19 22:52:26 crc kubenswrapper[4771]: I0219 22:52:26.302996 4771 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" containerName="rabbitmq" containerID="cri-o://de231ad06875f40e2d9d2f6ffa8c0cdeea8ac1d5ec43485759774a86f98fd1b1" gracePeriod=604797 Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.194671 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8b54t"] Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.197643 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.222438 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8b54t"] Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.384139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-utilities\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.384224 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-catalog-content\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.384995 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9kdh\" (UniqueName: \"kubernetes.io/projected/a83af5ef-eff9-4f6a-8f60-41376c58a64b-kube-api-access-x9kdh\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " 
pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.487130 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9kdh\" (UniqueName: \"kubernetes.io/projected/a83af5ef-eff9-4f6a-8f60-41376c58a64b-kube-api-access-x9kdh\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.487430 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-utilities\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.487505 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-catalog-content\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.488422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-utilities\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.488549 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-catalog-content\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " pod="openshift-marketplace/redhat-marketplace-8b54t" 
Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.514738 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9kdh\" (UniqueName: \"kubernetes.io/projected/a83af5ef-eff9-4f6a-8f60-41376c58a64b-kube-api-access-x9kdh\") pod \"redhat-marketplace-8b54t\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") " pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.547261 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:28 crc kubenswrapper[4771]: I0219 22:52:28.993240 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8b54t"] Feb 19 22:52:29 crc kubenswrapper[4771]: I0219 22:52:29.708738 4771 generic.go:334] "Generic (PLEG): container finished" podID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerID="7b28ad1a6580f9742ae289673ba8985112c9e4b931bc7d2d0b309fadc21b0bb9" exitCode=0 Feb 19 22:52:29 crc kubenswrapper[4771]: I0219 22:52:29.708837 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8b54t" event={"ID":"a83af5ef-eff9-4f6a-8f60-41376c58a64b","Type":"ContainerDied","Data":"7b28ad1a6580f9742ae289673ba8985112c9e4b931bc7d2d0b309fadc21b0bb9"} Feb 19 22:52:29 crc kubenswrapper[4771]: I0219 22:52:29.713119 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8b54t" event={"ID":"a83af5ef-eff9-4f6a-8f60-41376c58a64b","Type":"ContainerStarted","Data":"914468f9d44110b18666909151c31519991cc65916792e8a495f7922400a9fb1"} Feb 19 22:52:30 crc kubenswrapper[4771]: I0219 22:52:30.725058 4771 generic.go:334] "Generic (PLEG): container finished" podID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerID="bd602aa023839e675fc058839f61623e2eef51c00944f0306c338a163a5df5ff" exitCode=0 Feb 19 22:52:30 crc kubenswrapper[4771]: I0219 22:52:30.725121 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8b54t" event={"ID":"a83af5ef-eff9-4f6a-8f60-41376c58a64b","Type":"ContainerDied","Data":"bd602aa023839e675fc058839f61623e2eef51c00944f0306c338a163a5df5ff"} Feb 19 22:52:30 crc kubenswrapper[4771]: I0219 22:52:30.918341 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" Feb 19 22:52:30 crc kubenswrapper[4771]: I0219 22:52:30.990214 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-v75gp"] Feb 19 22:52:30 crc kubenswrapper[4771]: I0219 22:52:30.990567 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" podUID="06a90808-2706-47f6-8021-a02a43b62284" containerName="dnsmasq-dns" containerID="cri-o://c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0" gracePeriod=10 Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.407078 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.539140 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-config\") pod \"06a90808-2706-47f6-8021-a02a43b62284\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.539244 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-dns-svc\") pod \"06a90808-2706-47f6-8021-a02a43b62284\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.539307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdcvw\" (UniqueName: \"kubernetes.io/projected/06a90808-2706-47f6-8021-a02a43b62284-kube-api-access-qdcvw\") pod \"06a90808-2706-47f6-8021-a02a43b62284\" (UID: \"06a90808-2706-47f6-8021-a02a43b62284\") " Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.544645 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a90808-2706-47f6-8021-a02a43b62284-kube-api-access-qdcvw" (OuterVolumeSpecName: "kube-api-access-qdcvw") pod "06a90808-2706-47f6-8021-a02a43b62284" (UID: "06a90808-2706-47f6-8021-a02a43b62284"). InnerVolumeSpecName "kube-api-access-qdcvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.591506 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-config" (OuterVolumeSpecName: "config") pod "06a90808-2706-47f6-8021-a02a43b62284" (UID: "06a90808-2706-47f6-8021-a02a43b62284"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.606123 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06a90808-2706-47f6-8021-a02a43b62284" (UID: "06a90808-2706-47f6-8021-a02a43b62284"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.641085 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.643138 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06a90808-2706-47f6-8021-a02a43b62284-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.643579 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdcvw\" (UniqueName: \"kubernetes.io/projected/06a90808-2706-47f6-8021-a02a43b62284-kube-api-access-qdcvw\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.740199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8b54t" event={"ID":"a83af5ef-eff9-4f6a-8f60-41376c58a64b","Type":"ContainerStarted","Data":"72d0a196f4bd1c66c41f979d9864927e289af32224e6bc83398cf10937729234"} Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.743760 4771 generic.go:334] "Generic (PLEG): container finished" podID="06a90808-2706-47f6-8021-a02a43b62284" containerID="c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0" exitCode=0 Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.743788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-589cf688cc-v75gp" event={"ID":"06a90808-2706-47f6-8021-a02a43b62284","Type":"ContainerDied","Data":"c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0"} Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.743805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" event={"ID":"06a90808-2706-47f6-8021-a02a43b62284","Type":"ContainerDied","Data":"440f355c15608517ffc6736e4850f28d17e635d025c03f5815f497a700bd1eac"} Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.743821 4771 scope.go:117] "RemoveContainer" containerID="c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.743932 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-v75gp" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.763162 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8b54t" podStartSLOduration=2.354273538 podStartE2EDuration="3.763135066s" podCreationTimestamp="2026-02-19 22:52:28 +0000 UTC" firstStartedPulling="2026-02-19 22:52:29.711462263 +0000 UTC m=+5049.982904763" lastFinishedPulling="2026-02-19 22:52:31.120323811 +0000 UTC m=+5051.391766291" observedRunningTime="2026-02-19 22:52:31.755013879 +0000 UTC m=+5052.026456389" watchObservedRunningTime="2026-02-19 22:52:31.763135066 +0000 UTC m=+5052.034577576" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.805221 4771 scope.go:117] "RemoveContainer" containerID="8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.807234 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-v75gp"] Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.823369 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-589cf688cc-v75gp"] Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.883352 4771 scope.go:117] "RemoveContainer" containerID="c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0" Feb 19 22:52:31 crc kubenswrapper[4771]: E0219 22:52:31.884225 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0\": container with ID starting with c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0 not found: ID does not exist" containerID="c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.884319 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0"} err="failed to get container status \"c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0\": rpc error: code = NotFound desc = could not find container \"c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0\": container with ID starting with c50cec5588ba3930580f14bc6373fa4b49fc9bac6acac723f86baeda8a55e7e0 not found: ID does not exist" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.884355 4771 scope.go:117] "RemoveContainer" containerID="8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f" Feb 19 22:52:31 crc kubenswrapper[4771]: E0219 22:52:31.885421 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f\": container with ID starting with 8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f not found: ID does not exist" containerID="8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f" Feb 19 22:52:31 crc kubenswrapper[4771]: I0219 22:52:31.885505 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f"} err="failed to get container status \"8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f\": rpc error: code = NotFound desc = could not find container \"8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f\": container with ID starting with 8006f8d4fe647826139df66fe067f45784347fb07ac382e89db62e94eac8a97f not found: ID does not exist" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.129666 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149410 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-confd\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149464 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-erlang-cookie\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149500 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc52c95e-217f-4d63-9825-d6496a419b5d-erlang-cookie-secret\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149525 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnnwz\" (UniqueName: 
\"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-kube-api-access-qnnwz\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149543 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-plugins\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149558 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-plugins-conf\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149576 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-server-conf\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149595 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-config-data\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149723 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: 
I0219 22:52:32.149754 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-tls\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.149775 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc52c95e-217f-4d63-9825-d6496a419b5d-pod-info\") pod \"dc52c95e-217f-4d63-9825-d6496a419b5d\" (UID: \"dc52c95e-217f-4d63-9825-d6496a419b5d\") " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.156874 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dc52c95e-217f-4d63-9825-d6496a419b5d-pod-info" (OuterVolumeSpecName: "pod-info") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.157369 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.157477 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.166330 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.170556 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.171211 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc52c95e-217f-4d63-9825-d6496a419b5d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.178104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-kube-api-access-qnnwz" (OuterVolumeSpecName: "kube-api-access-qnnwz") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "kube-api-access-qnnwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.195300 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-config-data" (OuterVolumeSpecName: "config-data") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.195937 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-server-conf" (OuterVolumeSpecName: "server-conf") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.211438 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30" (OuterVolumeSpecName: "persistence") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.246926 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dc52c95e-217f-4d63-9825-d6496a419b5d" (UID: "dc52c95e-217f-4d63-9825-d6496a419b5d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251245 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251287 4771 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc52c95e-217f-4d63-9825-d6496a419b5d-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251299 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251312 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251326 4771 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc52c95e-217f-4d63-9825-d6496a419b5d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251338 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnnwz\" (UniqueName: \"kubernetes.io/projected/dc52c95e-217f-4d63-9825-d6496a419b5d-kube-api-access-qnnwz\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251349 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc52c95e-217f-4d63-9825-d6496a419b5d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc 
kubenswrapper[4771]: I0219 22:52:32.251361 4771 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251372 4771 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251382 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc52c95e-217f-4d63-9825-d6496a419b5d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.251423 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") on node \"crc\" " Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.275504 4771 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.275652 4771 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30") on node "crc"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.352655 4771 reconciler_common.go:293] "Volume detached for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.445199 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a90808-2706-47f6-8021-a02a43b62284" path="/var/lib/kubelet/pods/06a90808-2706-47f6-8021-a02a43b62284/volumes"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.758453 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc52c95e-217f-4d63-9825-d6496a419b5d" containerID="3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346" exitCode=0
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.758620 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.758817 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc52c95e-217f-4d63-9825-d6496a419b5d","Type":"ContainerDied","Data":"3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346"}
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.758876 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc52c95e-217f-4d63-9825-d6496a419b5d","Type":"ContainerDied","Data":"e3b43f6d1f9acdb9bd913cb40349e7b504a1118810c36bc263d9249dfde0eaf1"}
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.758897 4771 scope.go:117] "RemoveContainer" containerID="3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.764910 4771 generic.go:334] "Generic (PLEG): container finished" podID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" containerID="de231ad06875f40e2d9d2f6ffa8c0cdeea8ac1d5ec43485759774a86f98fd1b1" exitCode=0
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.765692 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507e947c-a8b3-49e6-b84d-6a5df734f2b1","Type":"ContainerDied","Data":"de231ad06875f40e2d9d2f6ffa8c0cdeea8ac1d5ec43485759774a86f98fd1b1"}
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.789827 4771 scope.go:117] "RemoveContainer" containerID="1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.794126 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.828871 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.839804 4771 scope.go:117] "RemoveContainer" containerID="3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346"
Feb 19 22:52:32 crc kubenswrapper[4771]: E0219 22:52:32.840383 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346\": container with ID starting with 3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346 not found: ID does not exist" containerID="3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.840437 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346"} err="failed to get container status \"3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346\": rpc error: code = NotFound desc = could not find container \"3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346\": container with ID starting with 3abc03512114e2bc1f0709d456cd6121cfb15bbe29e1bc3881935b0d8c466346 not found: ID does not exist"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.840468 4771 scope.go:117] "RemoveContainer" containerID="1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627"
Feb 19 22:52:32 crc kubenswrapper[4771]: E0219 22:52:32.847411 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627\": container with ID starting with 1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627 not found: ID does not exist" containerID="1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.847457 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627"} err="failed to get container status \"1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627\": rpc error: code = NotFound desc = could not find container \"1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627\": container with ID starting with 1c33dcf6c698cd8495a1d9b81d6c57cccac25c0b9025049268c7777f656a2627 not found: ID does not exist"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.849604 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:52:32 crc kubenswrapper[4771]: E0219 22:52:32.849965 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a90808-2706-47f6-8021-a02a43b62284" containerName="dnsmasq-dns"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.849980 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a90808-2706-47f6-8021-a02a43b62284" containerName="dnsmasq-dns"
Feb 19 22:52:32 crc kubenswrapper[4771]: E0219 22:52:32.849997 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc52c95e-217f-4d63-9825-d6496a419b5d" containerName="rabbitmq"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.850008 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc52c95e-217f-4d63-9825-d6496a419b5d" containerName="rabbitmq"
Feb 19 22:52:32 crc kubenswrapper[4771]: E0219 22:52:32.850039 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc52c95e-217f-4d63-9825-d6496a419b5d" containerName="setup-container"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.850047 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc52c95e-217f-4d63-9825-d6496a419b5d" containerName="setup-container"
Feb 19 22:52:32 crc kubenswrapper[4771]: E0219 22:52:32.850078 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a90808-2706-47f6-8021-a02a43b62284" containerName="init"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.850087 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a90808-2706-47f6-8021-a02a43b62284" containerName="init"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.850288 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a90808-2706-47f6-8021-a02a43b62284" containerName="dnsmasq-dns"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.850304 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc52c95e-217f-4d63-9825-d6496a419b5d" containerName="rabbitmq"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.851281 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.854424 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.854941 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.855079 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.855438 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.855460 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.855585 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.855610 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ffrtl"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.855783 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881283 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881621 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pn7t\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-kube-api-access-6pn7t\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881688 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881706 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881728 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881743 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881768 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881790 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881809 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.881841 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.932356 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.982427 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-tls\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.982474 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-plugins\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.982565 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.982585 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-confd\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.982611 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-server-conf\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.982630 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-plugins-conf\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983084 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-erlang-cookie\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983120 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-config-data\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983142 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507e947c-a8b3-49e6-b84d-6a5df734f2b1-pod-info\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983198 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507e947c-a8b3-49e6-b84d-6a5df734f2b1-erlang-cookie-secret\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983219 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvgb2\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-kube-api-access-mvgb2\") pod \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\" (UID: \"507e947c-a8b3-49e6-b84d-6a5df734f2b1\") "
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983311 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983364 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983393 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983399 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983419 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pn7t\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-kube-api-access-6pn7t\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983560 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983651 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983686 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983755 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983819 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.985279 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.985918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.983402 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.986129 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.986559 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.988216 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-kube-api-access-mvgb2" (OuterVolumeSpecName: "kube-api-access-mvgb2") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "kube-api-access-mvgb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.988514 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.991788 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.991995 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.993179 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.993213 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f608a103ac0cb6b5cf94070de4d632eb9f754ab75e04d34660913feed56de4e/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.996048 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/507e947c-a8b3-49e6-b84d-6a5df734f2b1-pod-info" (OuterVolumeSpecName: "pod-info") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 19 22:52:32 crc kubenswrapper[4771]: I0219 22:52:32.999581 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.002922 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.005462 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pn7t\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-kube-api-access-6pn7t\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.005747 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.008757 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507e947c-a8b3-49e6-b84d-6a5df734f2b1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.017756 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.021898 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71" (OuterVolumeSpecName: "persistence") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "pvc-74144035-fc19-4d57-b66c-93a16c9b2d71". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.028610 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-config-data" (OuterVolumeSpecName: "config-data") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.037135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62b530ba-aed2-4dc1-ad5e-1e3089781d30\") pod \"rabbitmq-server-0\" (UID: \"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.054597 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-server-conf" (OuterVolumeSpecName: "server-conf") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085270 4771 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-server-conf\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085299 4771 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085310 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085321 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/507e947c-a8b3-49e6-b84d-6a5df734f2b1-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085344 4771 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/507e947c-a8b3-49e6-b84d-6a5df734f2b1-pod-info\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085353 4771 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/507e947c-a8b3-49e6-b84d-6a5df734f2b1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085362 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvgb2\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-kube-api-access-mvgb2\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085370 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.085397 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") on node \"crc\" "
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.102625 4771 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.102744 4771 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-74144035-fc19-4d57-b66c-93a16c9b2d71" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71") on node "crc"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.109237 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "507e947c-a8b3-49e6-b84d-6a5df734f2b1" (UID: "507e947c-a8b3-49e6-b84d-6a5df734f2b1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.187195 4771 reconciler_common.go:293] "Volume detached for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.187232 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/507e947c-a8b3-49e6-b84d-6a5df734f2b1-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.188531 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.648001 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:52:33 crc kubenswrapper[4771]: W0219 22:52:33.653199 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58bdc8c9_d5b9_4b8f_bd73_44afe58bf6c3.slice/crio-411e1056ec2535b7d588dcd2a8e4d6ebf01a37fd13f5f69d52a382a5c603dc2f WatchSource:0}: Error finding container 411e1056ec2535b7d588dcd2a8e4d6ebf01a37fd13f5f69d52a382a5c603dc2f: Status 404 returned error can't find the container with id 411e1056ec2535b7d588dcd2a8e4d6ebf01a37fd13f5f69d52a382a5c603dc2f
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.780104 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"507e947c-a8b3-49e6-b84d-6a5df734f2b1","Type":"ContainerDied","Data":"029c1b222ef8516403e8703d6649dfa41d9d46daa24031fd14bd1a991fba7892"}
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.780163 4771 scope.go:117] "RemoveContainer" containerID="de231ad06875f40e2d9d2f6ffa8c0cdeea8ac1d5ec43485759774a86f98fd1b1"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.780229 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.781939 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3","Type":"ContainerStarted","Data":"411e1056ec2535b7d588dcd2a8e4d6ebf01a37fd13f5f69d52a382a5c603dc2f"}
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.802957 4771 scope.go:117] "RemoveContainer" containerID="06f769e2ebb381ec5a698d65a39f9fa9e934f5a03e13cf16bd66810b5b0aff68"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.835803 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.859483 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.873289 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 22:52:33 crc kubenswrapper[4771]: E0219 22:52:33.873860 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" containerName="rabbitmq"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.874204 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" containerName="rabbitmq"
Feb 19 22:52:33 crc kubenswrapper[4771]: E0219 22:52:33.874662 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" containerName="setup-container"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.874741 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" containerName="setup-container"
Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.874999 4771 
memory_manager.go:354] "RemoveStaleState removing state" podUID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" containerName="rabbitmq" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.876419 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.882889 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.883137 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4d55v" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.883312 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.883606 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.883717 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.883615 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.884072 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 22:52:33 crc kubenswrapper[4771]: I0219 22:52:33.890371 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018236 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018304 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018332 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018362 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018405 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t98wx\" (UniqueName: \"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-kube-api-access-t98wx\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018452 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018487 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018694 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.018872 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120503 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120643 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120682 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120733 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120801 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t98wx\" (UniqueName: \"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-kube-api-access-t98wx\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120835 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120878 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120930 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.120968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.121056 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.121861 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.122390 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.122420 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.123847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.125774 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.126993 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.127802 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.127853 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.127836 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.127919 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d7b6b4c63064f416101e47cc765fa4b1ca43546ea2c8655906aabf5711abe664/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.128529 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.154063 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t98wx\" (UniqueName: \"kubernetes.io/projected/c005771d-7e6d-44f8-93d0-6bf7c1db97d5-kube-api-access-t98wx\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.180744 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74144035-fc19-4d57-b66c-93a16c9b2d71\") pod \"rabbitmq-cell1-server-0\" (UID: \"c005771d-7e6d-44f8-93d0-6bf7c1db97d5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.196591 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.465383 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="507e947c-a8b3-49e6-b84d-6a5df734f2b1" path="/var/lib/kubelet/pods/507e947c-a8b3-49e6-b84d-6a5df734f2b1/volumes" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.472704 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc52c95e-217f-4d63-9825-d6496a419b5d" path="/var/lib/kubelet/pods/dc52c95e-217f-4d63-9825-d6496a419b5d/volumes" Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.532485 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:52:34 crc kubenswrapper[4771]: W0219 22:52:34.772781 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc005771d_7e6d_44f8_93d0_6bf7c1db97d5.slice/crio-252590f5c07a8b03c238d6f339feeca7d58c648c226c60892940aa25f6ab23e9 WatchSource:0}: Error finding container 252590f5c07a8b03c238d6f339feeca7d58c648c226c60892940aa25f6ab23e9: Status 404 returned error can't find the container with id 252590f5c07a8b03c238d6f339feeca7d58c648c226c60892940aa25f6ab23e9 Feb 19 22:52:34 crc kubenswrapper[4771]: I0219 22:52:34.796944 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c005771d-7e6d-44f8-93d0-6bf7c1db97d5","Type":"ContainerStarted","Data":"252590f5c07a8b03c238d6f339feeca7d58c648c226c60892940aa25f6ab23e9"} Feb 19 22:52:35 crc kubenswrapper[4771]: I0219 22:52:35.805174 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3","Type":"ContainerStarted","Data":"8cb1623d35ef2d7eca156c4edb725f28fc1f44a03d62fbb4c87bcf4fb29f8627"} Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.758317 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-7wsq9"] Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.762197 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.777096 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wsq9"] Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.828365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c005771d-7e6d-44f8-93d0-6bf7c1db97d5","Type":"ContainerStarted","Data":"a4beb726441637e7581b2d89258db513413b93627498503e252066d061a1b6a9"} Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.891107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdjv\" (UniqueName: \"kubernetes.io/projected/a2e67390-de25-4df3-924f-c3d04c3be4ec-kube-api-access-ccdjv\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.891536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-utilities\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.891818 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-catalog-content\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:37 
crc kubenswrapper[4771]: I0219 22:52:37.993211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-utilities\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.993655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-catalog-content\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.993850 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-utilities\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.994170 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-catalog-content\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:37 crc kubenswrapper[4771]: I0219 22:52:37.994850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccdjv\" (UniqueName: \"kubernetes.io/projected/a2e67390-de25-4df3-924f-c3d04c3be4ec-kube-api-access-ccdjv\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 
22:52:38.017732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccdjv\" (UniqueName: \"kubernetes.io/projected/a2e67390-de25-4df3-924f-c3d04c3be4ec-kube-api-access-ccdjv\") pod \"certified-operators-7wsq9\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") " pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.142349 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wsq9" Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.547595 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.547845 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.585494 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.628855 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wsq9"] Feb 19 22:52:38 crc kubenswrapper[4771]: W0219 22:52:38.629643 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2e67390_de25_4df3_924f_c3d04c3be4ec.slice/crio-a5c1031f9146070348e519e433a0e99e4b96634bdee212b099c6e70d40e28758 WatchSource:0}: Error finding container a5c1031f9146070348e519e433a0e99e4b96634bdee212b099c6e70d40e28758: Status 404 returned error can't find the container with id a5c1031f9146070348e519e433a0e99e4b96634bdee212b099c6e70d40e28758 Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.837234 4771 generic.go:334] "Generic (PLEG): container finished" podID="a2e67390-de25-4df3-924f-c3d04c3be4ec" 
containerID="eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c" exitCode=0 Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.837334 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wsq9" event={"ID":"a2e67390-de25-4df3-924f-c3d04c3be4ec","Type":"ContainerDied","Data":"eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c"} Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.837766 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wsq9" event={"ID":"a2e67390-de25-4df3-924f-c3d04c3be4ec","Type":"ContainerStarted","Data":"a5c1031f9146070348e519e433a0e99e4b96634bdee212b099c6e70d40e28758"} Feb 19 22:52:38 crc kubenswrapper[4771]: I0219 22:52:38.888953 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8b54t" Feb 19 22:52:39 crc kubenswrapper[4771]: I0219 22:52:39.850194 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wsq9" event={"ID":"a2e67390-de25-4df3-924f-c3d04c3be4ec","Type":"ContainerStarted","Data":"cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12"} Feb 19 22:52:40 crc kubenswrapper[4771]: I0219 22:52:40.860803 4771 generic.go:334] "Generic (PLEG): container finished" podID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerID="cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12" exitCode=0 Feb 19 22:52:40 crc kubenswrapper[4771]: I0219 22:52:40.860920 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wsq9" event={"ID":"a2e67390-de25-4df3-924f-c3d04c3be4ec","Type":"ContainerDied","Data":"cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12"} Feb 19 22:52:40 crc kubenswrapper[4771]: I0219 22:52:40.931005 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8b54t"] 
Feb 19 22:52:40 crc kubenswrapper[4771]: I0219 22:52:40.931366 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8b54t" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerName="registry-server" containerID="cri-o://72d0a196f4bd1c66c41f979d9864927e289af32224e6bc83398cf10937729234" gracePeriod=2
Feb 19 22:52:41 crc kubenswrapper[4771]: I0219 22:52:41.908179 4771 generic.go:334] "Generic (PLEG): container finished" podID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerID="72d0a196f4bd1c66c41f979d9864927e289af32224e6bc83398cf10937729234" exitCode=0
Feb 19 22:52:41 crc kubenswrapper[4771]: I0219 22:52:41.908481 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8b54t" event={"ID":"a83af5ef-eff9-4f6a-8f60-41376c58a64b","Type":"ContainerDied","Data":"72d0a196f4bd1c66c41f979d9864927e289af32224e6bc83398cf10937729234"}
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.008591 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8b54t"
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.173388 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-catalog-content\") pod \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") "
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.173572 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9kdh\" (UniqueName: \"kubernetes.io/projected/a83af5ef-eff9-4f6a-8f60-41376c58a64b-kube-api-access-x9kdh\") pod \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") "
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.173748 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-utilities\") pod \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\" (UID: \"a83af5ef-eff9-4f6a-8f60-41376c58a64b\") "
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.175058 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-utilities" (OuterVolumeSpecName: "utilities") pod "a83af5ef-eff9-4f6a-8f60-41376c58a64b" (UID: "a83af5ef-eff9-4f6a-8f60-41376c58a64b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.182316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83af5ef-eff9-4f6a-8f60-41376c58a64b-kube-api-access-x9kdh" (OuterVolumeSpecName: "kube-api-access-x9kdh") pod "a83af5ef-eff9-4f6a-8f60-41376c58a64b" (UID: "a83af5ef-eff9-4f6a-8f60-41376c58a64b"). InnerVolumeSpecName "kube-api-access-x9kdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.217480 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a83af5ef-eff9-4f6a-8f60-41376c58a64b" (UID: "a83af5ef-eff9-4f6a-8f60-41376c58a64b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.276345 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.276410 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9kdh\" (UniqueName: \"kubernetes.io/projected/a83af5ef-eff9-4f6a-8f60-41376c58a64b-kube-api-access-x9kdh\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.276440 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a83af5ef-eff9-4f6a-8f60-41376c58a64b-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.920210 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wsq9" event={"ID":"a2e67390-de25-4df3-924f-c3d04c3be4ec","Type":"ContainerStarted","Data":"8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9"}
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.924486 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8b54t" event={"ID":"a83af5ef-eff9-4f6a-8f60-41376c58a64b","Type":"ContainerDied","Data":"914468f9d44110b18666909151c31519991cc65916792e8a495f7922400a9fb1"}
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.924527 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8b54t"
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.924553 4771 scope.go:117] "RemoveContainer" containerID="72d0a196f4bd1c66c41f979d9864927e289af32224e6bc83398cf10937729234"
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.947870 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7wsq9" podStartSLOduration=2.687051324 podStartE2EDuration="5.947843426s" podCreationTimestamp="2026-02-19 22:52:37 +0000 UTC" firstStartedPulling="2026-02-19 22:52:38.840427181 +0000 UTC m=+5059.111869691" lastFinishedPulling="2026-02-19 22:52:42.101219293 +0000 UTC m=+5062.372661793" observedRunningTime="2026-02-19 22:52:42.945980207 +0000 UTC m=+5063.217422687" watchObservedRunningTime="2026-02-19 22:52:42.947843426 +0000 UTC m=+5063.219285936"
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.951013 4771 scope.go:117] "RemoveContainer" containerID="bd602aa023839e675fc058839f61623e2eef51c00944f0306c338a163a5df5ff"
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.957247 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.957290 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.973811 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8b54t"]
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.977520 4771 scope.go:117] "RemoveContainer" containerID="7b28ad1a6580f9742ae289673ba8985112c9e4b931bc7d2d0b309fadc21b0bb9"
Feb 19 22:52:42 crc kubenswrapper[4771]: I0219 22:52:42.983732 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8b54t"]
Feb 19 22:52:44 crc kubenswrapper[4771]: I0219 22:52:44.452855 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" path="/var/lib/kubelet/pods/a83af5ef-eff9-4f6a-8f60-41376c58a64b/volumes"
Feb 19 22:52:48 crc kubenswrapper[4771]: I0219 22:52:48.142991 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7wsq9"
Feb 19 22:52:48 crc kubenswrapper[4771]: I0219 22:52:48.143488 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7wsq9"
Feb 19 22:52:48 crc kubenswrapper[4771]: I0219 22:52:48.227078 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7wsq9"
Feb 19 22:52:49 crc kubenswrapper[4771]: I0219 22:52:49.068178 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7wsq9"
Feb 19 22:52:49 crc kubenswrapper[4771]: I0219 22:52:49.124873 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wsq9"]
Feb 19 22:52:50 crc kubenswrapper[4771]: I0219 22:52:50.996863 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7wsq9" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerName="registry-server" containerID="cri-o://8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9" gracePeriod=2
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.493886 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wsq9"
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.651422 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-catalog-content\") pod \"a2e67390-de25-4df3-924f-c3d04c3be4ec\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") "
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.651863 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccdjv\" (UniqueName: \"kubernetes.io/projected/a2e67390-de25-4df3-924f-c3d04c3be4ec-kube-api-access-ccdjv\") pod \"a2e67390-de25-4df3-924f-c3d04c3be4ec\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") "
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.652117 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-utilities\") pod \"a2e67390-de25-4df3-924f-c3d04c3be4ec\" (UID: \"a2e67390-de25-4df3-924f-c3d04c3be4ec\") "
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.653590 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-utilities" (OuterVolumeSpecName: "utilities") pod "a2e67390-de25-4df3-924f-c3d04c3be4ec" (UID: "a2e67390-de25-4df3-924f-c3d04c3be4ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.657948 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e67390-de25-4df3-924f-c3d04c3be4ec-kube-api-access-ccdjv" (OuterVolumeSpecName: "kube-api-access-ccdjv") pod "a2e67390-de25-4df3-924f-c3d04c3be4ec" (UID: "a2e67390-de25-4df3-924f-c3d04c3be4ec"). InnerVolumeSpecName "kube-api-access-ccdjv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.701226 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2e67390-de25-4df3-924f-c3d04c3be4ec" (UID: "a2e67390-de25-4df3-924f-c3d04c3be4ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.754251 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.754285 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccdjv\" (UniqueName: \"kubernetes.io/projected/a2e67390-de25-4df3-924f-c3d04c3be4ec-kube-api-access-ccdjv\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:51 crc kubenswrapper[4771]: I0219 22:52:51.754298 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e67390-de25-4df3-924f-c3d04c3be4ec-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.011075 4771 generic.go:334] "Generic (PLEG): container finished" podID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerID="8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9" exitCode=0
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.011165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wsq9" event={"ID":"a2e67390-de25-4df3-924f-c3d04c3be4ec","Type":"ContainerDied","Data":"8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9"}
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.011216 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wsq9" event={"ID":"a2e67390-de25-4df3-924f-c3d04c3be4ec","Type":"ContainerDied","Data":"a5c1031f9146070348e519e433a0e99e4b96634bdee212b099c6e70d40e28758"}
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.011225 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wsq9"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.011255 4771 scope.go:117] "RemoveContainer" containerID="8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.044615 4771 scope.go:117] "RemoveContainer" containerID="cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.071720 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wsq9"]
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.081643 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7wsq9"]
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.094999 4771 scope.go:117] "RemoveContainer" containerID="eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.143528 4771 scope.go:117] "RemoveContainer" containerID="8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9"
Feb 19 22:52:52 crc kubenswrapper[4771]: E0219 22:52:52.144223 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9\": container with ID starting with 8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9 not found: ID does not exist" containerID="8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.144298 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9"} err="failed to get container status \"8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9\": rpc error: code = NotFound desc = could not find container \"8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9\": container with ID starting with 8a0b20051dfcd8aaa74f8cfc77853041edb19721f1b72b849188b5e081e38eb9 not found: ID does not exist"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.144341 4771 scope.go:117] "RemoveContainer" containerID="cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12"
Feb 19 22:52:52 crc kubenswrapper[4771]: E0219 22:52:52.144997 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12\": container with ID starting with cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12 not found: ID does not exist" containerID="cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.145109 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12"} err="failed to get container status \"cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12\": rpc error: code = NotFound desc = could not find container \"cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12\": container with ID starting with cc60488ea47a3f721cbf79fca88c8b6d0a49742d09db8816e74c08df263b3b12 not found: ID does not exist"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.145169 4771 scope.go:117] "RemoveContainer" containerID="eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c"
Feb 19 22:52:52 crc kubenswrapper[4771]: E0219 22:52:52.145802 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c\": container with ID starting with eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c not found: ID does not exist" containerID="eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.145880 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c"} err="failed to get container status \"eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c\": rpc error: code = NotFound desc = could not find container \"eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c\": container with ID starting with eb7647509f494754a01311dc26d583a4590166212b07bebed07c0b65191aa74c not found: ID does not exist"
Feb 19 22:52:52 crc kubenswrapper[4771]: I0219 22:52:52.453120 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" path="/var/lib/kubelet/pods/a2e67390-de25-4df3-924f-c3d04c3be4ec/volumes"
Feb 19 22:53:09 crc kubenswrapper[4771]: I0219 22:53:09.196522 4771 generic.go:334] "Generic (PLEG): container finished" podID="58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3" containerID="8cb1623d35ef2d7eca156c4edb725f28fc1f44a03d62fbb4c87bcf4fb29f8627" exitCode=0
Feb 19 22:53:09 crc kubenswrapper[4771]: I0219 22:53:09.196650 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3","Type":"ContainerDied","Data":"8cb1623d35ef2d7eca156c4edb725f28fc1f44a03d62fbb4c87bcf4fb29f8627"}
Feb 19 22:53:10 crc kubenswrapper[4771]: I0219 22:53:10.207837 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3","Type":"ContainerStarted","Data":"876800f4442222b42b3439430e6b9e36680223fee439bb46f44eeab19342c1b5"}
Feb 19 22:53:10 crc kubenswrapper[4771]: I0219 22:53:10.208801 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 19 22:53:10 crc kubenswrapper[4771]: I0219 22:53:10.265882 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.265861472 podStartE2EDuration="38.265861472s" podCreationTimestamp="2026-02-19 22:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:53:10.247866261 +0000 UTC m=+5090.519308791" watchObservedRunningTime="2026-02-19 22:53:10.265861472 +0000 UTC m=+5090.537303942"
Feb 19 22:53:11 crc kubenswrapper[4771]: I0219 22:53:11.217704 4771 generic.go:334] "Generic (PLEG): container finished" podID="c005771d-7e6d-44f8-93d0-6bf7c1db97d5" containerID="a4beb726441637e7581b2d89258db513413b93627498503e252066d061a1b6a9" exitCode=0
Feb 19 22:53:11 crc kubenswrapper[4771]: I0219 22:53:11.217860 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c005771d-7e6d-44f8-93d0-6bf7c1db97d5","Type":"ContainerDied","Data":"a4beb726441637e7581b2d89258db513413b93627498503e252066d061a1b6a9"}
Feb 19 22:53:12 crc kubenswrapper[4771]: I0219 22:53:12.228807 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c005771d-7e6d-44f8-93d0-6bf7c1db97d5","Type":"ContainerStarted","Data":"7b39d924634118486adaca6d1de94049e4434c6b7f1f204f5bfc6a0f197faf68"}
Feb 19 22:53:12 crc kubenswrapper[4771]: I0219 22:53:12.229604 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:53:12 crc kubenswrapper[4771]: I0219 22:53:12.265068 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.265009134 podStartE2EDuration="39.265009134s" podCreationTimestamp="2026-02-19 22:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:53:12.254105473 +0000 UTC m=+5092.525547983" watchObservedRunningTime="2026-02-19 22:53:12.265009134 +0000 UTC m=+5092.536451644"
Feb 19 22:53:12 crc kubenswrapper[4771]: I0219 22:53:12.957392 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:53:12 crc kubenswrapper[4771]: I0219 22:53:12.957487 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:53:12 crc kubenswrapper[4771]: I0219 22:53:12.957553 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb"
Feb 19 22:53:12 crc kubenswrapper[4771]: I0219 22:53:12.958707 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8d83d2f9b637a931ed9fb55168bd29dffbae091a62af39ff035a8fae4582da4"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 22:53:12 crc kubenswrapper[4771]: I0219 22:53:12.958858 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://b8d83d2f9b637a931ed9fb55168bd29dffbae091a62af39ff035a8fae4582da4" gracePeriod=600
Feb 19 22:53:13 crc kubenswrapper[4771]: I0219 22:53:13.240774 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="b8d83d2f9b637a931ed9fb55168bd29dffbae091a62af39ff035a8fae4582da4" exitCode=0
Feb 19 22:53:13 crc kubenswrapper[4771]: I0219 22:53:13.240847 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"b8d83d2f9b637a931ed9fb55168bd29dffbae091a62af39ff035a8fae4582da4"}
Feb 19 22:53:13 crc kubenswrapper[4771]: I0219 22:53:13.241445 4771 scope.go:117] "RemoveContainer" containerID="88182133d6a3863270fca8ae4838728e1520fc8e9efd6b5b568c0117802081dc"
Feb 19 22:53:14 crc kubenswrapper[4771]: I0219 22:53:14.254459 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"}
Feb 19 22:53:23 crc kubenswrapper[4771]: I0219 22:53:23.193093 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 22:53:24 crc kubenswrapper[4771]: I0219 22:53:24.200688 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.285520 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:53:29 crc kubenswrapper[4771]: E0219 22:53:29.286578 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerName="extract-content"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.286600 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerName="extract-content"
Feb 19 22:53:29 crc kubenswrapper[4771]: E0219 22:53:29.286629 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerName="extract-utilities"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.286643 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerName="extract-utilities"
Feb 19 22:53:29 crc kubenswrapper[4771]: E0219 22:53:29.286681 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerName="extract-utilities"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.286693 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerName="extract-utilities"
Feb 19 22:53:29 crc kubenswrapper[4771]: E0219 22:53:29.286717 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerName="registry-server"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.286726 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerName="registry-server"
Feb 19 22:53:29 crc kubenswrapper[4771]: E0219 22:53:29.286744 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerName="extract-content"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.286754 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerName="extract-content"
Feb 19 22:53:29 crc kubenswrapper[4771]: E0219 22:53:29.286768 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerName="registry-server"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.286778 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerName="registry-server"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.286982 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83af5ef-eff9-4f6a-8f60-41376c58a64b" containerName="registry-server"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.287065 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e67390-de25-4df3-924f-c3d04c3be4ec" containerName="registry-server"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.287776 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.290890 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bxjzh"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.300405 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.401182 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjbvw\" (UniqueName: \"kubernetes.io/projected/add303ca-f3d8-40f3-a1db-d91697d2a287-kube-api-access-xjbvw\") pod \"mariadb-client\" (UID: \"add303ca-f3d8-40f3-a1db-d91697d2a287\") " pod="openstack/mariadb-client"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.503829 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjbvw\" (UniqueName: \"kubernetes.io/projected/add303ca-f3d8-40f3-a1db-d91697d2a287-kube-api-access-xjbvw\") pod \"mariadb-client\" (UID: \"add303ca-f3d8-40f3-a1db-d91697d2a287\") " pod="openstack/mariadb-client"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.531455 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjbvw\" (UniqueName: \"kubernetes.io/projected/add303ca-f3d8-40f3-a1db-d91697d2a287-kube-api-access-xjbvw\") pod \"mariadb-client\" (UID: \"add303ca-f3d8-40f3-a1db-d91697d2a287\") " pod="openstack/mariadb-client"
Feb 19 22:53:29 crc kubenswrapper[4771]: I0219 22:53:29.645073 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 19 22:53:30 crc kubenswrapper[4771]: I0219 22:53:30.234250 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:53:30 crc kubenswrapper[4771]: W0219 22:53:30.235261 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadd303ca_f3d8_40f3_a1db_d91697d2a287.slice/crio-2d19b8ff06f41e8f0279f9ac7d3a51e60d28d2ac59300b078a3ca7aa79bee46c WatchSource:0}: Error finding container 2d19b8ff06f41e8f0279f9ac7d3a51e60d28d2ac59300b078a3ca7aa79bee46c: Status 404 returned error can't find the container with id 2d19b8ff06f41e8f0279f9ac7d3a51e60d28d2ac59300b078a3ca7aa79bee46c
Feb 19 22:53:30 crc kubenswrapper[4771]: I0219 22:53:30.418566 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"add303ca-f3d8-40f3-a1db-d91697d2a287","Type":"ContainerStarted","Data":"2d19b8ff06f41e8f0279f9ac7d3a51e60d28d2ac59300b078a3ca7aa79bee46c"}
Feb 19 22:53:31 crc kubenswrapper[4771]: I0219 22:53:31.431409 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"add303ca-f3d8-40f3-a1db-d91697d2a287","Type":"ContainerStarted","Data":"20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe"}
Feb 19 22:53:31 crc kubenswrapper[4771]: I0219 22:53:31.453754 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.887706497 podStartE2EDuration="2.453697651s" podCreationTimestamp="2026-02-19 22:53:29 +0000 UTC" firstStartedPulling="2026-02-19 22:53:30.239755514 +0000 UTC m=+5110.511198004" lastFinishedPulling="2026-02-19 22:53:30.805746658 +0000 UTC m=+5111.077189158" observedRunningTime="2026-02-19 22:53:31.451978065 +0000 UTC m=+5111.723420635" watchObservedRunningTime="2026-02-19 22:53:31.453697651 +0000 UTC m=+5111.725140201"
Feb 19 22:53:45 crc kubenswrapper[4771]: I0219 22:53:45.517971 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:53:45 crc kubenswrapper[4771]: I0219 22:53:45.519680 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="add303ca-f3d8-40f3-a1db-d91697d2a287" containerName="mariadb-client" containerID="cri-o://20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe" gracePeriod=30
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.137049 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.203715 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjbvw\" (UniqueName: \"kubernetes.io/projected/add303ca-f3d8-40f3-a1db-d91697d2a287-kube-api-access-xjbvw\") pod \"add303ca-f3d8-40f3-a1db-d91697d2a287\" (UID: \"add303ca-f3d8-40f3-a1db-d91697d2a287\") "
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.213237 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add303ca-f3d8-40f3-a1db-d91697d2a287-kube-api-access-xjbvw" (OuterVolumeSpecName: "kube-api-access-xjbvw") pod "add303ca-f3d8-40f3-a1db-d91697d2a287" (UID: "add303ca-f3d8-40f3-a1db-d91697d2a287"). InnerVolumeSpecName "kube-api-access-xjbvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.305833 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjbvw\" (UniqueName: \"kubernetes.io/projected/add303ca-f3d8-40f3-a1db-d91697d2a287-kube-api-access-xjbvw\") on node \"crc\" DevicePath \"\""
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.568847 4771 generic.go:334] "Generic (PLEG): container finished" podID="add303ca-f3d8-40f3-a1db-d91697d2a287" containerID="20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe" exitCode=143
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.568940 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.568945 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"add303ca-f3d8-40f3-a1db-d91697d2a287","Type":"ContainerDied","Data":"20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe"}
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.570242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"add303ca-f3d8-40f3-a1db-d91697d2a287","Type":"ContainerDied","Data":"2d19b8ff06f41e8f0279f9ac7d3a51e60d28d2ac59300b078a3ca7aa79bee46c"}
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.570262 4771 scope.go:117] "RemoveContainer" containerID="20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe"
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.589369 4771 scope.go:117] "RemoveContainer" containerID="20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe"
Feb 19 22:53:46 crc kubenswrapper[4771]: E0219 22:53:46.589800 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe\": container with ID starting with 20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe not found: ID does not exist" containerID="20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe"
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.589850 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe"} err="failed to get container status \"20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe\": rpc error: code = NotFound desc = could not find container \"20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe\": container with ID starting with 20299512feb0917d227eb8ca4945112dd4e9dd6706b427742e016456c04750fe not found: ID does not exist"
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.597096 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:53:46 crc kubenswrapper[4771]: I0219 22:53:46.604605 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:53:48 crc kubenswrapper[4771]: I0219 22:53:48.454786 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add303ca-f3d8-40f3-a1db-d91697d2a287" path="/var/lib/kubelet/pods/add303ca-f3d8-40f3-a1db-d91697d2a287/volumes"
Feb 19 22:55:34 crc kubenswrapper[4771]: I0219 22:55:34.906897 4771 scope.go:117] "RemoveContainer" containerID="e73e2a891444595dc7bb40215c182f97a7ab94d43f8f8d22c7cc9703d2c3705d"
Feb 19 22:55:42 crc kubenswrapper[4771]: I0219 22:55:42.956716 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:55:42 crc kubenswrapper[4771]: I0219 22:55:42.957369 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:56:12 crc kubenswrapper[4771]: I0219 22:56:12.956993 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:56:12 crc kubenswrapper[4771]: I0219 22:56:12.957649 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:56:42 crc kubenswrapper[4771]: I0219 22:56:42.957201 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:56:42 crc kubenswrapper[4771]: I0219 22:56:42.959225 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:56:42 crc kubenswrapper[4771]: I0219 22:56:42.959296 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb"
Feb 19 22:56:43 crc kubenswrapper[4771]: I0219 22:56:43.088284 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 22:56:43 crc kubenswrapper[4771]: I0219 22:56:43.088439 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" gracePeriod=600
Feb 19 22:56:43 crc kubenswrapper[4771]: E0219 22:56:43.226960 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:56:44 crc kubenswrapper[4771]: I0219 22:56:44.100862 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" exitCode=0
Feb 19 22:56:44 crc kubenswrapper[4771]: I0219 22:56:44.100925 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"}
Feb 19 22:56:44 crc kubenswrapper[4771]: I0219 22:56:44.101056 4771 scope.go:117] "RemoveContainer"
containerID="b8d83d2f9b637a931ed9fb55168bd29dffbae091a62af39ff035a8fae4582da4" Feb 19 22:56:44 crc kubenswrapper[4771]: I0219 22:56:44.102238 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:56:44 crc kubenswrapper[4771]: E0219 22:56:44.102670 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:56:57 crc kubenswrapper[4771]: I0219 22:56:57.437550 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:56:57 crc kubenswrapper[4771]: E0219 22:56:57.440106 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:57:12 crc kubenswrapper[4771]: I0219 22:57:12.439121 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:57:12 crc kubenswrapper[4771]: E0219 22:57:12.440293 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:57:14 crc kubenswrapper[4771]: I0219 22:57:14.780466 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 22:57:14 crc kubenswrapper[4771]: E0219 22:57:14.781492 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add303ca-f3d8-40f3-a1db-d91697d2a287" containerName="mariadb-client" Feb 19 22:57:14 crc kubenswrapper[4771]: I0219 22:57:14.781519 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="add303ca-f3d8-40f3-a1db-d91697d2a287" containerName="mariadb-client" Feb 19 22:57:14 crc kubenswrapper[4771]: I0219 22:57:14.781817 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="add303ca-f3d8-40f3-a1db-d91697d2a287" containerName="mariadb-client" Feb 19 22:57:14 crc kubenswrapper[4771]: I0219 22:57:14.782787 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 19 22:57:14 crc kubenswrapper[4771]: I0219 22:57:14.785809 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bxjzh" Feb 19 22:57:14 crc kubenswrapper[4771]: I0219 22:57:14.790839 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 22:57:14 crc kubenswrapper[4771]: I0219 22:57:14.938471 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k64nv\" (UniqueName: \"kubernetes.io/projected/cefa8dc6-536d-442a-a4ea-b003328c2d78-kube-api-access-k64nv\") pod \"mariadb-copy-data\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") " pod="openstack/mariadb-copy-data" Feb 19 22:57:14 crc kubenswrapper[4771]: I0219 22:57:14.938587 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8774242e-6773-4472-85a3-4f778be5386a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a\") pod \"mariadb-copy-data\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") " pod="openstack/mariadb-copy-data" Feb 19 22:57:15 crc kubenswrapper[4771]: I0219 22:57:15.040277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8774242e-6773-4472-85a3-4f778be5386a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a\") pod \"mariadb-copy-data\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") " pod="openstack/mariadb-copy-data" Feb 19 22:57:15 crc kubenswrapper[4771]: I0219 22:57:15.040464 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k64nv\" (UniqueName: \"kubernetes.io/projected/cefa8dc6-536d-442a-a4ea-b003328c2d78-kube-api-access-k64nv\") pod \"mariadb-copy-data\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") " pod="openstack/mariadb-copy-data" Feb 19 22:57:15 crc kubenswrapper[4771]: I0219 22:57:15.044703 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:57:15 crc kubenswrapper[4771]: I0219 22:57:15.044776 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8774242e-6773-4472-85a3-4f778be5386a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a\") pod \"mariadb-copy-data\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ec87746538fd1839047f77bf58a65c8c196b09e3cb685b5347deb241895e3671/globalmount\"" pod="openstack/mariadb-copy-data" Feb 19 22:57:15 crc kubenswrapper[4771]: I0219 22:57:15.069525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k64nv\" (UniqueName: \"kubernetes.io/projected/cefa8dc6-536d-442a-a4ea-b003328c2d78-kube-api-access-k64nv\") pod \"mariadb-copy-data\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") " pod="openstack/mariadb-copy-data" Feb 19 22:57:15 crc kubenswrapper[4771]: I0219 22:57:15.100291 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8774242e-6773-4472-85a3-4f778be5386a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a\") pod \"mariadb-copy-data\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") " pod="openstack/mariadb-copy-data" Feb 19 22:57:15 crc kubenswrapper[4771]: I0219 22:57:15.122483 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 19 22:57:15 crc kubenswrapper[4771]: I0219 22:57:15.749490 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 22:57:15 crc kubenswrapper[4771]: W0219 22:57:15.754475 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcefa8dc6_536d_442a_a4ea_b003328c2d78.slice/crio-31f5396eff18e48b74aeddb6722a55e01fa8db6526de54ebebacaee4ebb3984c WatchSource:0}: Error finding container 31f5396eff18e48b74aeddb6722a55e01fa8db6526de54ebebacaee4ebb3984c: Status 404 returned error can't find the container with id 31f5396eff18e48b74aeddb6722a55e01fa8db6526de54ebebacaee4ebb3984c Feb 19 22:57:16 crc kubenswrapper[4771]: I0219 22:57:16.407928 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"cefa8dc6-536d-442a-a4ea-b003328c2d78","Type":"ContainerStarted","Data":"522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea"} Feb 19 22:57:16 crc kubenswrapper[4771]: I0219 22:57:16.407993 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"cefa8dc6-536d-442a-a4ea-b003328c2d78","Type":"ContainerStarted","Data":"31f5396eff18e48b74aeddb6722a55e01fa8db6526de54ebebacaee4ebb3984c"} Feb 19 22:57:16 crc kubenswrapper[4771]: I0219 22:57:16.430835 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.430807511 podStartE2EDuration="3.430807511s" podCreationTimestamp="2026-02-19 22:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:57:16.427223416 +0000 UTC m=+5336.698665976" watchObservedRunningTime="2026-02-19 22:57:16.430807511 +0000 UTC m=+5336.702250031" Feb 19 22:57:19 crc kubenswrapper[4771]: I0219 22:57:19.508112 4771 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:19 crc kubenswrapper[4771]: I0219 22:57:19.509235 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:57:19 crc kubenswrapper[4771]: I0219 22:57:19.516781 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:19 crc kubenswrapper[4771]: I0219 22:57:19.621148 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vwt\" (UniqueName: \"kubernetes.io/projected/783b5fb3-8346-478a-bfe4-3d4dce43d27e-kube-api-access-46vwt\") pod \"mariadb-client\" (UID: \"783b5fb3-8346-478a-bfe4-3d4dce43d27e\") " pod="openstack/mariadb-client" Feb 19 22:57:19 crc kubenswrapper[4771]: I0219 22:57:19.722424 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vwt\" (UniqueName: \"kubernetes.io/projected/783b5fb3-8346-478a-bfe4-3d4dce43d27e-kube-api-access-46vwt\") pod \"mariadb-client\" (UID: \"783b5fb3-8346-478a-bfe4-3d4dce43d27e\") " pod="openstack/mariadb-client" Feb 19 22:57:19 crc kubenswrapper[4771]: I0219 22:57:19.755960 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vwt\" (UniqueName: \"kubernetes.io/projected/783b5fb3-8346-478a-bfe4-3d4dce43d27e-kube-api-access-46vwt\") pod \"mariadb-client\" (UID: \"783b5fb3-8346-478a-bfe4-3d4dce43d27e\") " pod="openstack/mariadb-client" Feb 19 22:57:19 crc kubenswrapper[4771]: I0219 22:57:19.843407 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:57:20 crc kubenswrapper[4771]: W0219 22:57:20.098175 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783b5fb3_8346_478a_bfe4_3d4dce43d27e.slice/crio-24916f0dfa0c454a948550b4120e0e6fd311d30002099b45652305375b9bdc43 WatchSource:0}: Error finding container 24916f0dfa0c454a948550b4120e0e6fd311d30002099b45652305375b9bdc43: Status 404 returned error can't find the container with id 24916f0dfa0c454a948550b4120e0e6fd311d30002099b45652305375b9bdc43 Feb 19 22:57:20 crc kubenswrapper[4771]: I0219 22:57:20.102426 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:20 crc kubenswrapper[4771]: I0219 22:57:20.442406 4771 generic.go:334] "Generic (PLEG): container finished" podID="783b5fb3-8346-478a-bfe4-3d4dce43d27e" containerID="4613b11501b2a7a92c214c2eeb1a4f17b6de53a2a454911e4b3b855e5518a9af" exitCode=0 Feb 19 22:57:20 crc kubenswrapper[4771]: I0219 22:57:20.457513 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"783b5fb3-8346-478a-bfe4-3d4dce43d27e","Type":"ContainerDied","Data":"4613b11501b2a7a92c214c2eeb1a4f17b6de53a2a454911e4b3b855e5518a9af"} Feb 19 22:57:20 crc kubenswrapper[4771]: I0219 22:57:20.457567 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"783b5fb3-8346-478a-bfe4-3d4dce43d27e","Type":"ContainerStarted","Data":"24916f0dfa0c454a948550b4120e0e6fd311d30002099b45652305375b9bdc43"} Feb 19 22:57:21 crc kubenswrapper[4771]: I0219 22:57:21.947821 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:57:21 crc kubenswrapper[4771]: I0219 22:57:21.984815 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_783b5fb3-8346-478a-bfe4-3d4dce43d27e/mariadb-client/0.log" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.026305 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.038009 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.084217 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46vwt\" (UniqueName: \"kubernetes.io/projected/783b5fb3-8346-478a-bfe4-3d4dce43d27e-kube-api-access-46vwt\") pod \"783b5fb3-8346-478a-bfe4-3d4dce43d27e\" (UID: \"783b5fb3-8346-478a-bfe4-3d4dce43d27e\") " Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.092395 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783b5fb3-8346-478a-bfe4-3d4dce43d27e-kube-api-access-46vwt" (OuterVolumeSpecName: "kube-api-access-46vwt") pod "783b5fb3-8346-478a-bfe4-3d4dce43d27e" (UID: "783b5fb3-8346-478a-bfe4-3d4dce43d27e"). InnerVolumeSpecName "kube-api-access-46vwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.186529 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46vwt\" (UniqueName: \"kubernetes.io/projected/783b5fb3-8346-478a-bfe4-3d4dce43d27e-kube-api-access-46vwt\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.193584 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:22 crc kubenswrapper[4771]: E0219 22:57:22.194083 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783b5fb3-8346-478a-bfe4-3d4dce43d27e" containerName="mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.194111 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="783b5fb3-8346-478a-bfe4-3d4dce43d27e" containerName="mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.194383 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="783b5fb3-8346-478a-bfe4-3d4dce43d27e" containerName="mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.195171 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.206826 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.288529 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6wk8\" (UniqueName: \"kubernetes.io/projected/07a06dd4-9903-4436-92e9-ecefec2789ab-kube-api-access-w6wk8\") pod \"mariadb-client\" (UID: \"07a06dd4-9903-4436-92e9-ecefec2789ab\") " pod="openstack/mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.390334 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6wk8\" (UniqueName: \"kubernetes.io/projected/07a06dd4-9903-4436-92e9-ecefec2789ab-kube-api-access-w6wk8\") pod \"mariadb-client\" (UID: \"07a06dd4-9903-4436-92e9-ecefec2789ab\") " pod="openstack/mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.419154 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6wk8\" (UniqueName: \"kubernetes.io/projected/07a06dd4-9903-4436-92e9-ecefec2789ab-kube-api-access-w6wk8\") pod \"mariadb-client\" (UID: \"07a06dd4-9903-4436-92e9-ecefec2789ab\") " pod="openstack/mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.454423 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783b5fb3-8346-478a-bfe4-3d4dce43d27e" path="/var/lib/kubelet/pods/783b5fb3-8346-478a-bfe4-3d4dce43d27e/volumes" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.461722 4771 scope.go:117] "RemoveContainer" containerID="4613b11501b2a7a92c214c2eeb1a4f17b6de53a2a454911e4b3b855e5518a9af" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.461736 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: I0219 22:57:22.523837 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:57:22 crc kubenswrapper[4771]: E0219 22:57:22.657347 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783b5fb3_8346_478a_bfe4_3d4dce43d27e.slice/crio-24916f0dfa0c454a948550b4120e0e6fd311d30002099b45652305375b9bdc43\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783b5fb3_8346_478a_bfe4_3d4dce43d27e.slice\": RecentStats: unable to find data in memory cache]" Feb 19 22:57:23 crc kubenswrapper[4771]: I0219 22:57:23.041201 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:23 crc kubenswrapper[4771]: W0219 22:57:23.044512 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07a06dd4_9903_4436_92e9_ecefec2789ab.slice/crio-b21ae608540f0d084a97761d2fcce6317c6a5f014252fed061767bc8d1aa2264 WatchSource:0}: Error finding container b21ae608540f0d084a97761d2fcce6317c6a5f014252fed061767bc8d1aa2264: Status 404 returned error can't find the container with id b21ae608540f0d084a97761d2fcce6317c6a5f014252fed061767bc8d1aa2264 Feb 19 22:57:23 crc kubenswrapper[4771]: I0219 22:57:23.477101 4771 generic.go:334] "Generic (PLEG): container finished" podID="07a06dd4-9903-4436-92e9-ecefec2789ab" containerID="5d3bd9f2eb48ede7ab46d1795ce774caf4c52d267590a311874b28d8a84eb867" exitCode=0 Feb 19 22:57:23 crc kubenswrapper[4771]: I0219 22:57:23.477163 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"07a06dd4-9903-4436-92e9-ecefec2789ab","Type":"ContainerDied","Data":"5d3bd9f2eb48ede7ab46d1795ce774caf4c52d267590a311874b28d8a84eb867"} Feb 19 22:57:23 crc kubenswrapper[4771]: I0219 22:57:23.477205 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"07a06dd4-9903-4436-92e9-ecefec2789ab","Type":"ContainerStarted","Data":"b21ae608540f0d084a97761d2fcce6317c6a5f014252fed061767bc8d1aa2264"} Feb 19 22:57:24 crc kubenswrapper[4771]: I0219 22:57:24.934719 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:57:24 crc kubenswrapper[4771]: I0219 22:57:24.966487 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_07a06dd4-9903-4436-92e9-ecefec2789ab/mariadb-client/0.log" Feb 19 22:57:25 crc kubenswrapper[4771]: I0219 22:57:25.001951 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:25 crc kubenswrapper[4771]: I0219 22:57:25.010736 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:57:25 crc kubenswrapper[4771]: I0219 22:57:25.033364 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6wk8\" (UniqueName: \"kubernetes.io/projected/07a06dd4-9903-4436-92e9-ecefec2789ab-kube-api-access-w6wk8\") pod \"07a06dd4-9903-4436-92e9-ecefec2789ab\" (UID: \"07a06dd4-9903-4436-92e9-ecefec2789ab\") " Feb 19 22:57:25 crc kubenswrapper[4771]: I0219 22:57:25.042313 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a06dd4-9903-4436-92e9-ecefec2789ab-kube-api-access-w6wk8" (OuterVolumeSpecName: "kube-api-access-w6wk8") pod "07a06dd4-9903-4436-92e9-ecefec2789ab" (UID: "07a06dd4-9903-4436-92e9-ecefec2789ab"). InnerVolumeSpecName "kube-api-access-w6wk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:57:25 crc kubenswrapper[4771]: I0219 22:57:25.135590 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6wk8\" (UniqueName: \"kubernetes.io/projected/07a06dd4-9903-4436-92e9-ecefec2789ab-kube-api-access-w6wk8\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:25 crc kubenswrapper[4771]: I0219 22:57:25.499618 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b21ae608540f0d084a97761d2fcce6317c6a5f014252fed061767bc8d1aa2264" Feb 19 22:57:25 crc kubenswrapper[4771]: I0219 22:57:25.499642 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:57:26 crc kubenswrapper[4771]: I0219 22:57:26.449841 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a06dd4-9903-4436-92e9-ecefec2789ab" path="/var/lib/kubelet/pods/07a06dd4-9903-4436-92e9-ecefec2789ab/volumes" Feb 19 22:57:27 crc kubenswrapper[4771]: I0219 22:57:27.437788 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:57:27 crc kubenswrapper[4771]: E0219 22:57:27.438244 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:57:39 crc kubenswrapper[4771]: I0219 22:57:39.438891 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:57:39 crc kubenswrapper[4771]: E0219 22:57:39.439991 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:57:50 crc kubenswrapper[4771]: I0219 22:57:50.445624 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:57:50 crc kubenswrapper[4771]: E0219 22:57:50.446760 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.033043 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 22:57:59 crc kubenswrapper[4771]: E0219 22:57:59.034054 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a06dd4-9903-4436-92e9-ecefec2789ab" containerName="mariadb-client" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.034075 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a06dd4-9903-4436-92e9-ecefec2789ab" containerName="mariadb-client" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.034343 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a06dd4-9903-4436-92e9-ecefec2789ab" containerName="mariadb-client" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.035697 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.042002 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.042307 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.042393 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.042605 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dtzhj" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.042943 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.043974 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.054626 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.056130 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.077183 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.080335 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.089090 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.116672 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.152256 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.152321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.152486 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrn9\" (UniqueName: \"kubernetes.io/projected/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-kube-api-access-lbrn9\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.152551 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.152602 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-df58ac5d-452c-48c5-99fd-2ef13c382123\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df58ac5d-452c-48c5-99fd-2ef13c382123\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.152665 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.152707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.152738 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-config\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254256 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254338 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eab579e4-8f4f-482b-802e-5fe20bb28ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eab579e4-8f4f-482b-802e-5fe20bb28ff7\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254400 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254494 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27e9f00-5617-42ce-ab8b-16002c5ac59b-config\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254564 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254597 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pvc-755d6ed8-4d81-4272-a671-812c4cb37069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-755d6ed8-4d81-4272-a671-812c4cb37069\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrn9\" (UniqueName: \"kubernetes.io/projected/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-kube-api-access-lbrn9\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254728 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7fv2\" (UniqueName: \"kubernetes.io/projected/c27e9f00-5617-42ce-ab8b-16002c5ac59b-kube-api-access-r7fv2\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254761 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-config\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254796 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-df58ac5d-452c-48c5-99fd-2ef13c382123\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df58ac5d-452c-48c5-99fd-2ef13c382123\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254836 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c27e9f00-5617-42ce-ab8b-16002c5ac59b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254863 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.254873 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.256433 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.256755 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: 
\"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.256864 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.257042 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.257093 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.257125 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-config\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.257169 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtgqj\" (UniqueName: \"kubernetes.io/projected/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-kube-api-access-mtgqj\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc 
kubenswrapper[4771]: I0219 22:57:59.257199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.257242 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c27e9f00-5617-42ce-ab8b-16002c5ac59b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.257299 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.257327 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.258331 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-config\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.258827 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.258866 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-df58ac5d-452c-48c5-99fd-2ef13c382123\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df58ac5d-452c-48c5-99fd-2ef13c382123\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/37dd7dcfa19d31d03f491a247dde0f16c41b454d8b2cc53f708086f0831694d1/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.267166 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.268989 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.281562 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.282204 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrn9\" (UniqueName: 
\"kubernetes.io/projected/7ebf8641-9285-4ae4-afc9-0cf3b1bc585e-kube-api-access-lbrn9\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.292974 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-df58ac5d-452c-48c5-99fd-2ef13c382123\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df58ac5d-452c-48c5-99fd-2ef13c382123\") pod \"ovsdbserver-nb-0\" (UID: \"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.358890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.358931 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-755d6ed8-4d81-4272-a671-812c4cb37069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-755d6ed8-4d81-4272-a671-812c4cb37069\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.358967 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7fv2\" (UniqueName: \"kubernetes.io/projected/c27e9f00-5617-42ce-ab8b-16002c5ac59b-kube-api-access-r7fv2\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.358985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-config\") pod 
\"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c27e9f00-5617-42ce-ab8b-16002c5ac59b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359071 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359091 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtgqj\" (UniqueName: \"kubernetes.io/projected/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-kube-api-access-mtgqj\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc 
kubenswrapper[4771]: I0219 22:57:59.359133 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c27e9f00-5617-42ce-ab8b-16002c5ac59b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eab579e4-8f4f-482b-802e-5fe20bb28ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eab579e4-8f4f-482b-802e-5fe20bb28ff7\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359236 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.359260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27e9f00-5617-42ce-ab8b-16002c5ac59b-config\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.360762 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.361896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-config\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.362725 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c27e9f00-5617-42ce-ab8b-16002c5ac59b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.363338 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27e9f00-5617-42ce-ab8b-16002c5ac59b-config\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc 
kubenswrapper[4771]: I0219 22:57:59.363484 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.363710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c27e9f00-5617-42ce-ab8b-16002c5ac59b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.364597 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.364634 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eab579e4-8f4f-482b-802e-5fe20bb28ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eab579e4-8f4f-482b-802e-5fe20bb28ff7\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e33b778a0e1235fde8f2d2cd67b0410c01e70a0ea23f995cc50d60ed9df8ce37/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.365515 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.367464 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.369548 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.371376 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.372188 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.372520 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.373453 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-755d6ed8-4d81-4272-a671-812c4cb37069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-755d6ed8-4d81-4272-a671-812c4cb37069\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f90d2efffa4a82a59a81b94e650ca424ea992fea16a0c295aa5dacb730a78c28/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.376894 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.382640 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27e9f00-5617-42ce-ab8b-16002c5ac59b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.391891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7fv2\" (UniqueName: \"kubernetes.io/projected/c27e9f00-5617-42ce-ab8b-16002c5ac59b-kube-api-access-r7fv2\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.393512 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtgqj\" (UniqueName: \"kubernetes.io/projected/e51891da-e934-4e4e-afb0-dd9f7bc4a6ef-kube-api-access-mtgqj\") pod \"ovsdbserver-nb-2\" (UID: 
\"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.422807 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eab579e4-8f4f-482b-802e-5fe20bb28ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eab579e4-8f4f-482b-802e-5fe20bb28ff7\") pod \"ovsdbserver-nb-2\" (UID: \"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.446088 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-755d6ed8-4d81-4272-a671-812c4cb37069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-755d6ed8-4d81-4272-a671-812c4cb37069\") pod \"ovsdbserver-nb-1\" (UID: \"c27e9f00-5617-42ce-ab8b-16002c5ac59b\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.682456 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.698852 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 19 22:57:59 crc kubenswrapper[4771]: I0219 22:57:59.970766 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.070746 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.162158 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 19 22:58:00 crc kubenswrapper[4771]: W0219 22:58:00.265476 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ebf8641_9285_4ae4_afc9_0cf3b1bc585e.slice/crio-9f6ac668150ecdaabf67e82c423e53711ba867c631c770393640081d9689d0cb WatchSource:0}: Error finding container 9f6ac668150ecdaabf67e82c423e53711ba867c631c770393640081d9689d0cb: Status 404 returned error can't find the container with id 9f6ac668150ecdaabf67e82c423e53711ba867c631c770393640081d9689d0cb Feb 19 22:58:00 crc kubenswrapper[4771]: W0219 22:58:00.267276 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc27e9f00_5617_42ce_ab8b_16002c5ac59b.slice/crio-60fc96f9794fb09c4dd63ef657cbeaa317710a21b1675ac5ea860242676a696c WatchSource:0}: Error finding container 60fc96f9794fb09c4dd63ef657cbeaa317710a21b1675ac5ea860242676a696c: Status 404 returned error can't find the container with id 60fc96f9794fb09c4dd63ef657cbeaa317710a21b1675ac5ea860242676a696c Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.601282 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.605583 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.613117 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.614243 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.614837 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dthjl" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.615353 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.615712 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.623145 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.627608 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.636654 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.639169 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.642397 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.649887 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.785817 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.785885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.785930 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.785963 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786048 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786088 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-config\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786122 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e919c435-9ec5-4d87-9c7a-de6c080160d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e919c435-9ec5-4d87-9c7a-de6c080160d9\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786194 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f88597b7-4903-4e70-96e6-5dd93135525c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f88597b7-4903-4e70-96e6-5dd93135525c\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786255 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786292 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-config\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786327 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786462 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfpj4\" (UniqueName: \"kubernetes.io/projected/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-kube-api-access-mfpj4\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786608 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786685 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786727 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786801 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786847 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6hwnk\" (UniqueName: \"kubernetes.io/projected/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-kube-api-access-6hwnk\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786890 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-config\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.786933 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmdf6\" (UniqueName: \"kubernetes.io/projected/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-kube-api-access-fmdf6\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.787010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.787112 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-27350c03-1bc6-4d6b-acc8-0dad50e212e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27350c03-1bc6-4d6b-acc8-0dad50e212e7\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.824969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" 
event={"ID":"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef","Type":"ContainerStarted","Data":"b61d2b87c3b2e5b98876007570dbfb7c6b8e6ca4ce11da48822531006f18af50"} Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.827980 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"c27e9f00-5617-42ce-ab8b-16002c5ac59b","Type":"ContainerStarted","Data":"60fc96f9794fb09c4dd63ef657cbeaa317710a21b1675ac5ea860242676a696c"} Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.829326 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e","Type":"ContainerStarted","Data":"9f6ac668150ecdaabf67e82c423e53711ba867c631c770393640081d9689d0cb"} Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889398 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f88597b7-4903-4e70-96e6-5dd93135525c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f88597b7-4903-4e70-96e6-5dd93135525c\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889516 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-config\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889554 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889596 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfpj4\" (UniqueName: \"kubernetes.io/projected/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-kube-api-access-mfpj4\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889701 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889733 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889769 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889820 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwnk\" (UniqueName: \"kubernetes.io/projected/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-kube-api-access-6hwnk\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-config\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889933 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmdf6\" (UniqueName: \"kubernetes.io/projected/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-kube-api-access-fmdf6\") pod \"ovsdbserver-sb-0\" (UID: 
\"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.889976 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890050 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-27350c03-1bc6-4d6b-acc8-0dad50e212e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27350c03-1bc6-4d6b-acc8-0dad50e212e7\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890139 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890264 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890326 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-config\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890406 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e919c435-9ec5-4d87-9c7a-de6c080160d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e919c435-9ec5-4d87-9c7a-de6c080160d9\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.890439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.892275 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.893176 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-config\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.893205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.893244 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.893996 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-config\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.896430 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.896481 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.897479 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.899183 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-config\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.899477 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.899538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.899797 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: 
\"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.900409 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.903657 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.911599 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.911766 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.912394 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.912461 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.912476 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e919c435-9ec5-4d87-9c7a-de6c080160d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e919c435-9ec5-4d87-9c7a-de6c080160d9\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3ced1ebbcf6be163cf6839cd65dcbca1853855f0cc4d7e8607f165c3546999dd/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.912574 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.912618 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.912648 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f88597b7-4903-4e70-96e6-5dd93135525c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f88597b7-4903-4e70-96e6-5dd93135525c\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dd0057ea1dcec96c10557150c3adc33927769dfcbe1d083cb2766437aa0096e8/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.913252 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.913292 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-27350c03-1bc6-4d6b-acc8-0dad50e212e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27350c03-1bc6-4d6b-acc8-0dad50e212e7\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/80833593cf8b2241b3041912bbfdd23cd3750f29bdcd3bc10cebf08921591396/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.916613 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwnk\" (UniqueName: \"kubernetes.io/projected/624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a-kube-api-access-6hwnk\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.927002 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfpj4\" (UniqueName: \"kubernetes.io/projected/8a6bd6eb-9513-46d0-be30-a3ca00254bc1-kube-api-access-mfpj4\") pod 
\"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.928881 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmdf6\" (UniqueName: \"kubernetes.io/projected/be5a39cb-6ede-4b4e-9b99-239e7bbbf830-kube-api-access-fmdf6\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.944533 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e919c435-9ec5-4d87-9c7a-de6c080160d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e919c435-9ec5-4d87-9c7a-de6c080160d9\") pod \"ovsdbserver-sb-0\" (UID: \"be5a39cb-6ede-4b4e-9b99-239e7bbbf830\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.953497 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f88597b7-4903-4e70-96e6-5dd93135525c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f88597b7-4903-4e70-96e6-5dd93135525c\") pod \"ovsdbserver-sb-2\" (UID: \"8a6bd6eb-9513-46d0-be30-a3ca00254bc1\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:00 crc kubenswrapper[4771]: I0219 22:58:00.954006 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-27350c03-1bc6-4d6b-acc8-0dad50e212e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27350c03-1bc6-4d6b-acc8-0dad50e212e7\") pod \"ovsdbserver-sb-1\" (UID: \"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.099786 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.236804 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.238710 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.437695 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:58:01 crc kubenswrapper[4771]: E0219 22:58:01.437970 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.717496 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 22:58:01 crc kubenswrapper[4771]: W0219 22:58:01.724152 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe5a39cb_6ede_4b4e_9b99_239e7bbbf830.slice/crio-a2bf2a68a151fd50388126f89eb5df2c829b55d8ffa8b985b59fe1f30eb04478 WatchSource:0}: Error finding container a2bf2a68a151fd50388126f89eb5df2c829b55d8ffa8b985b59fe1f30eb04478: Status 404 returned error can't find the container with id a2bf2a68a151fd50388126f89eb5df2c829b55d8ffa8b985b59fe1f30eb04478 Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.838848 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.839146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" 
event={"ID":"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef","Type":"ContainerStarted","Data":"e05d581cb7c1cf2ffa4cc48559f1c5e68f6884bcfe991e786db557b137f70d8a"} Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.840235 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"be5a39cb-6ede-4b4e-9b99-239e7bbbf830","Type":"ContainerStarted","Data":"a2bf2a68a151fd50388126f89eb5df2c829b55d8ffa8b985b59fe1f30eb04478"} Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.841501 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"c27e9f00-5617-42ce-ab8b-16002c5ac59b","Type":"ContainerStarted","Data":"0474de15f383ea15529fb3f349482922ce2fbd3e313d0edb0a6ef12ed1bfe74f"} Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.843172 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e","Type":"ContainerStarted","Data":"e83f0eb7804b1f771738afffb18ab2e316b0756ac4cc7e18ebddee2fc89b722c"} Feb 19 22:58:01 crc kubenswrapper[4771]: W0219 22:58:01.844825 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a6bd6eb_9513_46d0_be30_a3ca00254bc1.slice/crio-87b9e37acc0a89703fde4b4496a28867ff77893263ba26cdd328266664b47e11 WatchSource:0}: Error finding container 87b9e37acc0a89703fde4b4496a28867ff77893263ba26cdd328266664b47e11: Status 404 returned error can't find the container with id 87b9e37acc0a89703fde4b4496a28867ff77893263ba26cdd328266664b47e11 Feb 19 22:58:01 crc kubenswrapper[4771]: I0219 22:58:01.930729 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.856888 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"c27e9f00-5617-42ce-ab8b-16002c5ac59b","Type":"ContainerStarted","Data":"3b6e671b1ea6d22c57f49ba75015a5270f56ad533fca0e4825496f1ec325a01b"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.875415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7ebf8641-9285-4ae4-afc9-0cf3b1bc585e","Type":"ContainerStarted","Data":"862e8cf769d5dcd3f886bf1c997f7db590f8301161574ceca053005737dc9fa8"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.881026 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a","Type":"ContainerStarted","Data":"ae4d4e0e66e1bbd4954d398ba6512696dfa2db20650fc86daa79b8a2b58ea549"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.881121 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a","Type":"ContainerStarted","Data":"4fbe8cdecc6234e557b6490591fd8eeb2745bd0c9e6e1e0d4acb4c2a4935b0a4"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.881160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a","Type":"ContainerStarted","Data":"7edcfbf28e9b06681467d734bc914ac9efea05f7cc739b8addc5921d2ef19c3c"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.886326 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"e51891da-e934-4e4e-afb0-dd9f7bc4a6ef","Type":"ContainerStarted","Data":"66dde1289c173e354e6ee79cfa66612c624b75c9dc4616dd73c2a6b35043791e"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.894803 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"8a6bd6eb-9513-46d0-be30-a3ca00254bc1","Type":"ContainerStarted","Data":"0e569b6f7459605a7afbe6c2a912af8ec43a2a6753efe5f061e1c72b0460c2ac"} Feb 19 22:58:02 crc 
kubenswrapper[4771]: I0219 22:58:02.895242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"8a6bd6eb-9513-46d0-be30-a3ca00254bc1","Type":"ContainerStarted","Data":"aebfc1fc506fd13a5cca25d807a9d5f5c428e64fcfd1673062a205609006a92c"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.895310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"8a6bd6eb-9513-46d0-be30-a3ca00254bc1","Type":"ContainerStarted","Data":"87b9e37acc0a89703fde4b4496a28867ff77893263ba26cdd328266664b47e11"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.896034 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.895997966 podStartE2EDuration="4.895997966s" podCreationTimestamp="2026-02-19 22:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:02.888074345 +0000 UTC m=+5383.159516905" watchObservedRunningTime="2026-02-19 22:58:02.895997966 +0000 UTC m=+5383.167440476" Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.902789 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"be5a39cb-6ede-4b4e-9b99-239e7bbbf830","Type":"ContainerStarted","Data":"d092ce87139294a948f15b2c3ead8ed665452aabebd3432aa2203090469f760c"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.902858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"be5a39cb-6ede-4b4e-9b99-239e7bbbf830","Type":"ContainerStarted","Data":"79823bc0e38cb005ced4e6f8b3c5116f9156c1a37da97ac66831341a58cb5377"} Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.919784 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.91975967 podStartE2EDuration="5.91975967s" 
podCreationTimestamp="2026-02-19 22:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:02.915589129 +0000 UTC m=+5383.187031679" watchObservedRunningTime="2026-02-19 22:58:02.91975967 +0000 UTC m=+5383.191202180" Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.961073 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.961024782 podStartE2EDuration="3.961024782s" podCreationTimestamp="2026-02-19 22:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:02.944173612 +0000 UTC m=+5383.215616132" watchObservedRunningTime="2026-02-19 22:58:02.961024782 +0000 UTC m=+5383.232467282" Feb 19 22:58:02 crc kubenswrapper[4771]: I0219 22:58:02.981821 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.981796415 podStartE2EDuration="4.981796415s" podCreationTimestamp="2026-02-19 22:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:02.9755845 +0000 UTC m=+5383.247027010" watchObservedRunningTime="2026-02-19 22:58:02.981796415 +0000 UTC m=+5383.253238915" Feb 19 22:58:03 crc kubenswrapper[4771]: I0219 22:58:03.000652 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.000627038 podStartE2EDuration="4.000627038s" podCreationTimestamp="2026-02-19 22:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:03.000403472 +0000 UTC m=+5383.271846002" watchObservedRunningTime="2026-02-19 22:58:03.000627038 +0000 UTC m=+5383.272069538" Feb 19 
22:58:03 crc kubenswrapper[4771]: I0219 22:58:03.026066 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.026015716 podStartE2EDuration="4.026015716s" podCreationTimestamp="2026-02-19 22:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:03.019919853 +0000 UTC m=+5383.291362363" watchObservedRunningTime="2026-02-19 22:58:03.026015716 +0000 UTC m=+5383.297458226" Feb 19 22:58:04 crc kubenswrapper[4771]: I0219 22:58:04.100981 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:04 crc kubenswrapper[4771]: I0219 22:58:04.238341 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:04 crc kubenswrapper[4771]: I0219 22:58:04.239459 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:04 crc kubenswrapper[4771]: I0219 22:58:04.370725 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 22:58:04 crc kubenswrapper[4771]: I0219 22:58:04.682957 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 19 22:58:04 crc kubenswrapper[4771]: I0219 22:58:04.699673 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 19 22:58:05 crc kubenswrapper[4771]: I0219 22:58:05.371104 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 22:58:05 crc kubenswrapper[4771]: I0219 22:58:05.435096 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 22:58:05 crc kubenswrapper[4771]: I0219 22:58:05.683238 4771 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 19 22:58:05 crc kubenswrapper[4771]: I0219 22:58:05.700277 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 19 22:58:05 crc kubenswrapper[4771]: I0219 22:58:05.748262 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Feb 19 22:58:05 crc kubenswrapper[4771]: I0219 22:58:05.754405 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 19 22:58:06 crc kubenswrapper[4771]: I0219 22:58:06.100358 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:06 crc kubenswrapper[4771]: I0219 22:58:06.237410 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:06 crc kubenswrapper[4771]: I0219 22:58:06.239368 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.007135 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.008992 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.016089 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.178589 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.224130 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-599cd69c45-76khv"] Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.227214 4771 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.232181 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.246167 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.252169 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-599cd69c45-76khv"] Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.294499 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.295952 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.322092 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-dns-svc\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.322183 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kztvh\" (UniqueName: \"kubernetes.io/projected/df9800b9-8c14-49e3-b781-ff8aabaa695b-kube-api-access-kztvh\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.322219 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-config\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.322288 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-ovsdbserver-nb\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.337012 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.337888 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.424158 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-dns-svc\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.424270 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kztvh\" (UniqueName: \"kubernetes.io/projected/df9800b9-8c14-49e3-b781-ff8aabaa695b-kube-api-access-kztvh\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.424324 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-config\") pod 
\"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.424409 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-ovsdbserver-nb\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.425855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-dns-svc\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.426213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-ovsdbserver-nb\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.426646 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-config\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.457635 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kztvh\" (UniqueName: \"kubernetes.io/projected/df9800b9-8c14-49e3-b781-ff8aabaa695b-kube-api-access-kztvh\") pod \"dnsmasq-dns-599cd69c45-76khv\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " 
pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.557513 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.638312 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599cd69c45-76khv"] Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.668785 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b7c64ccf-rltm6"] Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.669944 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.672572 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.690851 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b7c64ccf-rltm6"] Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.842710 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-config\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.842745 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-nb\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.842807 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-sb\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.842849 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-dns-svc\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.842877 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q8w2\" (UniqueName: \"kubernetes.io/projected/6539f0e3-8b84-4309-bc81-289831a669c7-kube-api-access-2q8w2\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.944001 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-sb\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.944169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-dns-svc\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.944230 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2q8w2\" (UniqueName: \"kubernetes.io/projected/6539f0e3-8b84-4309-bc81-289831a669c7-kube-api-access-2q8w2\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.944342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-config\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.944371 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-nb\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.945537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-config\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.945573 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-sb\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.946149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-nb\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.946532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-dns-svc\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:07 crc kubenswrapper[4771]: I0219 22:58:07.962709 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q8w2\" (UniqueName: \"kubernetes.io/projected/6539f0e3-8b84-4309-bc81-289831a669c7-kube-api-access-2q8w2\") pod \"dnsmasq-dns-67b7c64ccf-rltm6\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") " pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.012843 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.070443 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599cd69c45-76khv"] Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.498646 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b7c64ccf-rltm6"] Feb 19 22:58:08 crc kubenswrapper[4771]: W0219 22:58:08.501658 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6539f0e3_8b84_4309_bc81_289831a669c7.slice/crio-a842236a0f6d4698e0e991c1054f8ea023a96dcdd332b7c7428b1962c9888055 WatchSource:0}: Error finding container a842236a0f6d4698e0e991c1054f8ea023a96dcdd332b7c7428b1962c9888055: Status 404 returned error can't find the container with id a842236a0f6d4698e0e991c1054f8ea023a96dcdd332b7c7428b1962c9888055 Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.965211 4771 generic.go:334] "Generic (PLEG): container finished" podID="df9800b9-8c14-49e3-b781-ff8aabaa695b" containerID="28e2db615747ba53d41d52ea9162aeaa796f4bab5ab27f1221dc724b10b80b58" exitCode=0 Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.965729 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599cd69c45-76khv" event={"ID":"df9800b9-8c14-49e3-b781-ff8aabaa695b","Type":"ContainerDied","Data":"28e2db615747ba53d41d52ea9162aeaa796f4bab5ab27f1221dc724b10b80b58"} Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.965773 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599cd69c45-76khv" event={"ID":"df9800b9-8c14-49e3-b781-ff8aabaa695b","Type":"ContainerStarted","Data":"fdec204f32c9cf31c9d6b31acee3b8b9cd8ad5e96534b8aa7fb84b185c3da12d"} Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.969635 4771 generic.go:334] "Generic (PLEG): container finished" podID="6539f0e3-8b84-4309-bc81-289831a669c7" 
containerID="3d408ada512836840a64b8a7aa22de182693e97e18fcca96b6d44ca0465c1829" exitCode=0 Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.969676 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" event={"ID":"6539f0e3-8b84-4309-bc81-289831a669c7","Type":"ContainerDied","Data":"3d408ada512836840a64b8a7aa22de182693e97e18fcca96b6d44ca0465c1829"} Feb 19 22:58:08 crc kubenswrapper[4771]: I0219 22:58:08.969705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" event={"ID":"6539f0e3-8b84-4309-bc81-289831a669c7","Type":"ContainerStarted","Data":"a842236a0f6d4698e0e991c1054f8ea023a96dcdd332b7c7428b1962c9888055"} Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.287508 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.380274 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kztvh\" (UniqueName: \"kubernetes.io/projected/df9800b9-8c14-49e3-b781-ff8aabaa695b-kube-api-access-kztvh\") pod \"df9800b9-8c14-49e3-b781-ff8aabaa695b\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.380509 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-dns-svc\") pod \"df9800b9-8c14-49e3-b781-ff8aabaa695b\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.380580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-config\") pod \"df9800b9-8c14-49e3-b781-ff8aabaa695b\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 
22:58:09.380630 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-ovsdbserver-nb\") pod \"df9800b9-8c14-49e3-b781-ff8aabaa695b\" (UID: \"df9800b9-8c14-49e3-b781-ff8aabaa695b\") " Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.384252 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9800b9-8c14-49e3-b781-ff8aabaa695b-kube-api-access-kztvh" (OuterVolumeSpecName: "kube-api-access-kztvh") pod "df9800b9-8c14-49e3-b781-ff8aabaa695b" (UID: "df9800b9-8c14-49e3-b781-ff8aabaa695b"). InnerVolumeSpecName "kube-api-access-kztvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.404375 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df9800b9-8c14-49e3-b781-ff8aabaa695b" (UID: "df9800b9-8c14-49e3-b781-ff8aabaa695b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.408801 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-config" (OuterVolumeSpecName: "config") pod "df9800b9-8c14-49e3-b781-ff8aabaa695b" (UID: "df9800b9-8c14-49e3-b781-ff8aabaa695b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.416185 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df9800b9-8c14-49e3-b781-ff8aabaa695b" (UID: "df9800b9-8c14-49e3-b781-ff8aabaa695b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.482342 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.482382 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.482395 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9800b9-8c14-49e3-b781-ff8aabaa695b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.482410 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kztvh\" (UniqueName: \"kubernetes.io/projected/df9800b9-8c14-49e3-b781-ff8aabaa695b-kube-api-access-kztvh\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.976541 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" event={"ID":"6539f0e3-8b84-4309-bc81-289831a669c7","Type":"ContainerStarted","Data":"633c83fac7433f2ea1100a6b7b4b2e5bfc437c8f1ada1ef96e319423d17f483e"} Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.977614 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.979414 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599cd69c45-76khv" event={"ID":"df9800b9-8c14-49e3-b781-ff8aabaa695b","Type":"ContainerDied","Data":"fdec204f32c9cf31c9d6b31acee3b8b9cd8ad5e96534b8aa7fb84b185c3da12d"} Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.979451 4771 scope.go:117] 
"RemoveContainer" containerID="28e2db615747ba53d41d52ea9162aeaa796f4bab5ab27f1221dc724b10b80b58" Feb 19 22:58:09 crc kubenswrapper[4771]: I0219 22:58:09.979468 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599cd69c45-76khv" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.001092 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" podStartSLOduration=3.001075212 podStartE2EDuration="3.001075212s" podCreationTimestamp="2026-02-19 22:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:09.994760652 +0000 UTC m=+5390.266203142" watchObservedRunningTime="2026-02-19 22:58:10.001075212 +0000 UTC m=+5390.272517692" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.096045 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599cd69c45-76khv"] Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.105944 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-599cd69c45-76khv"] Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.323793 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Feb 19 22:58:10 crc kubenswrapper[4771]: E0219 22:58:10.324179 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9800b9-8c14-49e3-b781-ff8aabaa695b" containerName="init" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.324193 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9800b9-8c14-49e3-b781-ff8aabaa695b" containerName="init" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.324340 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9800b9-8c14-49e3-b781-ff8aabaa695b" containerName="init" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.324873 4771 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.327004 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.334527 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.402651 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0855ef56-7675-40fe-81bb-26a511a7d0ff-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.403051 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.403110 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg769\" (UniqueName: \"kubernetes.io/projected/0855ef56-7675-40fe-81bb-26a511a7d0ff-kube-api-access-tg769\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.456334 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9800b9-8c14-49e3-b781-ff8aabaa695b" path="/var/lib/kubelet/pods/df9800b9-8c14-49e3-b781-ff8aabaa695b/volumes" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.503848 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: 
\"kubernetes.io/secret/0855ef56-7675-40fe-81bb-26a511a7d0ff-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.503939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.503993 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg769\" (UniqueName: \"kubernetes.io/projected/0855ef56-7675-40fe-81bb-26a511a7d0ff-kube-api-access-tg769\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.507654 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.507694 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0ec278c4c615d335bbfd61427f92c42c90044ea07c71dcd0801710a8660416a0/globalmount\"" pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.861243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0855ef56-7675-40fe-81bb-26a511a7d0ff-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:10 crc kubenswrapper[4771]: I0219 22:58:10.861640 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg769\" (UniqueName: \"kubernetes.io/projected/0855ef56-7675-40fe-81bb-26a511a7d0ff-kube-api-access-tg769\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:11 crc kubenswrapper[4771]: I0219 22:58:11.009644 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\") pod \"ovn-copy-data\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " pod="openstack/ovn-copy-data" Feb 19 22:58:11 crc kubenswrapper[4771]: I0219 22:58:11.251931 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 19 22:58:11 crc kubenswrapper[4771]: I0219 22:58:11.986358 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 22:58:11 crc kubenswrapper[4771]: W0219 22:58:11.989993 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0855ef56_7675_40fe_81bb_26a511a7d0ff.slice/crio-d1563782ac8daa1ecc2c21a36635d14618205e3cdafa0243b80c822b5ca5327d WatchSource:0}: Error finding container d1563782ac8daa1ecc2c21a36635d14618205e3cdafa0243b80c822b5ca5327d: Status 404 returned error can't find the container with id d1563782ac8daa1ecc2c21a36635d14618205e3cdafa0243b80c822b5ca5327d Feb 19 22:58:11 crc kubenswrapper[4771]: I0219 22:58:11.992807 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:58:12 crc kubenswrapper[4771]: I0219 22:58:12.001452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"0855ef56-7675-40fe-81bb-26a511a7d0ff","Type":"ContainerStarted","Data":"d1563782ac8daa1ecc2c21a36635d14618205e3cdafa0243b80c822b5ca5327d"} Feb 19 22:58:14 crc kubenswrapper[4771]: I0219 22:58:14.026739 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"0855ef56-7675-40fe-81bb-26a511a7d0ff","Type":"ContainerStarted","Data":"aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333"} Feb 19 22:58:14 crc kubenswrapper[4771]: I0219 22:58:14.048947 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=4.550780704 podStartE2EDuration="5.048921058s" podCreationTimestamp="2026-02-19 22:58:09 +0000 UTC" firstStartedPulling="2026-02-19 22:58:11.992484287 +0000 UTC m=+5392.263926757" lastFinishedPulling="2026-02-19 22:58:12.490624631 +0000 UTC m=+5392.762067111" observedRunningTime="2026-02-19 
22:58:14.047802828 +0000 UTC m=+5394.319245328" watchObservedRunningTime="2026-02-19 22:58:14.048921058 +0000 UTC m=+5394.320363558" Feb 19 22:58:16 crc kubenswrapper[4771]: I0219 22:58:16.439251 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:58:16 crc kubenswrapper[4771]: E0219 22:58:16.440144 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.014995 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.096255 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-jnj2q"] Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.096600 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" podUID="e8640339-a417-44c2-9fda-cfd1f6838baa" containerName="dnsmasq-dns" containerID="cri-o://7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f" gracePeriod=10 Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.588421 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.705578 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm4xp\" (UniqueName: \"kubernetes.io/projected/e8640339-a417-44c2-9fda-cfd1f6838baa-kube-api-access-gm4xp\") pod \"e8640339-a417-44c2-9fda-cfd1f6838baa\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.705656 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-dns-svc\") pod \"e8640339-a417-44c2-9fda-cfd1f6838baa\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.705816 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-config\") pod \"e8640339-a417-44c2-9fda-cfd1f6838baa\" (UID: \"e8640339-a417-44c2-9fda-cfd1f6838baa\") " Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.714351 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8640339-a417-44c2-9fda-cfd1f6838baa-kube-api-access-gm4xp" (OuterVolumeSpecName: "kube-api-access-gm4xp") pod "e8640339-a417-44c2-9fda-cfd1f6838baa" (UID: "e8640339-a417-44c2-9fda-cfd1f6838baa"). InnerVolumeSpecName "kube-api-access-gm4xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.746212 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8640339-a417-44c2-9fda-cfd1f6838baa" (UID: "e8640339-a417-44c2-9fda-cfd1f6838baa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.765580 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-config" (OuterVolumeSpecName: "config") pod "e8640339-a417-44c2-9fda-cfd1f6838baa" (UID: "e8640339-a417-44c2-9fda-cfd1f6838baa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.808538 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.808603 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm4xp\" (UniqueName: \"kubernetes.io/projected/e8640339-a417-44c2-9fda-cfd1f6838baa-kube-api-access-gm4xp\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:18 crc kubenswrapper[4771]: I0219 22:58:18.808619 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8640339-a417-44c2-9fda-cfd1f6838baa-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.100652 4771 generic.go:334] "Generic (PLEG): container finished" podID="e8640339-a417-44c2-9fda-cfd1f6838baa" containerID="7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f" exitCode=0 Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.100694 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" event={"ID":"e8640339-a417-44c2-9fda-cfd1f6838baa","Type":"ContainerDied","Data":"7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f"} Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.100719 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" 
event={"ID":"e8640339-a417-44c2-9fda-cfd1f6838baa","Type":"ContainerDied","Data":"01c63eb87e6bf2585723f49e95058fd45d57be5c9350df0f4c506d3d38ba53c2"} Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.100737 4771 scope.go:117] "RemoveContainer" containerID="7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.100777 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-jnj2q" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.127580 4771 scope.go:117] "RemoveContainer" containerID="1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.156323 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-jnj2q"] Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.165464 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-jnj2q"] Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.175070 4771 scope.go:117] "RemoveContainer" containerID="7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f" Feb 19 22:58:19 crc kubenswrapper[4771]: E0219 22:58:19.175653 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f\": container with ID starting with 7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f not found: ID does not exist" containerID="7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.175687 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f"} err="failed to get container status 
\"7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f\": rpc error: code = NotFound desc = could not find container \"7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f\": container with ID starting with 7c76a0749980de063af671dfb8a8fe82e1450832ee02c4b03e1e19bfa943325f not found: ID does not exist" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.175713 4771 scope.go:117] "RemoveContainer" containerID="1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d" Feb 19 22:58:19 crc kubenswrapper[4771]: E0219 22:58:19.176247 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d\": container with ID starting with 1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d not found: ID does not exist" containerID="1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.176267 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d"} err="failed to get container status \"1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d\": rpc error: code = NotFound desc = could not find container \"1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d\": container with ID starting with 1caf6997ae28b3e9fce8e9e4411396e2536a2219368c0fe40b7832169e34db0d not found: ID does not exist" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.857344 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 22:58:19 crc kubenswrapper[4771]: E0219 22:58:19.863382 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8640339-a417-44c2-9fda-cfd1f6838baa" containerName="dnsmasq-dns" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.863416 4771 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e8640339-a417-44c2-9fda-cfd1f6838baa" containerName="dnsmasq-dns" Feb 19 22:58:19 crc kubenswrapper[4771]: E0219 22:58:19.863446 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8640339-a417-44c2-9fda-cfd1f6838baa" containerName="init" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.863452 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8640339-a417-44c2-9fda-cfd1f6838baa" containerName="init" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.869907 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8640339-a417-44c2-9fda-cfd1f6838baa" containerName="dnsmasq-dns" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.870732 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.876783 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.877151 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-x5b22" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.877355 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.877519 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.883283 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.924867 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqmhj\" (UniqueName: \"kubernetes.io/projected/c41e2e76-3b47-431c-a559-aaff440321be-kube-api-access-sqmhj\") pod \"ovn-northd-0\" (UID: 
\"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.924915 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c41e2e76-3b47-431c-a559-aaff440321be-scripts\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.924935 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c41e2e76-3b47-431c-a559-aaff440321be-config\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.924955 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.925012 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 22:58:19.925069 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c41e2e76-3b47-431c-a559-aaff440321be-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:19 crc kubenswrapper[4771]: I0219 
22:58:19.925115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.026256 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.026603 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c41e2e76-3b47-431c-a559-aaff440321be-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.026635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.026695 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqmhj\" (UniqueName: \"kubernetes.io/projected/c41e2e76-3b47-431c-a559-aaff440321be-kube-api-access-sqmhj\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.026720 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c41e2e76-3b47-431c-a559-aaff440321be-scripts\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.026740 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c41e2e76-3b47-431c-a559-aaff440321be-config\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.026759 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.027299 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c41e2e76-3b47-431c-a559-aaff440321be-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.028042 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c41e2e76-3b47-431c-a559-aaff440321be-config\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.028049 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c41e2e76-3b47-431c-a559-aaff440321be-scripts\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.030696 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.032097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.040122 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c41e2e76-3b47-431c-a559-aaff440321be-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.047704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqmhj\" (UniqueName: \"kubernetes.io/projected/c41e2e76-3b47-431c-a559-aaff440321be-kube-api-access-sqmhj\") pod \"ovn-northd-0\" (UID: \"c41e2e76-3b47-431c-a559-aaff440321be\") " pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.200250 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.453928 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8640339-a417-44c2-9fda-cfd1f6838baa" path="/var/lib/kubelet/pods/e8640339-a417-44c2-9fda-cfd1f6838baa/volumes" Feb 19 22:58:20 crc kubenswrapper[4771]: I0219 22:58:20.678001 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 22:58:21 crc kubenswrapper[4771]: I0219 22:58:21.117874 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c41e2e76-3b47-431c-a559-aaff440321be","Type":"ContainerStarted","Data":"5bd68532fe081537ddac4b1ca1480c14adf3a00f9e01660e3f78150e788e2932"} Feb 19 22:58:21 crc kubenswrapper[4771]: I0219 22:58:21.118194 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 22:58:21 crc kubenswrapper[4771]: I0219 22:58:21.118208 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c41e2e76-3b47-431c-a559-aaff440321be","Type":"ContainerStarted","Data":"fd415b4981102012659cdaa9acd9e6489eecf94b997463d1136eed198b9600d7"} Feb 19 22:58:21 crc kubenswrapper[4771]: I0219 22:58:21.118221 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c41e2e76-3b47-431c-a559-aaff440321be","Type":"ContainerStarted","Data":"bfd6c22c3d3884b4e9ef93a1c0b0e5b5cf598e04781a3dd69eba017ff984ccb0"} Feb 19 22:58:21 crc kubenswrapper[4771]: I0219 22:58:21.142575 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.142559368 podStartE2EDuration="2.142559368s" podCreationTimestamp="2026-02-19 22:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:21.1422547 +0000 UTC m=+5401.413697210" 
watchObservedRunningTime="2026-02-19 22:58:21.142559368 +0000 UTC m=+5401.414001838" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.491881 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dfx8t"] Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.493463 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.503524 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dfx8t"] Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.530214 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7d2c11-273d-4266-8f85-f4dec666b9b6-operator-scripts\") pod \"keystone-db-create-dfx8t\" (UID: \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\") " pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.530651 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4lvj\" (UniqueName: \"kubernetes.io/projected/6c7d2c11-273d-4266-8f85-f4dec666b9b6-kube-api-access-v4lvj\") pod \"keystone-db-create-dfx8t\" (UID: \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\") " pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.608915 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5f8b-account-create-update-d6cch"] Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.610443 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.612771 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.617831 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f8b-account-create-update-d6cch"] Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.632280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmhjc\" (UniqueName: \"kubernetes.io/projected/db7eca99-3572-4f98-a1ed-d06666e1b77a-kube-api-access-qmhjc\") pod \"keystone-5f8b-account-create-update-d6cch\" (UID: \"db7eca99-3572-4f98-a1ed-d06666e1b77a\") " pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.632343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4lvj\" (UniqueName: \"kubernetes.io/projected/6c7d2c11-273d-4266-8f85-f4dec666b9b6-kube-api-access-v4lvj\") pod \"keystone-db-create-dfx8t\" (UID: \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\") " pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.632456 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7d2c11-273d-4266-8f85-f4dec666b9b6-operator-scripts\") pod \"keystone-db-create-dfx8t\" (UID: \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\") " pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.632526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7eca99-3572-4f98-a1ed-d06666e1b77a-operator-scripts\") pod \"keystone-5f8b-account-create-update-d6cch\" (UID: 
\"db7eca99-3572-4f98-a1ed-d06666e1b77a\") " pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.633484 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7d2c11-273d-4266-8f85-f4dec666b9b6-operator-scripts\") pod \"keystone-db-create-dfx8t\" (UID: \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\") " pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.657982 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4lvj\" (UniqueName: \"kubernetes.io/projected/6c7d2c11-273d-4266-8f85-f4dec666b9b6-kube-api-access-v4lvj\") pod \"keystone-db-create-dfx8t\" (UID: \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\") " pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.733208 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7eca99-3572-4f98-a1ed-d06666e1b77a-operator-scripts\") pod \"keystone-5f8b-account-create-update-d6cch\" (UID: \"db7eca99-3572-4f98-a1ed-d06666e1b77a\") " pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.733313 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmhjc\" (UniqueName: \"kubernetes.io/projected/db7eca99-3572-4f98-a1ed-d06666e1b77a-kube-api-access-qmhjc\") pod \"keystone-5f8b-account-create-update-d6cch\" (UID: \"db7eca99-3572-4f98-a1ed-d06666e1b77a\") " pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.734250 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7eca99-3572-4f98-a1ed-d06666e1b77a-operator-scripts\") pod 
\"keystone-5f8b-account-create-update-d6cch\" (UID: \"db7eca99-3572-4f98-a1ed-d06666e1b77a\") " pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.750107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmhjc\" (UniqueName: \"kubernetes.io/projected/db7eca99-3572-4f98-a1ed-d06666e1b77a-kube-api-access-qmhjc\") pod \"keystone-5f8b-account-create-update-d6cch\" (UID: \"db7eca99-3572-4f98-a1ed-d06666e1b77a\") " pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.867718 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:25 crc kubenswrapper[4771]: I0219 22:58:25.930083 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:26 crc kubenswrapper[4771]: I0219 22:58:26.339846 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dfx8t"] Feb 19 22:58:26 crc kubenswrapper[4771]: W0219 22:58:26.351779 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c7d2c11_273d_4266_8f85_f4dec666b9b6.slice/crio-04eea46e1dee555029b077d064e7e0b36828344cfff3f3fd7eedf6296adf7938 WatchSource:0}: Error finding container 04eea46e1dee555029b077d064e7e0b36828344cfff3f3fd7eedf6296adf7938: Status 404 returned error can't find the container with id 04eea46e1dee555029b077d064e7e0b36828344cfff3f3fd7eedf6296adf7938 Feb 19 22:58:26 crc kubenswrapper[4771]: I0219 22:58:26.434815 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f8b-account-create-update-d6cch"] Feb 19 22:58:26 crc kubenswrapper[4771]: W0219 22:58:26.435149 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb7eca99_3572_4f98_a1ed_d06666e1b77a.slice/crio-e31fedf44ea5bcbd11bacca3c95e1b700de8b3c29cc0120ec4bb83b6892c9780 WatchSource:0}: Error finding container e31fedf44ea5bcbd11bacca3c95e1b700de8b3c29cc0120ec4bb83b6892c9780: Status 404 returned error can't find the container with id e31fedf44ea5bcbd11bacca3c95e1b700de8b3c29cc0120ec4bb83b6892c9780 Feb 19 22:58:27 crc kubenswrapper[4771]: I0219 22:58:27.180732 4771 generic.go:334] "Generic (PLEG): container finished" podID="db7eca99-3572-4f98-a1ed-d06666e1b77a" containerID="ad53b8b8a68c591c493fcd858b6b131fab7cad18e7607df8c53f89e08f686ba9" exitCode=0 Feb 19 22:58:27 crc kubenswrapper[4771]: I0219 22:58:27.180907 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f8b-account-create-update-d6cch" event={"ID":"db7eca99-3572-4f98-a1ed-d06666e1b77a","Type":"ContainerDied","Data":"ad53b8b8a68c591c493fcd858b6b131fab7cad18e7607df8c53f89e08f686ba9"} Feb 19 22:58:27 crc kubenswrapper[4771]: I0219 22:58:27.181211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f8b-account-create-update-d6cch" event={"ID":"db7eca99-3572-4f98-a1ed-d06666e1b77a","Type":"ContainerStarted","Data":"e31fedf44ea5bcbd11bacca3c95e1b700de8b3c29cc0120ec4bb83b6892c9780"} Feb 19 22:58:27 crc kubenswrapper[4771]: I0219 22:58:27.184327 4771 generic.go:334] "Generic (PLEG): container finished" podID="6c7d2c11-273d-4266-8f85-f4dec666b9b6" containerID="fdba08747089b717ccfe547f67abe469c1570a8c25cd24e10608444785edee00" exitCode=0 Feb 19 22:58:27 crc kubenswrapper[4771]: I0219 22:58:27.184383 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dfx8t" event={"ID":"6c7d2c11-273d-4266-8f85-f4dec666b9b6","Type":"ContainerDied","Data":"fdba08747089b717ccfe547f67abe469c1570a8c25cd24e10608444785edee00"} Feb 19 22:58:27 crc kubenswrapper[4771]: I0219 22:58:27.184423 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-db-create-dfx8t" event={"ID":"6c7d2c11-273d-4266-8f85-f4dec666b9b6","Type":"ContainerStarted","Data":"04eea46e1dee555029b077d064e7e0b36828344cfff3f3fd7eedf6296adf7938"} Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.438590 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:58:28 crc kubenswrapper[4771]: E0219 22:58:28.438880 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.800578 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.803897 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.897928 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4lvj\" (UniqueName: \"kubernetes.io/projected/6c7d2c11-273d-4266-8f85-f4dec666b9b6-kube-api-access-v4lvj\") pod \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\" (UID: \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\") " Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.898089 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmhjc\" (UniqueName: \"kubernetes.io/projected/db7eca99-3572-4f98-a1ed-d06666e1b77a-kube-api-access-qmhjc\") pod \"db7eca99-3572-4f98-a1ed-d06666e1b77a\" (UID: \"db7eca99-3572-4f98-a1ed-d06666e1b77a\") " Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.898136 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7eca99-3572-4f98-a1ed-d06666e1b77a-operator-scripts\") pod \"db7eca99-3572-4f98-a1ed-d06666e1b77a\" (UID: \"db7eca99-3572-4f98-a1ed-d06666e1b77a\") " Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.898241 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7d2c11-273d-4266-8f85-f4dec666b9b6-operator-scripts\") pod \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\" (UID: \"6c7d2c11-273d-4266-8f85-f4dec666b9b6\") " Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.899223 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c7d2c11-273d-4266-8f85-f4dec666b9b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c7d2c11-273d-4266-8f85-f4dec666b9b6" (UID: "6c7d2c11-273d-4266-8f85-f4dec666b9b6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.899241 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db7eca99-3572-4f98-a1ed-d06666e1b77a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db7eca99-3572-4f98-a1ed-d06666e1b77a" (UID: "db7eca99-3572-4f98-a1ed-d06666e1b77a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.904150 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c7d2c11-273d-4266-8f85-f4dec666b9b6-kube-api-access-v4lvj" (OuterVolumeSpecName: "kube-api-access-v4lvj") pod "6c7d2c11-273d-4266-8f85-f4dec666b9b6" (UID: "6c7d2c11-273d-4266-8f85-f4dec666b9b6"). InnerVolumeSpecName "kube-api-access-v4lvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:28 crc kubenswrapper[4771]: I0219 22:58:28.905209 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7eca99-3572-4f98-a1ed-d06666e1b77a-kube-api-access-qmhjc" (OuterVolumeSpecName: "kube-api-access-qmhjc") pod "db7eca99-3572-4f98-a1ed-d06666e1b77a" (UID: "db7eca99-3572-4f98-a1ed-d06666e1b77a"). InnerVolumeSpecName "kube-api-access-qmhjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.000854 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c7d2c11-273d-4266-8f85-f4dec666b9b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.000953 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4lvj\" (UniqueName: \"kubernetes.io/projected/6c7d2c11-273d-4266-8f85-f4dec666b9b6-kube-api-access-v4lvj\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.000981 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmhjc\" (UniqueName: \"kubernetes.io/projected/db7eca99-3572-4f98-a1ed-d06666e1b77a-kube-api-access-qmhjc\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.001008 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7eca99-3572-4f98-a1ed-d06666e1b77a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.209218 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dfx8t" event={"ID":"6c7d2c11-273d-4266-8f85-f4dec666b9b6","Type":"ContainerDied","Data":"04eea46e1dee555029b077d064e7e0b36828344cfff3f3fd7eedf6296adf7938"} Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.209273 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04eea46e1dee555029b077d064e7e0b36828344cfff3f3fd7eedf6296adf7938" Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.209232 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dfx8t" Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.211553 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f8b-account-create-update-d6cch" event={"ID":"db7eca99-3572-4f98-a1ed-d06666e1b77a","Type":"ContainerDied","Data":"e31fedf44ea5bcbd11bacca3c95e1b700de8b3c29cc0120ec4bb83b6892c9780"} Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.211600 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e31fedf44ea5bcbd11bacca3c95e1b700de8b3c29cc0120ec4bb83b6892c9780" Feb 19 22:58:29 crc kubenswrapper[4771]: I0219 22:58:29.211639 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f8b-account-create-update-d6cch" Feb 19 22:58:30 crc kubenswrapper[4771]: I0219 22:58:30.305775 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.026828 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-k7869"] Feb 19 22:58:31 crc kubenswrapper[4771]: E0219 22:58:31.027422 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7eca99-3572-4f98-a1ed-d06666e1b77a" containerName="mariadb-account-create-update" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.027436 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7eca99-3572-4f98-a1ed-d06666e1b77a" containerName="mariadb-account-create-update" Feb 19 22:58:31 crc kubenswrapper[4771]: E0219 22:58:31.027466 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7d2c11-273d-4266-8f85-f4dec666b9b6" containerName="mariadb-database-create" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.027475 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7d2c11-273d-4266-8f85-f4dec666b9b6" containerName="mariadb-database-create" Feb 19 22:58:31 crc 
kubenswrapper[4771]: I0219 22:58:31.027669 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c7d2c11-273d-4266-8f85-f4dec666b9b6" containerName="mariadb-database-create" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.027696 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7eca99-3572-4f98-a1ed-d06666e1b77a" containerName="mariadb-account-create-update" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.028363 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.029946 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.031222 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4kldh" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.031574 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.032955 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.049438 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-k7869"] Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.141785 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-combined-ca-bundle\") pod \"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.141851 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-config-data\") pod \"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.142213 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6btps\" (UniqueName: \"kubernetes.io/projected/504b9144-58a8-4842-b5bf-12d4d4498a38-kube-api-access-6btps\") pod \"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.243439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6btps\" (UniqueName: \"kubernetes.io/projected/504b9144-58a8-4842-b5bf-12d4d4498a38-kube-api-access-6btps\") pod \"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.243547 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-combined-ca-bundle\") pod \"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.243602 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-config-data\") pod \"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.248051 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-config-data\") pod 
\"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.251118 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-combined-ca-bundle\") pod \"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.273560 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6btps\" (UniqueName: \"kubernetes.io/projected/504b9144-58a8-4842-b5bf-12d4d4498a38-kube-api-access-6btps\") pod \"keystone-db-sync-k7869\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.346097 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:31 crc kubenswrapper[4771]: I0219 22:58:31.870700 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-k7869"] Feb 19 22:58:31 crc kubenswrapper[4771]: W0219 22:58:31.876221 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod504b9144_58a8_4842_b5bf_12d4d4498a38.slice/crio-55658e87f8e1223335c53136553510210bf2d36238dbe9d2cbb46ad94e32f2eb WatchSource:0}: Error finding container 55658e87f8e1223335c53136553510210bf2d36238dbe9d2cbb46ad94e32f2eb: Status 404 returned error can't find the container with id 55658e87f8e1223335c53136553510210bf2d36238dbe9d2cbb46ad94e32f2eb Feb 19 22:58:32 crc kubenswrapper[4771]: I0219 22:58:32.271378 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k7869" 
event={"ID":"504b9144-58a8-4842-b5bf-12d4d4498a38","Type":"ContainerStarted","Data":"125b77871cc050b4732c34ff193c83b80609e19aa429e65486e2c108c80311e4"} Feb 19 22:58:32 crc kubenswrapper[4771]: I0219 22:58:32.271624 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k7869" event={"ID":"504b9144-58a8-4842-b5bf-12d4d4498a38","Type":"ContainerStarted","Data":"55658e87f8e1223335c53136553510210bf2d36238dbe9d2cbb46ad94e32f2eb"} Feb 19 22:58:32 crc kubenswrapper[4771]: I0219 22:58:32.305306 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-k7869" podStartSLOduration=1.3052863719999999 podStartE2EDuration="1.305286372s" podCreationTimestamp="2026-02-19 22:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:32.296890698 +0000 UTC m=+5412.568333258" watchObservedRunningTime="2026-02-19 22:58:32.305286372 +0000 UTC m=+5412.576728842" Feb 19 22:58:34 crc kubenswrapper[4771]: I0219 22:58:34.292799 4771 generic.go:334] "Generic (PLEG): container finished" podID="504b9144-58a8-4842-b5bf-12d4d4498a38" containerID="125b77871cc050b4732c34ff193c83b80609e19aa429e65486e2c108c80311e4" exitCode=0 Feb 19 22:58:34 crc kubenswrapper[4771]: I0219 22:58:34.292853 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k7869" event={"ID":"504b9144-58a8-4842-b5bf-12d4d4498a38","Type":"ContainerDied","Data":"125b77871cc050b4732c34ff193c83b80609e19aa429e65486e2c108c80311e4"} Feb 19 22:58:35 crc kubenswrapper[4771]: I0219 22:58:35.037555 4771 scope.go:117] "RemoveContainer" containerID="6eeaa439c42cdaf5e51a297908c004b04a8444bb1ee9e325924a23252146631d" Feb 19 22:58:35 crc kubenswrapper[4771]: I0219 22:58:35.810872 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:35 crc kubenswrapper[4771]: I0219 22:58:35.939498 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-combined-ca-bundle\") pod \"504b9144-58a8-4842-b5bf-12d4d4498a38\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " Feb 19 22:58:35 crc kubenswrapper[4771]: I0219 22:58:35.939559 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6btps\" (UniqueName: \"kubernetes.io/projected/504b9144-58a8-4842-b5bf-12d4d4498a38-kube-api-access-6btps\") pod \"504b9144-58a8-4842-b5bf-12d4d4498a38\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " Feb 19 22:58:35 crc kubenswrapper[4771]: I0219 22:58:35.939632 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-config-data\") pod \"504b9144-58a8-4842-b5bf-12d4d4498a38\" (UID: \"504b9144-58a8-4842-b5bf-12d4d4498a38\") " Feb 19 22:58:35 crc kubenswrapper[4771]: I0219 22:58:35.949392 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/504b9144-58a8-4842-b5bf-12d4d4498a38-kube-api-access-6btps" (OuterVolumeSpecName: "kube-api-access-6btps") pod "504b9144-58a8-4842-b5bf-12d4d4498a38" (UID: "504b9144-58a8-4842-b5bf-12d4d4498a38"). InnerVolumeSpecName "kube-api-access-6btps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:35 crc kubenswrapper[4771]: I0219 22:58:35.981078 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "504b9144-58a8-4842-b5bf-12d4d4498a38" (UID: "504b9144-58a8-4842-b5bf-12d4d4498a38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.010105 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-config-data" (OuterVolumeSpecName: "config-data") pod "504b9144-58a8-4842-b5bf-12d4d4498a38" (UID: "504b9144-58a8-4842-b5bf-12d4d4498a38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.043129 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6btps\" (UniqueName: \"kubernetes.io/projected/504b9144-58a8-4842-b5bf-12d4d4498a38-kube-api-access-6btps\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.043173 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.043194 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/504b9144-58a8-4842-b5bf-12d4d4498a38-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.315868 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k7869" event={"ID":"504b9144-58a8-4842-b5bf-12d4d4498a38","Type":"ContainerDied","Data":"55658e87f8e1223335c53136553510210bf2d36238dbe9d2cbb46ad94e32f2eb"} Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.315934 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-k7869" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.315959 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55658e87f8e1223335c53136553510210bf2d36238dbe9d2cbb46ad94e32f2eb" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.605040 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j885f"] Feb 19 22:58:36 crc kubenswrapper[4771]: E0219 22:58:36.605703 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504b9144-58a8-4842-b5bf-12d4d4498a38" containerName="keystone-db-sync" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.605848 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="504b9144-58a8-4842-b5bf-12d4d4498a38" containerName="keystone-db-sync" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.606297 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="504b9144-58a8-4842-b5bf-12d4d4498a38" containerName="keystone-db-sync" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.607369 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.612148 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.612223 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.612437 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4kldh" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.612541 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.613048 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.614119 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84df6f8cd9-pdflf"] Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.615354 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.628877 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84df6f8cd9-pdflf"] Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.641791 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j885f"] Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.755548 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-config\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.755640 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-dns-svc\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.755690 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm9lj\" (UniqueName: \"kubernetes.io/projected/b7a85e66-8f7a-416d-b0df-77c03113f6d1-kube-api-access-tm9lj\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.755733 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-nb\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " 
pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.755773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4k8j\" (UniqueName: \"kubernetes.io/projected/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-kube-api-access-f4k8j\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.755842 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-config-data\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.755869 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-credential-keys\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.755977 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-scripts\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.756044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-fernet-keys\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " 
pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.756071 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-sb\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.756097 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-combined-ca-bundle\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-config-data\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857251 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-credential-keys\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-scripts\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f" Feb 19 22:58:36 crc 
kubenswrapper[4771]: I0219 22:58:36.857304 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-fernet-keys\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857324 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-sb\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857340 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-combined-ca-bundle\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-config\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857394 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-dns-svc\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857423 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm9lj\" (UniqueName: \"kubernetes.io/projected/b7a85e66-8f7a-416d-b0df-77c03113f6d1-kube-api-access-tm9lj\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857450 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-nb\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.857470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4k8j\" (UniqueName: \"kubernetes.io/projected/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-kube-api-access-f4k8j\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.858890 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-dns-svc\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.859238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-sb\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.860139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-config\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.861305 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-nb\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.862879 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-scripts\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.863038 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-config-data\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.863100 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-combined-ca-bundle\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.863333 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-fernet-keys\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.864808 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-credential-keys\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.886560 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm9lj\" (UniqueName: \"kubernetes.io/projected/b7a85e66-8f7a-416d-b0df-77c03113f6d1-kube-api-access-tm9lj\") pod \"keystone-bootstrap-j885f\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") " pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.893714 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4k8j\" (UniqueName: \"kubernetes.io/projected/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-kube-api-access-f4k8j\") pod \"dnsmasq-dns-84df6f8cd9-pdflf\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.923866 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:36 crc kubenswrapper[4771]: I0219 22:58:36.935045 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:37 crc kubenswrapper[4771]: I0219 22:58:37.363364 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j885f"]
Feb 19 22:58:37 crc kubenswrapper[4771]: W0219 22:58:37.363625 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7a85e66_8f7a_416d_b0df_77c03113f6d1.slice/crio-218a604e32de7a64a92900c6554d07aae0af72392af6bd54c377dcc551c02667 WatchSource:0}: Error finding container 218a604e32de7a64a92900c6554d07aae0af72392af6bd54c377dcc551c02667: Status 404 returned error can't find the container with id 218a604e32de7a64a92900c6554d07aae0af72392af6bd54c377dcc551c02667
Feb 19 22:58:37 crc kubenswrapper[4771]: I0219 22:58:37.424222 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84df6f8cd9-pdflf"]
Feb 19 22:58:37 crc kubenswrapper[4771]: W0219 22:58:37.438364 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaba2c23b_cb37_462b_8fc6_f4c1e8bf641f.slice/crio-fc21337c6a4d9ba76fd865d419718f85a95190895dfe0d3dfea2951312ea1c70 WatchSource:0}: Error finding container fc21337c6a4d9ba76fd865d419718f85a95190895dfe0d3dfea2951312ea1c70: Status 404 returned error can't find the container with id fc21337c6a4d9ba76fd865d419718f85a95190895dfe0d3dfea2951312ea1c70
Feb 19 22:58:38 crc kubenswrapper[4771]: I0219 22:58:38.342740 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j885f" event={"ID":"b7a85e66-8f7a-416d-b0df-77c03113f6d1","Type":"ContainerStarted","Data":"321262661da65fb93b64589983526499089a587b4459fb2f5f71dbb0115041aa"}
Feb 19 22:58:38 crc kubenswrapper[4771]: I0219 22:58:38.343108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j885f" event={"ID":"b7a85e66-8f7a-416d-b0df-77c03113f6d1","Type":"ContainerStarted","Data":"218a604e32de7a64a92900c6554d07aae0af72392af6bd54c377dcc551c02667"}
Feb 19 22:58:38 crc kubenswrapper[4771]: I0219 22:58:38.353099 4771 generic.go:334] "Generic (PLEG): container finished" podID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" containerID="da57fc2085cceac58f5a34f860af9da9f9f436a79090eeacde5e33d545d0e8ff" exitCode=0
Feb 19 22:58:38 crc kubenswrapper[4771]: I0219 22:58:38.353161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" event={"ID":"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f","Type":"ContainerDied","Data":"da57fc2085cceac58f5a34f860af9da9f9f436a79090eeacde5e33d545d0e8ff"}
Feb 19 22:58:38 crc kubenswrapper[4771]: I0219 22:58:38.353190 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" event={"ID":"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f","Type":"ContainerStarted","Data":"fc21337c6a4d9ba76fd865d419718f85a95190895dfe0d3dfea2951312ea1c70"}
Feb 19 22:58:38 crc kubenswrapper[4771]: I0219 22:58:38.368622 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j885f" podStartSLOduration=2.368595165 podStartE2EDuration="2.368595165s" podCreationTimestamp="2026-02-19 22:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:38.360923401 +0000 UTC m=+5418.632365911" watchObservedRunningTime="2026-02-19 22:58:38.368595165 +0000 UTC m=+5418.640037675"
Feb 19 22:58:39 crc kubenswrapper[4771]: I0219 22:58:39.364587 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" event={"ID":"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f","Type":"ContainerStarted","Data":"e62e486a0e3ee56695587359c5df9f4c18bcb1b82a6f0a28fc5e639f073517fa"}
Feb 19 22:58:39 crc kubenswrapper[4771]: I0219 22:58:39.365148 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:39 crc kubenswrapper[4771]: I0219 22:58:39.385842 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" podStartSLOduration=3.385808023 podStartE2EDuration="3.385808023s" podCreationTimestamp="2026-02-19 22:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:39.382076983 +0000 UTC m=+5419.653519453" watchObservedRunningTime="2026-02-19 22:58:39.385808023 +0000 UTC m=+5419.657250493"
Feb 19 22:58:41 crc kubenswrapper[4771]: I0219 22:58:41.384611 4771 generic.go:334] "Generic (PLEG): container finished" podID="b7a85e66-8f7a-416d-b0df-77c03113f6d1" containerID="321262661da65fb93b64589983526499089a587b4459fb2f5f71dbb0115041aa" exitCode=0
Feb 19 22:58:41 crc kubenswrapper[4771]: I0219 22:58:41.384734 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j885f" event={"ID":"b7a85e66-8f7a-416d-b0df-77c03113f6d1","Type":"ContainerDied","Data":"321262661da65fb93b64589983526499089a587b4459fb2f5f71dbb0115041aa"}
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.866337 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.894604 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-scripts\") pod \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") "
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.894701 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-combined-ca-bundle\") pod \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") "
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.907084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-scripts" (OuterVolumeSpecName: "scripts") pod "b7a85e66-8f7a-416d-b0df-77c03113f6d1" (UID: "b7a85e66-8f7a-416d-b0df-77c03113f6d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.926768 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7a85e66-8f7a-416d-b0df-77c03113f6d1" (UID: "b7a85e66-8f7a-416d-b0df-77c03113f6d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.997837 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-fernet-keys\") pod \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") "
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.997891 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-credential-keys\") pod \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") "
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.997919 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-config-data\") pod \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") "
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.997979 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm9lj\" (UniqueName: \"kubernetes.io/projected/b7a85e66-8f7a-416d-b0df-77c03113f6d1-kube-api-access-tm9lj\") pod \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\" (UID: \"b7a85e66-8f7a-416d-b0df-77c03113f6d1\") "
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.998456 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:42 crc kubenswrapper[4771]: I0219 22:58:42.998479 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.001365 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a85e66-8f7a-416d-b0df-77c03113f6d1-kube-api-access-tm9lj" (OuterVolumeSpecName: "kube-api-access-tm9lj") pod "b7a85e66-8f7a-416d-b0df-77c03113f6d1" (UID: "b7a85e66-8f7a-416d-b0df-77c03113f6d1"). InnerVolumeSpecName "kube-api-access-tm9lj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.003146 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b7a85e66-8f7a-416d-b0df-77c03113f6d1" (UID: "b7a85e66-8f7a-416d-b0df-77c03113f6d1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.004443 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b7a85e66-8f7a-416d-b0df-77c03113f6d1" (UID: "b7a85e66-8f7a-416d-b0df-77c03113f6d1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.038739 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-config-data" (OuterVolumeSpecName: "config-data") pod "b7a85e66-8f7a-416d-b0df-77c03113f6d1" (UID: "b7a85e66-8f7a-416d-b0df-77c03113f6d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.100222 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.100270 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.100290 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a85e66-8f7a-416d-b0df-77c03113f6d1-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.100311 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm9lj\" (UniqueName: \"kubernetes.io/projected/b7a85e66-8f7a-416d-b0df-77c03113f6d1-kube-api-access-tm9lj\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.415128 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j885f" event={"ID":"b7a85e66-8f7a-416d-b0df-77c03113f6d1","Type":"ContainerDied","Data":"218a604e32de7a64a92900c6554d07aae0af72392af6bd54c377dcc551c02667"}
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.415196 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="218a604e32de7a64a92900c6554d07aae0af72392af6bd54c377dcc551c02667"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.415343 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j885f"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.441111 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"
Feb 19 22:58:43 crc kubenswrapper[4771]: E0219 22:58:43.445282 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.525118 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j885f"]
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.538906 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j885f"]
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.606107 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-d2jgk"]
Feb 19 22:58:43 crc kubenswrapper[4771]: E0219 22:58:43.606616 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a85e66-8f7a-416d-b0df-77c03113f6d1" containerName="keystone-bootstrap"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.606636 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a85e66-8f7a-416d-b0df-77c03113f6d1" containerName="keystone-bootstrap"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.606938 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a85e66-8f7a-416d-b0df-77c03113f6d1" containerName="keystone-bootstrap"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.607926 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.610465 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.610847 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.610776 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.617523 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4kldh"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.617878 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.627820 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d2jgk"]
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.721360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-fernet-keys\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.721536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-scripts\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.721610 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-credential-keys\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.722146 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-combined-ca-bundle\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.722274 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-config-data\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.722306 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lc4\" (UniqueName: \"kubernetes.io/projected/2c81a18f-e0b7-4557-8789-62e97a5950af-kube-api-access-w8lc4\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.824746 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-credential-keys\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.824893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-combined-ca-bundle\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.825380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-config-data\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.825456 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lc4\" (UniqueName: \"kubernetes.io/projected/2c81a18f-e0b7-4557-8789-62e97a5950af-kube-api-access-w8lc4\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.825628 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-fernet-keys\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.825778 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-scripts\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.833358 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-credential-keys\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.834233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-combined-ca-bundle\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.834898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-fernet-keys\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.835126 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-scripts\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.835172 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-config-data\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.855404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lc4\" (UniqueName: \"kubernetes.io/projected/2c81a18f-e0b7-4557-8789-62e97a5950af-kube-api-access-w8lc4\") pod \"keystone-bootstrap-d2jgk\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:43 crc kubenswrapper[4771]: I0219 22:58:43.942411 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d2jgk"
Feb 19 22:58:44 crc kubenswrapper[4771]: I0219 22:58:44.454214 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a85e66-8f7a-416d-b0df-77c03113f6d1" path="/var/lib/kubelet/pods/b7a85e66-8f7a-416d-b0df-77c03113f6d1/volumes"
Feb 19 22:58:44 crc kubenswrapper[4771]: I0219 22:58:44.512513 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d2jgk"]
Feb 19 22:58:45 crc kubenswrapper[4771]: I0219 22:58:45.442508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d2jgk" event={"ID":"2c81a18f-e0b7-4557-8789-62e97a5950af","Type":"ContainerStarted","Data":"31e8257e5f63678562b42bdd4a7e3ab780e6923204c674c418a909706defdc99"}
Feb 19 22:58:45 crc kubenswrapper[4771]: I0219 22:58:45.442834 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d2jgk" event={"ID":"2c81a18f-e0b7-4557-8789-62e97a5950af","Type":"ContainerStarted","Data":"12387683510dd43d7d07aabfc65628f85c78822af104276f451fc2c0ce74451e"}
Feb 19 22:58:45 crc kubenswrapper[4771]: I0219 22:58:45.475299 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-d2jgk" podStartSLOduration=2.475275695 podStartE2EDuration="2.475275695s" podCreationTimestamp="2026-02-19 22:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:45.469528831 +0000 UTC m=+5425.740971321" watchObservedRunningTime="2026-02-19 22:58:45.475275695 +0000 UTC m=+5425.746718165"
Feb 19 22:58:46 crc kubenswrapper[4771]: I0219 22:58:46.937918 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf"
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.018728 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b7c64ccf-rltm6"]
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.019107 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" podUID="6539f0e3-8b84-4309-bc81-289831a669c7" containerName="dnsmasq-dns" containerID="cri-o://633c83fac7433f2ea1100a6b7b4b2e5bfc437c8f1ada1ef96e319423d17f483e" gracePeriod=10
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.461448 4771 generic.go:334] "Generic (PLEG): container finished" podID="6539f0e3-8b84-4309-bc81-289831a669c7" containerID="633c83fac7433f2ea1100a6b7b4b2e5bfc437c8f1ada1ef96e319423d17f483e" exitCode=0
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.461508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" event={"ID":"6539f0e3-8b84-4309-bc81-289831a669c7","Type":"ContainerDied","Data":"633c83fac7433f2ea1100a6b7b4b2e5bfc437c8f1ada1ef96e319423d17f483e"}
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.461682 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" event={"ID":"6539f0e3-8b84-4309-bc81-289831a669c7","Type":"ContainerDied","Data":"a842236a0f6d4698e0e991c1054f8ea023a96dcdd332b7c7428b1962c9888055"}
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.461715 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a842236a0f6d4698e0e991c1054f8ea023a96dcdd332b7c7428b1962c9888055"
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.525927 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6"
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.700721 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q8w2\" (UniqueName: \"kubernetes.io/projected/6539f0e3-8b84-4309-bc81-289831a669c7-kube-api-access-2q8w2\") pod \"6539f0e3-8b84-4309-bc81-289831a669c7\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") "
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.700821 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-sb\") pod \"6539f0e3-8b84-4309-bc81-289831a669c7\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") "
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.700854 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-nb\") pod \"6539f0e3-8b84-4309-bc81-289831a669c7\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") "
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.700877 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-config\") pod \"6539f0e3-8b84-4309-bc81-289831a669c7\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") "
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.700987 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-dns-svc\") pod \"6539f0e3-8b84-4309-bc81-289831a669c7\" (UID: \"6539f0e3-8b84-4309-bc81-289831a669c7\") "
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.716415 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6539f0e3-8b84-4309-bc81-289831a669c7-kube-api-access-2q8w2" (OuterVolumeSpecName: "kube-api-access-2q8w2") pod "6539f0e3-8b84-4309-bc81-289831a669c7" (UID: "6539f0e3-8b84-4309-bc81-289831a669c7"). InnerVolumeSpecName "kube-api-access-2q8w2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.742480 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6539f0e3-8b84-4309-bc81-289831a669c7" (UID: "6539f0e3-8b84-4309-bc81-289831a669c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.746651 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-config" (OuterVolumeSpecName: "config") pod "6539f0e3-8b84-4309-bc81-289831a669c7" (UID: "6539f0e3-8b84-4309-bc81-289831a669c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.746819 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6539f0e3-8b84-4309-bc81-289831a669c7" (UID: "6539f0e3-8b84-4309-bc81-289831a669c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.765213 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6539f0e3-8b84-4309-bc81-289831a669c7" (UID: "6539f0e3-8b84-4309-bc81-289831a669c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.803007 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q8w2\" (UniqueName: \"kubernetes.io/projected/6539f0e3-8b84-4309-bc81-289831a669c7-kube-api-access-2q8w2\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.803071 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.803082 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.803096 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4771]: I0219 22:58:47.803108 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6539f0e3-8b84-4309-bc81-289831a669c7-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:48 crc kubenswrapper[4771]: I0219 22:58:48.474530 4771 generic.go:334] "Generic (PLEG): container finished" podID="2c81a18f-e0b7-4557-8789-62e97a5950af" containerID="31e8257e5f63678562b42bdd4a7e3ab780e6923204c674c418a909706defdc99" exitCode=0
Feb 19 22:58:48 crc kubenswrapper[4771]: I0219 22:58:48.474589 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d2jgk" event={"ID":"2c81a18f-e0b7-4557-8789-62e97a5950af","Type":"ContainerDied","Data":"31e8257e5f63678562b42bdd4a7e3ab780e6923204c674c418a909706defdc99"}
Feb 19 22:58:48 crc kubenswrapper[4771]: I0219
22:58:48.474654 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b7c64ccf-rltm6" Feb 19 22:58:48 crc kubenswrapper[4771]: I0219 22:58:48.539562 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b7c64ccf-rltm6"] Feb 19 22:58:48 crc kubenswrapper[4771]: I0219 22:58:48.550643 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b7c64ccf-rltm6"] Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.045518 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d2jgk" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.144960 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-combined-ca-bundle\") pod \"2c81a18f-e0b7-4557-8789-62e97a5950af\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.145048 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-scripts\") pod \"2c81a18f-e0b7-4557-8789-62e97a5950af\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.145154 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-credential-keys\") pod \"2c81a18f-e0b7-4557-8789-62e97a5950af\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.145209 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8lc4\" (UniqueName: \"kubernetes.io/projected/2c81a18f-e0b7-4557-8789-62e97a5950af-kube-api-access-w8lc4\") pod 
\"2c81a18f-e0b7-4557-8789-62e97a5950af\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.145258 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-config-data\") pod \"2c81a18f-e0b7-4557-8789-62e97a5950af\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.145338 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-fernet-keys\") pod \"2c81a18f-e0b7-4557-8789-62e97a5950af\" (UID: \"2c81a18f-e0b7-4557-8789-62e97a5950af\") " Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.150868 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2c81a18f-e0b7-4557-8789-62e97a5950af" (UID: "2c81a18f-e0b7-4557-8789-62e97a5950af"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.151237 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-scripts" (OuterVolumeSpecName: "scripts") pod "2c81a18f-e0b7-4557-8789-62e97a5950af" (UID: "2c81a18f-e0b7-4557-8789-62e97a5950af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.152612 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c81a18f-e0b7-4557-8789-62e97a5950af-kube-api-access-w8lc4" (OuterVolumeSpecName: "kube-api-access-w8lc4") pod "2c81a18f-e0b7-4557-8789-62e97a5950af" (UID: "2c81a18f-e0b7-4557-8789-62e97a5950af"). 
InnerVolumeSpecName "kube-api-access-w8lc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.154068 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2c81a18f-e0b7-4557-8789-62e97a5950af" (UID: "2c81a18f-e0b7-4557-8789-62e97a5950af"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.172377 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-config-data" (OuterVolumeSpecName: "config-data") pod "2c81a18f-e0b7-4557-8789-62e97a5950af" (UID: "2c81a18f-e0b7-4557-8789-62e97a5950af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.187376 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c81a18f-e0b7-4557-8789-62e97a5950af" (UID: "2c81a18f-e0b7-4557-8789-62e97a5950af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.248142 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.248193 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8lc4\" (UniqueName: \"kubernetes.io/projected/2c81a18f-e0b7-4557-8789-62e97a5950af-kube-api-access-w8lc4\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.248214 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.248232 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.248250 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.248267 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c81a18f-e0b7-4557-8789-62e97a5950af-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.456484 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6539f0e3-8b84-4309-bc81-289831a669c7" path="/var/lib/kubelet/pods/6539f0e3-8b84-4309-bc81-289831a669c7/volumes" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.503341 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-d2jgk" event={"ID":"2c81a18f-e0b7-4557-8789-62e97a5950af","Type":"ContainerDied","Data":"12387683510dd43d7d07aabfc65628f85c78822af104276f451fc2c0ce74451e"} Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.503412 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12387683510dd43d7d07aabfc65628f85c78822af104276f451fc2c0ce74451e" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.503428 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d2jgk" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.636903 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7767cf9677-cqzdw"] Feb 19 22:58:50 crc kubenswrapper[4771]: E0219 22:58:50.637416 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c81a18f-e0b7-4557-8789-62e97a5950af" containerName="keystone-bootstrap" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.637439 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c81a18f-e0b7-4557-8789-62e97a5950af" containerName="keystone-bootstrap" Feb 19 22:58:50 crc kubenswrapper[4771]: E0219 22:58:50.637476 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6539f0e3-8b84-4309-bc81-289831a669c7" containerName="init" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.637490 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6539f0e3-8b84-4309-bc81-289831a669c7" containerName="init" Feb 19 22:58:50 crc kubenswrapper[4771]: E0219 22:58:50.637520 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6539f0e3-8b84-4309-bc81-289831a669c7" containerName="dnsmasq-dns" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.637533 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6539f0e3-8b84-4309-bc81-289831a669c7" containerName="dnsmasq-dns" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.637795 
4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6539f0e3-8b84-4309-bc81-289831a669c7" containerName="dnsmasq-dns" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.637847 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c81a18f-e0b7-4557-8789-62e97a5950af" containerName="keystone-bootstrap" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.638829 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.642745 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.643552 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.644532 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.644823 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4kldh" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.644888 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.645589 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.678555 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7767cf9677-cqzdw"] Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.759051 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-fernet-keys\") pod \"keystone-7767cf9677-cqzdw\" (UID: 
\"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.759378 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-public-tls-certs\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.759469 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-credential-keys\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.759560 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-internal-tls-certs\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.759641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-scripts\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.759716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-combined-ca-bundle\") pod \"keystone-7767cf9677-cqzdw\" 
(UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.759784 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-config-data\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.759867 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8slj\" (UniqueName: \"kubernetes.io/projected/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-kube-api-access-r8slj\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.861072 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-internal-tls-certs\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.861123 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-scripts\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.861148 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-combined-ca-bundle\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") 
" pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.861164 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-config-data\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.861186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8slj\" (UniqueName: \"kubernetes.io/projected/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-kube-api-access-r8slj\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.861230 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-fernet-keys\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.861317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-public-tls-certs\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.861334 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-credential-keys\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: 
I0219 22:58:50.869069 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-config-data\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.872721 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-credential-keys\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.872896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-fernet-keys\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.872926 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-public-tls-certs\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.873238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-combined-ca-bundle\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.873266 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-internal-tls-certs\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.873429 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-scripts\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.879122 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8slj\" (UniqueName: \"kubernetes.io/projected/88a37d62-ce17-4ba1-b2fe-5cede6afb25c-kube-api-access-r8slj\") pod \"keystone-7767cf9677-cqzdw\" (UID: \"88a37d62-ce17-4ba1-b2fe-5cede6afb25c\") " pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:50 crc kubenswrapper[4771]: I0219 22:58:50.963303 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:51 crc kubenswrapper[4771]: I0219 22:58:51.439464 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7767cf9677-cqzdw"] Feb 19 22:58:52 crc kubenswrapper[4771]: I0219 22:58:52.526567 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7767cf9677-cqzdw" event={"ID":"88a37d62-ce17-4ba1-b2fe-5cede6afb25c","Type":"ContainerStarted","Data":"68d788858612374dd9476f3dff46fa26dd4122f6de5153f961e8b5fb8d2afc56"} Feb 19 22:58:52 crc kubenswrapper[4771]: I0219 22:58:52.526638 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7767cf9677-cqzdw" event={"ID":"88a37d62-ce17-4ba1-b2fe-5cede6afb25c","Type":"ContainerStarted","Data":"8b96933da7bc83668b0e87705727d00288d01621f265158bdfff35db8461c4ee"} Feb 19 22:58:52 crc kubenswrapper[4771]: I0219 22:58:52.526784 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:58:52 crc kubenswrapper[4771]: I0219 22:58:52.561981 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7767cf9677-cqzdw" podStartSLOduration=2.561954638 podStartE2EDuration="2.561954638s" podCreationTimestamp="2026-02-19 22:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:52.558312131 +0000 UTC m=+5432.829754711" watchObservedRunningTime="2026-02-19 22:58:52.561954638 +0000 UTC m=+5432.833397128" Feb 19 22:58:55 crc kubenswrapper[4771]: I0219 22:58:55.437201 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:58:55 crc kubenswrapper[4771]: E0219 22:58:55.438248 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:59:07 crc kubenswrapper[4771]: I0219 22:59:07.437638 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:59:07 crc kubenswrapper[4771]: E0219 22:59:07.438862 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:59:22 crc kubenswrapper[4771]: I0219 22:59:22.438343 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 22:59:22 crc kubenswrapper[4771]: E0219 22:59:22.439521 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 22:59:22 crc kubenswrapper[4771]: I0219 22:59:22.471321 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7767cf9677-cqzdw" Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.512015 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 
22:59:25.514376 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.518369 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.519006 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.521343 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-558vn"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.529034 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.622232 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config-secret\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.622312 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vdkm\" (UniqueName: \"kubernetes.io/projected/200f4663-56b8-47de-a506-2345c7d42ef9-kube-api-access-8vdkm\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.622792 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.623113 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.724830 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.724994 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.725116 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config-secret\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.725171 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vdkm\" (UniqueName: \"kubernetes.io/projected/200f4663-56b8-47de-a506-2345c7d42ef9-kube-api-access-8vdkm\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.726865 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.731097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.737789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config-secret\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.746005 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vdkm\" (UniqueName: \"kubernetes.io/projected/200f4663-56b8-47de-a506-2345c7d42ef9-kube-api-access-8vdkm\") pod \"openstackclient\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " pod="openstack/openstackclient"
Feb 19 22:59:25 crc kubenswrapper[4771]: I0219 22:59:25.862969 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 22:59:26 crc kubenswrapper[4771]: I0219 22:59:26.337835 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 22:59:26 crc kubenswrapper[4771]: I0219 22:59:26.897841 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"200f4663-56b8-47de-a506-2345c7d42ef9","Type":"ContainerStarted","Data":"98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968"}
Feb 19 22:59:26 crc kubenswrapper[4771]: I0219 22:59:26.899640 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"200f4663-56b8-47de-a506-2345c7d42ef9","Type":"ContainerStarted","Data":"c2da1964a2c94230b710274b57ac2fc6ad1d2b560574e0f894968fd4b351c41b"}
Feb 19 22:59:26 crc kubenswrapper[4771]: I0219 22:59:26.920405 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.920384241 podStartE2EDuration="1.920384241s" podCreationTimestamp="2026-02-19 22:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:26.917993638 +0000 UTC m=+5467.189436128" watchObservedRunningTime="2026-02-19 22:59:26.920384241 +0000 UTC m=+5467.191826721"
Feb 19 22:59:36 crc kubenswrapper[4771]: I0219 22:59:36.437741 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"
Feb 19 22:59:36 crc kubenswrapper[4771]: E0219 22:59:36.438907 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 22:59:51 crc kubenswrapper[4771]: I0219 22:59:51.437391 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"
Feb 19 22:59:51 crc kubenswrapper[4771]: E0219 22:59:51.438354 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.184821 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"]
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.186616 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.188172 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.189991 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.214732 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"]
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.380650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1226276-1244-42cc-8f6b-8fcf393451ab-secret-volume\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.380700 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1226276-1244-42cc-8f6b-8fcf393451ab-config-volume\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.380800 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqzhd\" (UniqueName: \"kubernetes.io/projected/e1226276-1244-42cc-8f6b-8fcf393451ab-kube-api-access-kqzhd\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.482785 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1226276-1244-42cc-8f6b-8fcf393451ab-secret-volume\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.482846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1226276-1244-42cc-8f6b-8fcf393451ab-config-volume\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.482922 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqzhd\" (UniqueName: \"kubernetes.io/projected/e1226276-1244-42cc-8f6b-8fcf393451ab-kube-api-access-kqzhd\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.483901 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1226276-1244-42cc-8f6b-8fcf393451ab-config-volume\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.487823 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1226276-1244-42cc-8f6b-8fcf393451ab-secret-volume\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.505876 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqzhd\" (UniqueName: \"kubernetes.io/projected/e1226276-1244-42cc-8f6b-8fcf393451ab-kube-api-access-kqzhd\") pod \"collect-profiles-29525700-rr5wm\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:00 crc kubenswrapper[4771]: I0219 23:00:00.803402 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:01 crc kubenswrapper[4771]: I0219 23:00:01.273736 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"]
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.101511 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lkzdx"]
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.103521 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.108273 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lkzdx"]
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.209533 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-utilities\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.209599 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-catalog-content\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.209806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44kgh\" (UniqueName: \"kubernetes.io/projected/2361600c-2ad8-4f1e-a1cb-80aafeda1785-kube-api-access-44kgh\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.291623 4771 generic.go:334] "Generic (PLEG): container finished" podID="e1226276-1244-42cc-8f6b-8fcf393451ab" containerID="d47b98d930e2a03a9f2974f41dd40d0fc4e5cf703e829ee8c2222528e0ff75d7" exitCode=0
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.291691 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm" event={"ID":"e1226276-1244-42cc-8f6b-8fcf393451ab","Type":"ContainerDied","Data":"d47b98d930e2a03a9f2974f41dd40d0fc4e5cf703e829ee8c2222528e0ff75d7"}
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.291725 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm" event={"ID":"e1226276-1244-42cc-8f6b-8fcf393451ab","Type":"ContainerStarted","Data":"f667becb437fe8adea9d62c56a449b26902fea61fb1ec8b7dded56fecb41a089"}
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.315120 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-utilities\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.315178 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-catalog-content\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.315277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44kgh\" (UniqueName: \"kubernetes.io/projected/2361600c-2ad8-4f1e-a1cb-80aafeda1785-kube-api-access-44kgh\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.315812 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-utilities\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.316138 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-catalog-content\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.338462 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44kgh\" (UniqueName: \"kubernetes.io/projected/2361600c-2ad8-4f1e-a1cb-80aafeda1785-kube-api-access-44kgh\") pod \"community-operators-lkzdx\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") " pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.419309 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:02 crc kubenswrapper[4771]: I0219 23:00:02.915252 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lkzdx"]
Feb 19 23:00:03 crc kubenswrapper[4771]: W0219 23:00:03.388508 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2361600c_2ad8_4f1e_a1cb_80aafeda1785.slice/crio-b6b2babdb67172c0bdd424660d349b4ac606226967e80cff0982b57b48c04542 WatchSource:0}: Error finding container b6b2babdb67172c0bdd424660d349b4ac606226967e80cff0982b57b48c04542: Status 404 returned error can't find the container with id b6b2babdb67172c0bdd424660d349b4ac606226967e80cff0982b57b48c04542
Feb 19 23:00:03 crc kubenswrapper[4771]: I0219 23:00:03.901160 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.053208 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1226276-1244-42cc-8f6b-8fcf393451ab-config-volume\") pod \"e1226276-1244-42cc-8f6b-8fcf393451ab\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") "
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.053601 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1226276-1244-42cc-8f6b-8fcf393451ab-secret-volume\") pod \"e1226276-1244-42cc-8f6b-8fcf393451ab\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") "
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.053676 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqzhd\" (UniqueName: \"kubernetes.io/projected/e1226276-1244-42cc-8f6b-8fcf393451ab-kube-api-access-kqzhd\") pod \"e1226276-1244-42cc-8f6b-8fcf393451ab\" (UID: \"e1226276-1244-42cc-8f6b-8fcf393451ab\") "
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.054380 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1226276-1244-42cc-8f6b-8fcf393451ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "e1226276-1244-42cc-8f6b-8fcf393451ab" (UID: "e1226276-1244-42cc-8f6b-8fcf393451ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.058711 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1226276-1244-42cc-8f6b-8fcf393451ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e1226276-1244-42cc-8f6b-8fcf393451ab" (UID: "e1226276-1244-42cc-8f6b-8fcf393451ab"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.059127 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1226276-1244-42cc-8f6b-8fcf393451ab-kube-api-access-kqzhd" (OuterVolumeSpecName: "kube-api-access-kqzhd") pod "e1226276-1244-42cc-8f6b-8fcf393451ab" (UID: "e1226276-1244-42cc-8f6b-8fcf393451ab"). InnerVolumeSpecName "kube-api-access-kqzhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.156464 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1226276-1244-42cc-8f6b-8fcf393451ab-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.156553 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqzhd\" (UniqueName: \"kubernetes.io/projected/e1226276-1244-42cc-8f6b-8fcf393451ab-kube-api-access-kqzhd\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.156584 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1226276-1244-42cc-8f6b-8fcf393451ab-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.314479 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.314525 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm" event={"ID":"e1226276-1244-42cc-8f6b-8fcf393451ab","Type":"ContainerDied","Data":"f667becb437fe8adea9d62c56a449b26902fea61fb1ec8b7dded56fecb41a089"}
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.314598 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f667becb437fe8adea9d62c56a449b26902fea61fb1ec8b7dded56fecb41a089"
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.316587 4771 generic.go:334] "Generic (PLEG): container finished" podID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerID="5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961" exitCode=0
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.316630 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkzdx" event={"ID":"2361600c-2ad8-4f1e-a1cb-80aafeda1785","Type":"ContainerDied","Data":"5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961"}
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.316676 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkzdx" event={"ID":"2361600c-2ad8-4f1e-a1cb-80aafeda1785","Type":"ContainerStarted","Data":"b6b2babdb67172c0bdd424660d349b4ac606226967e80cff0982b57b48c04542"}
Feb 19 23:00:04 crc kubenswrapper[4771]: I0219 23:00:04.994986 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"]
Feb 19 23:00:05 crc kubenswrapper[4771]: I0219 23:00:05.004562 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-qsx9f"]
Feb 19 23:00:05 crc kubenswrapper[4771]: I0219 23:00:05.326433 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkzdx" event={"ID":"2361600c-2ad8-4f1e-a1cb-80aafeda1785","Type":"ContainerStarted","Data":"3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331"}
Feb 19 23:00:06 crc kubenswrapper[4771]: I0219 23:00:06.358493 4771 generic.go:334] "Generic (PLEG): container finished" podID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerID="3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331" exitCode=0
Feb 19 23:00:06 crc kubenswrapper[4771]: I0219 23:00:06.358578 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkzdx" event={"ID":"2361600c-2ad8-4f1e-a1cb-80aafeda1785","Type":"ContainerDied","Data":"3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331"}
Feb 19 23:00:06 crc kubenswrapper[4771]: I0219 23:00:06.437452 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"
Feb 19 23:00:06 crc kubenswrapper[4771]: E0219 23:00:06.437663 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:00:06 crc kubenswrapper[4771]: I0219 23:00:06.447548 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e61c602-b6d1-42a1-a13d-19d61fcb9e12" path="/var/lib/kubelet/pods/8e61c602-b6d1-42a1-a13d-19d61fcb9e12/volumes"
Feb 19 23:00:08 crc kubenswrapper[4771]: I0219 23:00:08.387381 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkzdx" event={"ID":"2361600c-2ad8-4f1e-a1cb-80aafeda1785","Type":"ContainerStarted","Data":"0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14"}
Feb 19 23:00:08 crc kubenswrapper[4771]: I0219 23:00:08.413292 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lkzdx" podStartSLOduration=3.6050460920000003 podStartE2EDuration="6.413260826s" podCreationTimestamp="2026-02-19 23:00:02 +0000 UTC" firstStartedPulling="2026-02-19 23:00:04.319356781 +0000 UTC m=+5504.590799251" lastFinishedPulling="2026-02-19 23:00:07.127571485 +0000 UTC m=+5507.399013985" observedRunningTime="2026-02-19 23:00:08.410317638 +0000 UTC m=+5508.681760148" watchObservedRunningTime="2026-02-19 23:00:08.413260826 +0000 UTC m=+5508.684703296"
Feb 19 23:00:12 crc kubenswrapper[4771]: I0219 23:00:12.420497 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:12 crc kubenswrapper[4771]: I0219 23:00:12.420927 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:12 crc kubenswrapper[4771]: I0219 23:00:12.925716 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:13 crc kubenswrapper[4771]: I0219 23:00:13.522335 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:13 crc kubenswrapper[4771]: I0219 23:00:13.588799 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lkzdx"]
Feb 19 23:00:15 crc kubenswrapper[4771]: I0219 23:00:15.458972 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lkzdx" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerName="registry-server" containerID="cri-o://0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14" gracePeriod=2
Feb 19 23:00:15 crc kubenswrapper[4771]: I0219 23:00:15.992820 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.080472 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-catalog-content\") pod \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") "
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.081078 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44kgh\" (UniqueName: \"kubernetes.io/projected/2361600c-2ad8-4f1e-a1cb-80aafeda1785-kube-api-access-44kgh\") pod \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") "
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.081280 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-utilities\") pod \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\" (UID: \"2361600c-2ad8-4f1e-a1cb-80aafeda1785\") "
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.083202 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-utilities" (OuterVolumeSpecName: "utilities") pod "2361600c-2ad8-4f1e-a1cb-80aafeda1785" (UID: "2361600c-2ad8-4f1e-a1cb-80aafeda1785"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.094310 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2361600c-2ad8-4f1e-a1cb-80aafeda1785-kube-api-access-44kgh" (OuterVolumeSpecName: "kube-api-access-44kgh") pod "2361600c-2ad8-4f1e-a1cb-80aafeda1785" (UID: "2361600c-2ad8-4f1e-a1cb-80aafeda1785"). InnerVolumeSpecName "kube-api-access-44kgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.148312 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2361600c-2ad8-4f1e-a1cb-80aafeda1785" (UID: "2361600c-2ad8-4f1e-a1cb-80aafeda1785"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.183989 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.184098 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2361600c-2ad8-4f1e-a1cb-80aafeda1785-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.184131 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44kgh\" (UniqueName: \"kubernetes.io/projected/2361600c-2ad8-4f1e-a1cb-80aafeda1785-kube-api-access-44kgh\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.472570 4771 generic.go:334] "Generic (PLEG): container finished" podID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerID="0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14" exitCode=0
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.472996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkzdx" event={"ID":"2361600c-2ad8-4f1e-a1cb-80aafeda1785","Type":"ContainerDied","Data":"0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14"}
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.473084 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkzdx" event={"ID":"2361600c-2ad8-4f1e-a1cb-80aafeda1785","Type":"ContainerDied","Data":"b6b2babdb67172c0bdd424660d349b4ac606226967e80cff0982b57b48c04542"}
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.473146 4771 scope.go:117] "RemoveContainer" containerID="0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.473456 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lkzdx"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.524857 4771 scope.go:117] "RemoveContainer" containerID="3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.532131 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lkzdx"]
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.544240 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lkzdx"]
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.552121 4771 scope.go:117] "RemoveContainer" containerID="5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.611684 4771 scope.go:117] "RemoveContainer" containerID="0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14"
Feb 19 23:00:16 crc kubenswrapper[4771]: E0219 23:00:16.612277 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14\": container with ID starting with 0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14 not found: ID does not exist" containerID="0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.612514 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14"} err="failed to get container status \"0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14\": rpc error: code = NotFound desc = could not find container \"0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14\": container with ID starting with 0b5ba794d84b797f6dbbf869b9f41c69785d9786d58427c4aa011b7df2493d14 not found: ID does not exist"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.612742 4771 scope.go:117] "RemoveContainer" containerID="3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331"
Feb 19 23:00:16 crc kubenswrapper[4771]: E0219 23:00:16.614105 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331\": container with ID starting with 3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331 not found: ID does not exist" containerID="3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.614147 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331"} err="failed to get container status \"3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331\": rpc error: code = NotFound desc = could not find container \"3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331\": container with ID starting with 3416ac99bebb65bfaf5550736b66116f61daa47a4192a6285ce67759b3d08331 not found: ID does not exist"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.614169 4771 scope.go:117] "RemoveContainer" containerID="5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961"
Feb 19 23:00:16 crc kubenswrapper[4771]: E0219 23:00:16.614597 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961\": container with ID starting with 5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961 not found: ID does not exist" containerID="5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961"
Feb 19 23:00:16 crc kubenswrapper[4771]: I0219 23:00:16.614640 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961"} err="failed to get container status \"5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961\": rpc error: code = NotFound desc = could not find container \"5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961\": container with ID starting with 5718e06b1f8b99e94e93a708bb2a9a2fe5364a545eedb71d47ec9fc122e3d961 not found: ID does not exist"
Feb 19 23:00:18 crc kubenswrapper[4771]: I0219 23:00:18.450368 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" path="/var/lib/kubelet/pods/2361600c-2ad8-4f1e-a1cb-80aafeda1785/volumes"
Feb 19 23:00:20 crc kubenswrapper[4771]: I0219 23:00:20.446824 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"
Feb 19 23:00:20 crc kubenswrapper[4771]: E0219 23:00:20.447558 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:00:34 crc kubenswrapper[4771]: I0219 23:00:34.438270 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"
Feb 19 23:00:34 crc kubenswrapper[4771]: E0219 23:00:34.441277 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:00:35 crc kubenswrapper[4771]: I0219 23:00:35.183206 4771 scope.go:117] "RemoveContainer" containerID="a1aec04422339f27f4205ac8acf7008279f11cb4fb400e79e945f193d2a9be7c"
Feb 19 23:00:47 crc kubenswrapper[4771]: I0219 23:00:47.436937 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"
Feb 19 23:00:47 crc kubenswrapper[4771]: E0219 23:00:47.437641 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:00:50 crc kubenswrapper[4771]: E0219 23:00:50.990991 4771 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.248:51036->38.102.83.248:36635: write tcp 38.102.83.248:51036->38.102.83.248:36635: write: broken pipe
Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.162634 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525701-jn7pz"]
Feb 19 23:01:00 crc kubenswrapper[4771]: E0219 23:01:00.163898 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerName="registry-server"
Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.163920 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerName="registry-server"
Feb 19 23:01:00 crc kubenswrapper[4771]: E0219 23:01:00.163946 4771 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.163959 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4771]: E0219 23:01:00.163989 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.164003 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4771]: E0219 23:01:00.164047 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1226276-1244-42cc-8f6b-8fcf393451ab" containerName="collect-profiles" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.164059 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1226276-1244-42cc-8f6b-8fcf393451ab" containerName="collect-profiles" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.164351 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2361600c-2ad8-4f1e-a1cb-80aafeda1785" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.164378 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1226276-1244-42cc-8f6b-8fcf393451ab" containerName="collect-profiles" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.165174 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.179355 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525701-jn7pz"] Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.273527 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-combined-ca-bundle\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.273577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-fernet-keys\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.273631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-config-data\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.273716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpzcc\" (UniqueName: \"kubernetes.io/projected/1a65030d-bcc9-42d3-80d9-66a100f1882d-kube-api-access-zpzcc\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.376912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-fernet-keys\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.376963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-combined-ca-bundle\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.377012 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-config-data\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.377115 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpzcc\" (UniqueName: \"kubernetes.io/projected/1a65030d-bcc9-42d3-80d9-66a100f1882d-kube-api-access-zpzcc\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.389383 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-config-data\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.389544 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-combined-ca-bundle\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.390512 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-fernet-keys\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.411444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpzcc\" (UniqueName: \"kubernetes.io/projected/1a65030d-bcc9-42d3-80d9-66a100f1882d-kube-api-access-zpzcc\") pod \"keystone-cron-29525701-jn7pz\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.443372 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 23:01:00 crc kubenswrapper[4771]: E0219 23:01:00.443797 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.495218 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.777782 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525701-jn7pz"] Feb 19 23:01:00 crc kubenswrapper[4771]: I0219 23:01:00.915449 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-jn7pz" event={"ID":"1a65030d-bcc9-42d3-80d9-66a100f1882d","Type":"ContainerStarted","Data":"28fc9b20fe8a341e8ddc3ac5d040ce2a2c0a24bba9584072ab8bda9e8e1cd3fd"} Feb 19 23:01:01 crc kubenswrapper[4771]: I0219 23:01:01.927393 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-jn7pz" event={"ID":"1a65030d-bcc9-42d3-80d9-66a100f1882d","Type":"ContainerStarted","Data":"2674309b1c9692688b90f09271717be1a556821c8d835db400cb68a898784022"} Feb 19 23:01:01 crc kubenswrapper[4771]: I0219 23:01:01.948742 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525701-jn7pz" podStartSLOduration=1.948718268 podStartE2EDuration="1.948718268s" podCreationTimestamp="2026-02-19 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:01.948542224 +0000 UTC m=+5562.219984714" watchObservedRunningTime="2026-02-19 23:01:01.948718268 +0000 UTC m=+5562.220160778" Feb 19 23:01:02 crc kubenswrapper[4771]: I0219 23:01:02.945765 4771 generic.go:334] "Generic (PLEG): container finished" podID="1a65030d-bcc9-42d3-80d9-66a100f1882d" containerID="2674309b1c9692688b90f09271717be1a556821c8d835db400cb68a898784022" exitCode=0 Feb 19 23:01:02 crc kubenswrapper[4771]: I0219 23:01:02.945852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-jn7pz" 
event={"ID":"1a65030d-bcc9-42d3-80d9-66a100f1882d","Type":"ContainerDied","Data":"2674309b1c9692688b90f09271717be1a556821c8d835db400cb68a898784022"} Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.610146 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7djsk"] Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.613119 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.620311 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7djsk"] Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.744578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-utilities\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.745046 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgq52\" (UniqueName: \"kubernetes.io/projected/1b632835-f9cf-43b8-a530-099b7257c60c-kube-api-access-lgq52\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.745254 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-catalog-content\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.846618 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-catalog-content\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.847000 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-utilities\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.847197 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-catalog-content\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.847345 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgq52\" (UniqueName: \"kubernetes.io/projected/1b632835-f9cf-43b8-a530-099b7257c60c-kube-api-access-lgq52\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:03 crc kubenswrapper[4771]: I0219 23:01:03.847891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-utilities\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.261778 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgq52\" 
(UniqueName: \"kubernetes.io/projected/1b632835-f9cf-43b8-a530-099b7257c60c-kube-api-access-lgq52\") pod \"redhat-operators-7djsk\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.280817 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.385077 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.558829 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-fernet-keys\") pod \"1a65030d-bcc9-42d3-80d9-66a100f1882d\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.558901 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-config-data\") pod \"1a65030d-bcc9-42d3-80d9-66a100f1882d\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.558971 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpzcc\" (UniqueName: \"kubernetes.io/projected/1a65030d-bcc9-42d3-80d9-66a100f1882d-kube-api-access-zpzcc\") pod \"1a65030d-bcc9-42d3-80d9-66a100f1882d\" (UID: \"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.559106 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-combined-ca-bundle\") pod \"1a65030d-bcc9-42d3-80d9-66a100f1882d\" (UID: 
\"1a65030d-bcc9-42d3-80d9-66a100f1882d\") " Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.581541 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1a65030d-bcc9-42d3-80d9-66a100f1882d" (UID: "1a65030d-bcc9-42d3-80d9-66a100f1882d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.581617 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a65030d-bcc9-42d3-80d9-66a100f1882d-kube-api-access-zpzcc" (OuterVolumeSpecName: "kube-api-access-zpzcc") pod "1a65030d-bcc9-42d3-80d9-66a100f1882d" (UID: "1a65030d-bcc9-42d3-80d9-66a100f1882d"). InnerVolumeSpecName "kube-api-access-zpzcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.601877 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-config-data" (OuterVolumeSpecName: "config-data") pod "1a65030d-bcc9-42d3-80d9-66a100f1882d" (UID: "1a65030d-bcc9-42d3-80d9-66a100f1882d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.604439 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a65030d-bcc9-42d3-80d9-66a100f1882d" (UID: "1a65030d-bcc9-42d3-80d9-66a100f1882d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.666729 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.667032 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.667100 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a65030d-bcc9-42d3-80d9-66a100f1882d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.667155 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpzcc\" (UniqueName: \"kubernetes.io/projected/1a65030d-bcc9-42d3-80d9-66a100f1882d-kube-api-access-zpzcc\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.792185 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7djsk"] Feb 19 23:01:04 crc kubenswrapper[4771]: W0219 23:01:04.812543 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b632835_f9cf_43b8_a530_099b7257c60c.slice/crio-d07b3a46b69447ce0641eec1db0d6dc3fc7dd9a1efc8bf7cc05104562a658e32 WatchSource:0}: Error finding container d07b3a46b69447ce0641eec1db0d6dc3fc7dd9a1efc8bf7cc05104562a658e32: Status 404 returned error can't find the container with id d07b3a46b69447ce0641eec1db0d6dc3fc7dd9a1efc8bf7cc05104562a658e32 Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.965846 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7djsk" event={"ID":"1b632835-f9cf-43b8-a530-099b7257c60c","Type":"ContainerStarted","Data":"8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a"} Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.966186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7djsk" event={"ID":"1b632835-f9cf-43b8-a530-099b7257c60c","Type":"ContainerStarted","Data":"d07b3a46b69447ce0641eec1db0d6dc3fc7dd9a1efc8bf7cc05104562a658e32"} Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.968095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-jn7pz" event={"ID":"1a65030d-bcc9-42d3-80d9-66a100f1882d","Type":"ContainerDied","Data":"28fc9b20fe8a341e8ddc3ac5d040ce2a2c0a24bba9584072ab8bda9e8e1cd3fd"} Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.968226 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28fc9b20fe8a341e8ddc3ac5d040ce2a2c0a24bba9584072ab8bda9e8e1cd3fd" Feb 19 23:01:04 crc kubenswrapper[4771]: I0219 23:01:04.968349 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-jn7pz" Feb 19 23:01:05 crc kubenswrapper[4771]: I0219 23:01:05.981958 4771 generic.go:334] "Generic (PLEG): container finished" podID="1b632835-f9cf-43b8-a530-099b7257c60c" containerID="8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a" exitCode=0 Feb 19 23:01:05 crc kubenswrapper[4771]: I0219 23:01:05.982098 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7djsk" event={"ID":"1b632835-f9cf-43b8-a530-099b7257c60c","Type":"ContainerDied","Data":"8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a"} Feb 19 23:01:07 crc kubenswrapper[4771]: E0219 23:01:07.876375 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b632835_f9cf_43b8_a530_099b7257c60c.slice/crio-3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b632835_f9cf_43b8_a530_099b7257c60c.slice/crio-conmon-3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229.scope\": RecentStats: unable to find data in memory cache]" Feb 19 23:01:08 crc kubenswrapper[4771]: I0219 23:01:08.004933 4771 generic.go:334] "Generic (PLEG): container finished" podID="1b632835-f9cf-43b8-a530-099b7257c60c" containerID="3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229" exitCode=0 Feb 19 23:01:08 crc kubenswrapper[4771]: I0219 23:01:08.005076 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7djsk" event={"ID":"1b632835-f9cf-43b8-a530-099b7257c60c","Type":"ContainerDied","Data":"3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229"} Feb 19 23:01:09 crc kubenswrapper[4771]: I0219 23:01:09.016437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7djsk" event={"ID":"1b632835-f9cf-43b8-a530-099b7257c60c","Type":"ContainerStarted","Data":"4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9"} Feb 19 23:01:09 crc kubenswrapper[4771]: I0219 23:01:09.043254 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7djsk" podStartSLOduration=3.627486984 podStartE2EDuration="6.043228863s" podCreationTimestamp="2026-02-19 23:01:03 +0000 UTC" firstStartedPulling="2026-02-19 23:01:05.984896814 +0000 UTC m=+5566.256339324" lastFinishedPulling="2026-02-19 23:01:08.400638703 +0000 UTC m=+5568.672081203" observedRunningTime="2026-02-19 23:01:09.040353026 +0000 UTC m=+5569.311795546" watchObservedRunningTime="2026-02-19 23:01:09.043228863 +0000 UTC m=+5569.314671383" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.041859 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-tgc56"] Feb 19 23:01:10 crc kubenswrapper[4771]: E0219 23:01:10.043088 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a65030d-bcc9-42d3-80d9-66a100f1882d" containerName="keystone-cron" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.043106 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a65030d-bcc9-42d3-80d9-66a100f1882d" containerName="keystone-cron" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.043297 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a65030d-bcc9-42d3-80d9-66a100f1882d" containerName="keystone-cron" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.043908 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-tgc56" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.054148 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6d55-account-create-update-6w54c"] Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.055144 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6d55-account-create-update-6w54c" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.059472 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tgc56"] Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.083507 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.090780 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6d55-account-create-update-6w54c"] Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.185960 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29f1f2a3-c87c-49e6-9a87-7e622b30faac-operator-scripts\") pod \"barbican-6d55-account-create-update-6w54c\" (UID: \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\") " pod="openstack/barbican-6d55-account-create-update-6w54c" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.186040 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jglv6\" (UniqueName: \"kubernetes.io/projected/29f1f2a3-c87c-49e6-9a87-7e622b30faac-kube-api-access-jglv6\") pod \"barbican-6d55-account-create-update-6w54c\" (UID: \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\") " pod="openstack/barbican-6d55-account-create-update-6w54c" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.186162 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rlt2s\" (UniqueName: \"kubernetes.io/projected/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-kube-api-access-rlt2s\") pod \"barbican-db-create-tgc56\" (UID: \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\") " pod="openstack/barbican-db-create-tgc56" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.186353 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-operator-scripts\") pod \"barbican-db-create-tgc56\" (UID: \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\") " pod="openstack/barbican-db-create-tgc56" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.287868 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-operator-scripts\") pod \"barbican-db-create-tgc56\" (UID: \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\") " pod="openstack/barbican-db-create-tgc56" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.288727 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-operator-scripts\") pod \"barbican-db-create-tgc56\" (UID: \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\") " pod="openstack/barbican-db-create-tgc56" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.288957 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29f1f2a3-c87c-49e6-9a87-7e622b30faac-operator-scripts\") pod \"barbican-6d55-account-create-update-6w54c\" (UID: \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\") " pod="openstack/barbican-6d55-account-create-update-6w54c" Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.289679 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/29f1f2a3-c87c-49e6-9a87-7e622b30faac-operator-scripts\") pod \"barbican-6d55-account-create-update-6w54c\" (UID: \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\") " pod="openstack/barbican-6d55-account-create-update-6w54c"
Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.289745 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jglv6\" (UniqueName: \"kubernetes.io/projected/29f1f2a3-c87c-49e6-9a87-7e622b30faac-kube-api-access-jglv6\") pod \"barbican-6d55-account-create-update-6w54c\" (UID: \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\") " pod="openstack/barbican-6d55-account-create-update-6w54c"
Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.290170 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlt2s\" (UniqueName: \"kubernetes.io/projected/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-kube-api-access-rlt2s\") pod \"barbican-db-create-tgc56\" (UID: \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\") " pod="openstack/barbican-db-create-tgc56"
Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.311827 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlt2s\" (UniqueName: \"kubernetes.io/projected/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-kube-api-access-rlt2s\") pod \"barbican-db-create-tgc56\" (UID: \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\") " pod="openstack/barbican-db-create-tgc56"
Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.318184 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jglv6\" (UniqueName: \"kubernetes.io/projected/29f1f2a3-c87c-49e6-9a87-7e622b30faac-kube-api-access-jglv6\") pod \"barbican-6d55-account-create-update-6w54c\" (UID: \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\") " pod="openstack/barbican-6d55-account-create-update-6w54c"
Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.424247 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tgc56"
Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.437948 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6d55-account-create-update-6w54c"
Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.894510 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tgc56"]
Feb 19 23:01:10 crc kubenswrapper[4771]: W0219 23:01:10.898688 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c084eb3_4488_4d20_b1fc_c92a2f42dd19.slice/crio-a2a36e1e63328bc7eea641930f5d2019e20991ca400baecffbf9c529a30477b2 WatchSource:0}: Error finding container a2a36e1e63328bc7eea641930f5d2019e20991ca400baecffbf9c529a30477b2: Status 404 returned error can't find the container with id a2a36e1e63328bc7eea641930f5d2019e20991ca400baecffbf9c529a30477b2
Feb 19 23:01:10 crc kubenswrapper[4771]: I0219 23:01:10.957139 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6d55-account-create-update-6w54c"]
Feb 19 23:01:10 crc kubenswrapper[4771]: W0219 23:01:10.966842 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29f1f2a3_c87c_49e6_9a87_7e622b30faac.slice/crio-a2d2789305afc5245014385070c9ff4d50c1477425bc9037c34a2a6d9f06ef95 WatchSource:0}: Error finding container a2d2789305afc5245014385070c9ff4d50c1477425bc9037c34a2a6d9f06ef95: Status 404 returned error can't find the container with id a2d2789305afc5245014385070c9ff4d50c1477425bc9037c34a2a6d9f06ef95
Feb 19 23:01:11 crc kubenswrapper[4771]: I0219 23:01:11.038454 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d55-account-create-update-6w54c" event={"ID":"29f1f2a3-c87c-49e6-9a87-7e622b30faac","Type":"ContainerStarted","Data":"a2d2789305afc5245014385070c9ff4d50c1477425bc9037c34a2a6d9f06ef95"}
Feb 19 23:01:11 crc kubenswrapper[4771]: I0219 23:01:11.040920 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tgc56" event={"ID":"0c084eb3-4488-4d20-b1fc-c92a2f42dd19","Type":"ContainerStarted","Data":"a2a36e1e63328bc7eea641930f5d2019e20991ca400baecffbf9c529a30477b2"}
Feb 19 23:01:12 crc kubenswrapper[4771]: I0219 23:01:12.054586 4771 generic.go:334] "Generic (PLEG): container finished" podID="29f1f2a3-c87c-49e6-9a87-7e622b30faac" containerID="ab7c09220036754f543c128fdff1af89f32df97a316d04996d4ef7f0c4065202" exitCode=0
Feb 19 23:01:12 crc kubenswrapper[4771]: I0219 23:01:12.054693 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d55-account-create-update-6w54c" event={"ID":"29f1f2a3-c87c-49e6-9a87-7e622b30faac","Type":"ContainerDied","Data":"ab7c09220036754f543c128fdff1af89f32df97a316d04996d4ef7f0c4065202"}
Feb 19 23:01:12 crc kubenswrapper[4771]: I0219 23:01:12.060819 4771 generic.go:334] "Generic (PLEG): container finished" podID="0c084eb3-4488-4d20-b1fc-c92a2f42dd19" containerID="f66be0da4eff9bb9f7b585d44765ebb24d7a93f1d44e293d07a3603f265a90a4" exitCode=0
Feb 19 23:01:12 crc kubenswrapper[4771]: I0219 23:01:12.060879 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tgc56" event={"ID":"0c084eb3-4488-4d20-b1fc-c92a2f42dd19","Type":"ContainerDied","Data":"f66be0da4eff9bb9f7b585d44765ebb24d7a93f1d44e293d07a3603f265a90a4"}
Feb 19 23:01:12 crc kubenswrapper[4771]: I0219 23:01:12.437739 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41"
Feb 19 23:01:12 crc kubenswrapper[4771]: E0219 23:01:12.438264 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.464903 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6d55-account-create-update-6w54c"
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.469727 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tgc56"
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.551121 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-operator-scripts\") pod \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\" (UID: \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\") "
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.551199 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jglv6\" (UniqueName: \"kubernetes.io/projected/29f1f2a3-c87c-49e6-9a87-7e622b30faac-kube-api-access-jglv6\") pod \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\" (UID: \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\") "
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.551234 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlt2s\" (UniqueName: \"kubernetes.io/projected/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-kube-api-access-rlt2s\") pod \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\" (UID: \"0c084eb3-4488-4d20-b1fc-c92a2f42dd19\") "
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.551271 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29f1f2a3-c87c-49e6-9a87-7e622b30faac-operator-scripts\") pod \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\" (UID: \"29f1f2a3-c87c-49e6-9a87-7e622b30faac\") "
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.551768 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c084eb3-4488-4d20-b1fc-c92a2f42dd19" (UID: "0c084eb3-4488-4d20-b1fc-c92a2f42dd19"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.552057 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29f1f2a3-c87c-49e6-9a87-7e622b30faac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29f1f2a3-c87c-49e6-9a87-7e622b30faac" (UID: "29f1f2a3-c87c-49e6-9a87-7e622b30faac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.558324 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f1f2a3-c87c-49e6-9a87-7e622b30faac-kube-api-access-jglv6" (OuterVolumeSpecName: "kube-api-access-jglv6") pod "29f1f2a3-c87c-49e6-9a87-7e622b30faac" (UID: "29f1f2a3-c87c-49e6-9a87-7e622b30faac"). InnerVolumeSpecName "kube-api-access-jglv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.564294 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-kube-api-access-rlt2s" (OuterVolumeSpecName: "kube-api-access-rlt2s") pod "0c084eb3-4488-4d20-b1fc-c92a2f42dd19" (UID: "0c084eb3-4488-4d20-b1fc-c92a2f42dd19"). InnerVolumeSpecName "kube-api-access-rlt2s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.653870 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.653925 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jglv6\" (UniqueName: \"kubernetes.io/projected/29f1f2a3-c87c-49e6-9a87-7e622b30faac-kube-api-access-jglv6\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.653945 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlt2s\" (UniqueName: \"kubernetes.io/projected/0c084eb3-4488-4d20-b1fc-c92a2f42dd19-kube-api-access-rlt2s\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:13 crc kubenswrapper[4771]: I0219 23:01:13.653963 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29f1f2a3-c87c-49e6-9a87-7e622b30faac-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:14 crc kubenswrapper[4771]: I0219 23:01:14.079842 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tgc56" event={"ID":"0c084eb3-4488-4d20-b1fc-c92a2f42dd19","Type":"ContainerDied","Data":"a2a36e1e63328bc7eea641930f5d2019e20991ca400baecffbf9c529a30477b2"}
Feb 19 23:01:14 crc kubenswrapper[4771]: I0219 23:01:14.080477 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2a36e1e63328bc7eea641930f5d2019e20991ca400baecffbf9c529a30477b2"
Feb 19 23:01:14 crc kubenswrapper[4771]: I0219 23:01:14.080589 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tgc56"
Feb 19 23:01:14 crc kubenswrapper[4771]: I0219 23:01:14.084317 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6d55-account-create-update-6w54c" event={"ID":"29f1f2a3-c87c-49e6-9a87-7e622b30faac","Type":"ContainerDied","Data":"a2d2789305afc5245014385070c9ff4d50c1477425bc9037c34a2a6d9f06ef95"}
Feb 19 23:01:14 crc kubenswrapper[4771]: I0219 23:01:14.084376 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2d2789305afc5245014385070c9ff4d50c1477425bc9037c34a2a6d9f06ef95"
Feb 19 23:01:14 crc kubenswrapper[4771]: I0219 23:01:14.084497 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6d55-account-create-update-6w54c"
Feb 19 23:01:14 crc kubenswrapper[4771]: I0219 23:01:14.281427 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7djsk"
Feb 19 23:01:14 crc kubenswrapper[4771]: I0219 23:01:14.281508 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7djsk"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.314936 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-s45z6"]
Feb 19 23:01:15 crc kubenswrapper[4771]: E0219 23:01:15.315358 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c084eb3-4488-4d20-b1fc-c92a2f42dd19" containerName="mariadb-database-create"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.315374 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c084eb3-4488-4d20-b1fc-c92a2f42dd19" containerName="mariadb-database-create"
Feb 19 23:01:15 crc kubenswrapper[4771]: E0219 23:01:15.315392 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f1f2a3-c87c-49e6-9a87-7e622b30faac" containerName="mariadb-account-create-update"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.315400 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f1f2a3-c87c-49e6-9a87-7e622b30faac" containerName="mariadb-account-create-update"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.315602 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f1f2a3-c87c-49e6-9a87-7e622b30faac" containerName="mariadb-account-create-update"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.315620 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c084eb3-4488-4d20-b1fc-c92a2f42dd19" containerName="mariadb-database-create"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.316273 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.319886 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.320438 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gdvb5"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.325506 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-s45z6"]
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.358350 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7djsk" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:01:15 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:01:15 crc kubenswrapper[4771]: >
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.385805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-combined-ca-bundle\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.385867 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-db-sync-config-data\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.385952 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg8gc\" (UniqueName: \"kubernetes.io/projected/87d33bd6-8a72-405a-b3ec-14bcce254f9e-kube-api-access-xg8gc\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.488104 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg8gc\" (UniqueName: \"kubernetes.io/projected/87d33bd6-8a72-405a-b3ec-14bcce254f9e-kube-api-access-xg8gc\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.488283 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-combined-ca-bundle\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.488315 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-db-sync-config-data\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.495750 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-combined-ca-bundle\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.496454 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-db-sync-config-data\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.531743 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg8gc\" (UniqueName: \"kubernetes.io/projected/87d33bd6-8a72-405a-b3ec-14bcce254f9e-kube-api-access-xg8gc\") pod \"barbican-db-sync-s45z6\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") " pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:15 crc kubenswrapper[4771]: I0219 23:01:15.665420 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:16 crc kubenswrapper[4771]: I0219 23:01:16.180971 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-s45z6"]
Feb 19 23:01:17 crc kubenswrapper[4771]: I0219 23:01:17.146982 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-s45z6" event={"ID":"87d33bd6-8a72-405a-b3ec-14bcce254f9e","Type":"ContainerStarted","Data":"dfca0112d2c458918462e5dd50667567b8be4a8e86a92145d14e1629fcfba85c"}
Feb 19 23:01:17 crc kubenswrapper[4771]: I0219 23:01:17.147504 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-s45z6" event={"ID":"87d33bd6-8a72-405a-b3ec-14bcce254f9e","Type":"ContainerStarted","Data":"9636b6b99ceb86d6f38bd843d4a36d1b597a95395a27778447e1de45fed2ee50"}
Feb 19 23:01:17 crc kubenswrapper[4771]: I0219 23:01:17.169893 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-s45z6" podStartSLOduration=2.169869711 podStartE2EDuration="2.169869711s" podCreationTimestamp="2026-02-19 23:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:17.163895162 +0000 UTC m=+5577.435337662" watchObservedRunningTime="2026-02-19 23:01:17.169869711 +0000 UTC m=+5577.441312191"
Feb 19 23:01:20 crc kubenswrapper[4771]: I0219 23:01:20.177069 4771 generic.go:334] "Generic (PLEG): container finished" podID="87d33bd6-8a72-405a-b3ec-14bcce254f9e" containerID="dfca0112d2c458918462e5dd50667567b8be4a8e86a92145d14e1629fcfba85c" exitCode=0
Feb 19 23:01:20 crc kubenswrapper[4771]: I0219 23:01:20.177149 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-s45z6" event={"ID":"87d33bd6-8a72-405a-b3ec-14bcce254f9e","Type":"ContainerDied","Data":"dfca0112d2c458918462e5dd50667567b8be4a8e86a92145d14e1629fcfba85c"}
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.628911 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.762094 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-db-sync-config-data\") pod \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") "
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.762206 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg8gc\" (UniqueName: \"kubernetes.io/projected/87d33bd6-8a72-405a-b3ec-14bcce254f9e-kube-api-access-xg8gc\") pod \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") "
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.762276 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-combined-ca-bundle\") pod \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\" (UID: \"87d33bd6-8a72-405a-b3ec-14bcce254f9e\") "
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.768515 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "87d33bd6-8a72-405a-b3ec-14bcce254f9e" (UID: "87d33bd6-8a72-405a-b3ec-14bcce254f9e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.769522 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d33bd6-8a72-405a-b3ec-14bcce254f9e-kube-api-access-xg8gc" (OuterVolumeSpecName: "kube-api-access-xg8gc") pod "87d33bd6-8a72-405a-b3ec-14bcce254f9e" (UID: "87d33bd6-8a72-405a-b3ec-14bcce254f9e"). InnerVolumeSpecName "kube-api-access-xg8gc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.787775 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87d33bd6-8a72-405a-b3ec-14bcce254f9e" (UID: "87d33bd6-8a72-405a-b3ec-14bcce254f9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.864201 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.864238 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg8gc\" (UniqueName: \"kubernetes.io/projected/87d33bd6-8a72-405a-b3ec-14bcce254f9e-kube-api-access-xg8gc\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:21 crc kubenswrapper[4771]: I0219 23:01:21.864252 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d33bd6-8a72-405a-b3ec-14bcce254f9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.202899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-s45z6" event={"ID":"87d33bd6-8a72-405a-b3ec-14bcce254f9e","Type":"ContainerDied","Data":"9636b6b99ceb86d6f38bd843d4a36d1b597a95395a27778447e1de45fed2ee50"}
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.202949 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9636b6b99ceb86d6f38bd843d4a36d1b597a95395a27778447e1de45fed2ee50"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.203013 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-s45z6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.471442 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b4448d5cc-99fd6"]
Feb 19 23:01:22 crc kubenswrapper[4771]: E0219 23:01:22.473290 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d33bd6-8a72-405a-b3ec-14bcce254f9e" containerName="barbican-db-sync"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.473316 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d33bd6-8a72-405a-b3ec-14bcce254f9e" containerName="barbican-db-sync"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.479557 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="87d33bd6-8a72-405a-b3ec-14bcce254f9e" containerName="barbican-db-sync"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.480577 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8c545c846-c84s7"]
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.481893 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.482159 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.485081 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.485127 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.485210 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gdvb5"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.485229 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.495003 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b4448d5cc-99fd6"]
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.514656 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8c545c846-c84s7"]
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574325 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59511b4b-718b-4f8b-8781-1a1c952c0e16-logs\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574372 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ntp\" (UniqueName: \"kubernetes.io/projected/59511b4b-718b-4f8b-8781-1a1c952c0e16-kube-api-access-s4ntp\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574413 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-config-data\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574433 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-combined-ca-bundle\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574514 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-config-data-custom\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574531 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-combined-ca-bundle\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574565 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-config-data\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574581 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09286b82-6a52-4993-99ee-868ad3d84cb7-logs\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574603 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktmz5\" (UniqueName: \"kubernetes.io/projected/09286b82-6a52-4993-99ee-868ad3d84cb7-kube-api-access-ktmz5\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.574627 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-config-data-custom\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.581755 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78d466d845-vhxrn"]
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.582970 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d466d845-vhxrn"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.594497 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78d466d845-vhxrn"]
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.675770 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-config-data-custom\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.675841 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-combined-ca-bundle\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.675870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-dns-svc\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676510 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-config-data\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676532 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09286b82-6a52-4993-99ee-868ad3d84cb7-logs\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676555 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktmz5\" (UniqueName: \"kubernetes.io/projected/09286b82-6a52-4993-99ee-868ad3d84cb7-kube-api-access-ktmz5\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676577 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-config-data-custom\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59511b4b-718b-4f8b-8781-1a1c952c0e16-logs\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676630 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-config\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ntp\" (UniqueName: \"kubernetes.io/projected/59511b4b-718b-4f8b-8781-1a1c952c0e16-kube-api-access-s4ntp\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676715 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-nb\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-config-data\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676765 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-combined-ca-bundle\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.676985 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-sb\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.677048 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7wl\" (UniqueName: \"kubernetes.io/projected/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-kube-api-access-bf7wl\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.679141 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-config-data-custom\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.679370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-combined-ca-bundle\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.679560 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09286b82-6a52-4993-99ee-868ad3d84cb7-logs\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.679961 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59511b4b-718b-4f8b-8781-1a1c952c0e16-logs\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6"
Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.680758 4771 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-config-data\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.681624 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-config-data-custom\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.689767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09286b82-6a52-4993-99ee-868ad3d84cb7-config-data\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.690215 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59511b4b-718b-4f8b-8781-1a1c952c0e16-combined-ca-bundle\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.702771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktmz5\" (UniqueName: \"kubernetes.io/projected/09286b82-6a52-4993-99ee-868ad3d84cb7-kube-api-access-ktmz5\") pod \"barbican-keystone-listener-8c545c846-c84s7\" (UID: \"09286b82-6a52-4993-99ee-868ad3d84cb7\") " pod="openstack/barbican-keystone-listener-8c545c846-c84s7" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.705209 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ntp\" (UniqueName: \"kubernetes.io/projected/59511b4b-718b-4f8b-8781-1a1c952c0e16-kube-api-access-s4ntp\") pod \"barbican-worker-7b4448d5cc-99fd6\" (UID: \"59511b4b-718b-4f8b-8781-1a1c952c0e16\") " pod="openstack/barbican-worker-7b4448d5cc-99fd6" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.723319 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5765b4fd9b-7qrrs"] Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.725137 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.726978 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.739305 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5765b4fd9b-7qrrs"] Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.778006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-config\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.778059 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-nb\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.778108 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-sb\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.778155 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7wl\" (UniqueName: \"kubernetes.io/projected/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-kube-api-access-bf7wl\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.778194 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-dns-svc\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.778926 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-nb\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.778943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-dns-svc\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.778936 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-sb\") pod 
\"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.779054 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-config\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.793448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7wl\" (UniqueName: \"kubernetes.io/projected/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-kube-api-access-bf7wl\") pod \"dnsmasq-dns-78d466d845-vhxrn\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.805243 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8c545c846-c84s7" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.818731 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b4448d5cc-99fd6" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.879596 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1772b4d-283d-4758-8b7b-6fed24e80b8c-logs\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.879636 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data-custom\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.879673 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-combined-ca-bundle\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.879932 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hdq9\" (UniqueName: \"kubernetes.io/projected/b1772b4d-283d-4758-8b7b-6fed24e80b8c-kube-api-access-5hdq9\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.880216 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data\") pod 
\"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.904112 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.983939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-combined-ca-bundle\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.984006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hdq9\" (UniqueName: \"kubernetes.io/projected/b1772b4d-283d-4758-8b7b-6fed24e80b8c-kube-api-access-5hdq9\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.984091 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.984136 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1772b4d-283d-4758-8b7b-6fed24e80b8c-logs\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.984153 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data-custom\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.985941 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1772b4d-283d-4758-8b7b-6fed24e80b8c-logs\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.999013 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-combined-ca-bundle\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.999053 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data-custom\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:22 crc kubenswrapper[4771]: I0219 23:01:22.999920 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:23 crc kubenswrapper[4771]: I0219 23:01:23.003443 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hdq9\" (UniqueName: 
\"kubernetes.io/projected/b1772b4d-283d-4758-8b7b-6fed24e80b8c-kube-api-access-5hdq9\") pod \"barbican-api-5765b4fd9b-7qrrs\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:23 crc kubenswrapper[4771]: I0219 23:01:23.077301 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:23 crc kubenswrapper[4771]: I0219 23:01:23.262493 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8c545c846-c84s7"] Feb 19 23:01:23 crc kubenswrapper[4771]: W0219 23:01:23.270697 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09286b82_6a52_4993_99ee_868ad3d84cb7.slice/crio-52ca6f206b7c169d04931f3e60b380af66668e25bcf7db9b809acfdd50e8e339 WatchSource:0}: Error finding container 52ca6f206b7c169d04931f3e60b380af66668e25bcf7db9b809acfdd50e8e339: Status 404 returned error can't find the container with id 52ca6f206b7c169d04931f3e60b380af66668e25bcf7db9b809acfdd50e8e339 Feb 19 23:01:23 crc kubenswrapper[4771]: I0219 23:01:23.384157 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b4448d5cc-99fd6"] Feb 19 23:01:23 crc kubenswrapper[4771]: I0219 23:01:23.395076 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78d466d845-vhxrn"] Feb 19 23:01:23 crc kubenswrapper[4771]: I0219 23:01:23.582075 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5765b4fd9b-7qrrs"] Feb 19 23:01:23 crc kubenswrapper[4771]: W0219 23:01:23.630887 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1772b4d_283d_4758_8b7b_6fed24e80b8c.slice/crio-74bf5d869beb0db77652fafe1a4052940c4f134853b1a2407e7655e02b0c0b24 WatchSource:0}: Error finding container 
74bf5d869beb0db77652fafe1a4052940c4f134853b1a2407e7655e02b0c0b24: Status 404 returned error can't find the container with id 74bf5d869beb0db77652fafe1a4052940c4f134853b1a2407e7655e02b0c0b24 Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.219365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c545c846-c84s7" event={"ID":"09286b82-6a52-4993-99ee-868ad3d84cb7","Type":"ContainerStarted","Data":"01ab47bfebd9a866058a7275a7a330a756cd5bd7b2aa6ebe2d696aef19563ef0"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.219679 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c545c846-c84s7" event={"ID":"09286b82-6a52-4993-99ee-868ad3d84cb7","Type":"ContainerStarted","Data":"ab0747db48e1d5aacae4d55a4a50155cc54691d51cfa662d8787f83cc193a2cf"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.219708 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8c545c846-c84s7" event={"ID":"09286b82-6a52-4993-99ee-868ad3d84cb7","Type":"ContainerStarted","Data":"52ca6f206b7c169d04931f3e60b380af66668e25bcf7db9b809acfdd50e8e339"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.222066 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" containerID="01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068" exitCode=0 Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.222157 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" event={"ID":"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd","Type":"ContainerDied","Data":"01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.222430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" 
event={"ID":"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd","Type":"ContainerStarted","Data":"00a28616791d31f0285deed69c696acba090690a22478adc27d376baf1b49979"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.242008 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b4448d5cc-99fd6" event={"ID":"59511b4b-718b-4f8b-8781-1a1c952c0e16","Type":"ContainerStarted","Data":"c0069a3b3ed4180c6a9508a5f20011886cc2d4a2eb40c34b588a549e6c94795a"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.242129 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b4448d5cc-99fd6" event={"ID":"59511b4b-718b-4f8b-8781-1a1c952c0e16","Type":"ContainerStarted","Data":"05a33f95e7a1193cc8fe665c5ef95e1e88615ee5cb61ec57b0091113167e38f4"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.242143 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b4448d5cc-99fd6" event={"ID":"59511b4b-718b-4f8b-8781-1a1c952c0e16","Type":"ContainerStarted","Data":"33b8e5d5749c957fa7bf42ddacb2fd3cd78bba64fb8167d54fa33aa65ee38f4a"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.245259 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8c545c846-c84s7" podStartSLOduration=2.245240974 podStartE2EDuration="2.245240974s" podCreationTimestamp="2026-02-19 23:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:24.242933613 +0000 UTC m=+5584.514376093" watchObservedRunningTime="2026-02-19 23:01:24.245240974 +0000 UTC m=+5584.516683434" Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.248763 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5765b4fd9b-7qrrs" 
event={"ID":"b1772b4d-283d-4758-8b7b-6fed24e80b8c","Type":"ContainerStarted","Data":"5ba5571d15c3804ee666f4819817b9d9b1c59ad3fc8f6ae79033e153a6be9c42"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.248844 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5765b4fd9b-7qrrs" event={"ID":"b1772b4d-283d-4758-8b7b-6fed24e80b8c","Type":"ContainerStarted","Data":"2c89f05639536485a5c8cc66ed9dce830ddbc1f426fdcc9c6c40a252958f0e1a"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.248856 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5765b4fd9b-7qrrs" event={"ID":"b1772b4d-283d-4758-8b7b-6fed24e80b8c","Type":"ContainerStarted","Data":"74bf5d869beb0db77652fafe1a4052940c4f134853b1a2407e7655e02b0c0b24"} Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.248911 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.248926 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.343341 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5765b4fd9b-7qrrs" podStartSLOduration=2.343311351 podStartE2EDuration="2.343311351s" podCreationTimestamp="2026-02-19 23:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:24.332962746 +0000 UTC m=+5584.604405216" watchObservedRunningTime="2026-02-19 23:01:24.343311351 +0000 UTC m=+5584.614753841" Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.351601 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b4448d5cc-99fd6" podStartSLOduration=2.351575732 podStartE2EDuration="2.351575732s" podCreationTimestamp="2026-02-19 
23:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:24.308435461 +0000 UTC m=+5584.579877931" watchObservedRunningTime="2026-02-19 23:01:24.351575732 +0000 UTC m=+5584.623018212" Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.361181 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.448737 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:24 crc kubenswrapper[4771]: I0219 23:01:24.593686 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7djsk"] Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.011847 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55c9c68bd-k2tjn"] Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.013993 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.015980 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.016196 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.019614 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55c9c68bd-k2tjn"] Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.127193 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7d2\" (UniqueName: \"kubernetes.io/projected/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-kube-api-access-9q7d2\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.127238 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-internal-tls-certs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.127358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-public-tls-certs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.127475 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-logs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.127661 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-config-data\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.127689 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-config-data-custom\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.127783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-combined-ca-bundle\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.229827 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7d2\" (UniqueName: \"kubernetes.io/projected/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-kube-api-access-9q7d2\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.229870 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-internal-tls-certs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.229902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-public-tls-certs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.229939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-logs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.229993 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-config-data\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.230014 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-config-data-custom\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.230070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-combined-ca-bundle\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.230732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-logs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.236120 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-config-data\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.244771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-config-data-custom\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.246791 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-internal-tls-certs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.247936 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-combined-ca-bundle\") pod 
\"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.250670 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-public-tls-certs\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.253731 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7d2\" (UniqueName: \"kubernetes.io/projected/ef1a2dcb-a395-4516-a64b-0e25c1802d4e-kube-api-access-9q7d2\") pod \"barbican-api-55c9c68bd-k2tjn\" (UID: \"ef1a2dcb-a395-4516-a64b-0e25c1802d4e\") " pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.261523 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" event={"ID":"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd","Type":"ContainerStarted","Data":"b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3"} Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.296642 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" podStartSLOduration=3.296615863 podStartE2EDuration="3.296615863s" podCreationTimestamp="2026-02-19 23:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:25.281693985 +0000 UTC m=+5585.553136465" watchObservedRunningTime="2026-02-19 23:01:25.296615863 +0000 UTC m=+5585.568058373" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.345781 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:25 crc kubenswrapper[4771]: I0219 23:01:25.874077 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55c9c68bd-k2tjn"] Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.278955 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9c68bd-k2tjn" event={"ID":"ef1a2dcb-a395-4516-a64b-0e25c1802d4e","Type":"ContainerStarted","Data":"e30988f60e2b9dddf4d1b5db26fa497d8014ac4aa9d7e7fdb93b134bc5d76722"} Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.279298 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9c68bd-k2tjn" event={"ID":"ef1a2dcb-a395-4516-a64b-0e25c1802d4e","Type":"ContainerStarted","Data":"2c91f6adb8af826d3002528a9bf216bce5d60cd9482c493a8d754db1526148a3"} Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.279329 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.279330 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7djsk" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="registry-server" containerID="cri-o://4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9" gracePeriod=2 Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.437379 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 23:01:26 crc kubenswrapper[4771]: E0219 23:01:26.437666 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.639613 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.757124 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-utilities\") pod \"1b632835-f9cf-43b8-a530-099b7257c60c\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.757195 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgq52\" (UniqueName: \"kubernetes.io/projected/1b632835-f9cf-43b8-a530-099b7257c60c-kube-api-access-lgq52\") pod \"1b632835-f9cf-43b8-a530-099b7257c60c\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.757260 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-catalog-content\") pod \"1b632835-f9cf-43b8-a530-099b7257c60c\" (UID: \"1b632835-f9cf-43b8-a530-099b7257c60c\") " Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.758042 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-utilities" (OuterVolumeSpecName: "utilities") pod "1b632835-f9cf-43b8-a530-099b7257c60c" (UID: "1b632835-f9cf-43b8-a530-099b7257c60c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.760475 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b632835-f9cf-43b8-a530-099b7257c60c-kube-api-access-lgq52" (OuterVolumeSpecName: "kube-api-access-lgq52") pod "1b632835-f9cf-43b8-a530-099b7257c60c" (UID: "1b632835-f9cf-43b8-a530-099b7257c60c"). InnerVolumeSpecName "kube-api-access-lgq52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.858794 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.858824 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgq52\" (UniqueName: \"kubernetes.io/projected/1b632835-f9cf-43b8-a530-099b7257c60c-kube-api-access-lgq52\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.890496 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b632835-f9cf-43b8-a530-099b7257c60c" (UID: "1b632835-f9cf-43b8-a530-099b7257c60c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:01:26 crc kubenswrapper[4771]: I0219 23:01:26.960784 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b632835-f9cf-43b8-a530-099b7257c60c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.292150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9c68bd-k2tjn" event={"ID":"ef1a2dcb-a395-4516-a64b-0e25c1802d4e","Type":"ContainerStarted","Data":"82d6ab6cacbbc5c5c5b3e2dbf8a2f3f4ac3f4cc42859402086292931c5595f6f"} Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.292702 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.292731 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.296693 4771 generic.go:334] "Generic (PLEG): container finished" podID="1b632835-f9cf-43b8-a530-099b7257c60c" containerID="4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9" exitCode=0 Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.296883 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7djsk" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.296826 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7djsk" event={"ID":"1b632835-f9cf-43b8-a530-099b7257c60c","Type":"ContainerDied","Data":"4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9"} Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.296980 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7djsk" event={"ID":"1b632835-f9cf-43b8-a530-099b7257c60c","Type":"ContainerDied","Data":"d07b3a46b69447ce0641eec1db0d6dc3fc7dd9a1efc8bf7cc05104562a658e32"} Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.297015 4771 scope.go:117] "RemoveContainer" containerID="4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.323402 4771 scope.go:117] "RemoveContainer" containerID="3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.338313 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55c9c68bd-k2tjn" podStartSLOduration=3.3382415180000002 podStartE2EDuration="3.338241518s" podCreationTimestamp="2026-02-19 23:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:27.321743979 +0000 UTC m=+5587.593186509" watchObservedRunningTime="2026-02-19 23:01:27.338241518 +0000 UTC m=+5587.609684018" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.369871 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7djsk"] Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.377222 4771 scope.go:117] "RemoveContainer" containerID="8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a" Feb 
19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.387893 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7djsk"] Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.405707 4771 scope.go:117] "RemoveContainer" containerID="4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9" Feb 19 23:01:27 crc kubenswrapper[4771]: E0219 23:01:27.406310 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9\": container with ID starting with 4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9 not found: ID does not exist" containerID="4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.406461 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9"} err="failed to get container status \"4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9\": rpc error: code = NotFound desc = could not find container \"4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9\": container with ID starting with 4fbb79e161f2611cf5b0122cdcf75566346e2707860e9775e592d536452099d9 not found: ID does not exist" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.406547 4771 scope.go:117] "RemoveContainer" containerID="3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229" Feb 19 23:01:27 crc kubenswrapper[4771]: E0219 23:01:27.407177 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229\": container with ID starting with 3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229 not found: ID does not exist" 
containerID="3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.407211 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229"} err="failed to get container status \"3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229\": rpc error: code = NotFound desc = could not find container \"3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229\": container with ID starting with 3d17ddb76af8fff78bfbf23363d0dd9dcd72a5eda3d8d65ec1dd2f2d6698d229 not found: ID does not exist" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.407244 4771 scope.go:117] "RemoveContainer" containerID="8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a" Feb 19 23:01:27 crc kubenswrapper[4771]: E0219 23:01:27.407613 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a\": container with ID starting with 8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a not found: ID does not exist" containerID="8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a" Feb 19 23:01:27 crc kubenswrapper[4771]: I0219 23:01:27.407674 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a"} err="failed to get container status \"8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a\": rpc error: code = NotFound desc = could not find container \"8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a\": container with ID starting with 8a4146da2e16ee1d138ac6f7a526f2eb60138f20ea36ebff7afc7e3443254c0a not found: ID does not exist" Feb 19 23:01:28 crc kubenswrapper[4771]: I0219 23:01:28.453950 4771 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" path="/var/lib/kubelet/pods/1b632835-f9cf-43b8-a530-099b7257c60c/volumes" Feb 19 23:01:29 crc kubenswrapper[4771]: I0219 23:01:29.486669 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:30 crc kubenswrapper[4771]: I0219 23:01:30.747524 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:32 crc kubenswrapper[4771]: I0219 23:01:32.905867 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" Feb 19 23:01:32 crc kubenswrapper[4771]: I0219 23:01:32.965811 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84df6f8cd9-pdflf"] Feb 19 23:01:32 crc kubenswrapper[4771]: I0219 23:01:32.966061 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" podUID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" containerName="dnsmasq-dns" containerID="cri-o://e62e486a0e3ee56695587359c5df9f4c18bcb1b82a6f0a28fc5e639f073517fa" gracePeriod=10 Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.358748 4771 generic.go:334] "Generic (PLEG): container finished" podID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" containerID="e62e486a0e3ee56695587359c5df9f4c18bcb1b82a6f0a28fc5e639f073517fa" exitCode=0 Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.358826 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" event={"ID":"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f","Type":"ContainerDied","Data":"e62e486a0e3ee56695587359c5df9f4c18bcb1b82a6f0a28fc5e639f073517fa"} Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.590681 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.735679 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4k8j\" (UniqueName: \"kubernetes.io/projected/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-kube-api-access-f4k8j\") pod \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.735739 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-sb\") pod \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.735801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-config\") pod \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.735833 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-nb\") pod \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.735895 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-dns-svc\") pod \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\" (UID: \"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f\") " Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.740873 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-kube-api-access-f4k8j" (OuterVolumeSpecName: "kube-api-access-f4k8j") pod "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" (UID: "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f"). InnerVolumeSpecName "kube-api-access-f4k8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.775176 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" (UID: "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.779481 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-config" (OuterVolumeSpecName: "config") pod "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" (UID: "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.791084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" (UID: "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.794246 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" (UID: "aba2c23b-cb37-462b-8fc6-f4c1e8bf641f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.837906 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.837941 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4k8j\" (UniqueName: \"kubernetes.io/projected/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-kube-api-access-f4k8j\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.837955 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.837967 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:33 crc kubenswrapper[4771]: I0219 23:01:33.837976 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:34 crc kubenswrapper[4771]: I0219 23:01:34.369758 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" event={"ID":"aba2c23b-cb37-462b-8fc6-f4c1e8bf641f","Type":"ContainerDied","Data":"fc21337c6a4d9ba76fd865d419718f85a95190895dfe0d3dfea2951312ea1c70"} Feb 19 23:01:34 crc kubenswrapper[4771]: I0219 23:01:34.370155 4771 scope.go:117] "RemoveContainer" containerID="e62e486a0e3ee56695587359c5df9f4c18bcb1b82a6f0a28fc5e639f073517fa" Feb 19 23:01:34 crc kubenswrapper[4771]: I0219 23:01:34.369847 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84df6f8cd9-pdflf" Feb 19 23:01:34 crc kubenswrapper[4771]: I0219 23:01:34.428627 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84df6f8cd9-pdflf"] Feb 19 23:01:34 crc kubenswrapper[4771]: I0219 23:01:34.430262 4771 scope.go:117] "RemoveContainer" containerID="da57fc2085cceac58f5a34f860af9da9f9f436a79090eeacde5e33d545d0e8ff" Feb 19 23:01:34 crc kubenswrapper[4771]: I0219 23:01:34.480115 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84df6f8cd9-pdflf"] Feb 19 23:01:36 crc kubenswrapper[4771]: I0219 23:01:36.448622 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" path="/var/lib/kubelet/pods/aba2c23b-cb37-462b-8fc6-f4c1e8bf641f/volumes" Feb 19 23:01:36 crc kubenswrapper[4771]: I0219 23:01:36.745015 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:36 crc kubenswrapper[4771]: I0219 23:01:36.750514 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55c9c68bd-k2tjn" Feb 19 23:01:36 crc kubenswrapper[4771]: I0219 23:01:36.845305 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5765b4fd9b-7qrrs"] Feb 19 23:01:36 crc kubenswrapper[4771]: I0219 23:01:36.845570 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5765b4fd9b-7qrrs" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api-log" containerID="cri-o://2c89f05639536485a5c8cc66ed9dce830ddbc1f426fdcc9c6c40a252958f0e1a" gracePeriod=30 Feb 19 23:01:36 crc kubenswrapper[4771]: I0219 23:01:36.845683 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5765b4fd9b-7qrrs" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api" 
containerID="cri-o://5ba5571d15c3804ee666f4819817b9d9b1c59ad3fc8f6ae79033e153a6be9c42" gracePeriod=30 Feb 19 23:01:37 crc kubenswrapper[4771]: I0219 23:01:37.403657 4771 generic.go:334] "Generic (PLEG): container finished" podID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerID="2c89f05639536485a5c8cc66ed9dce830ddbc1f426fdcc9c6c40a252958f0e1a" exitCode=143 Feb 19 23:01:37 crc kubenswrapper[4771]: I0219 23:01:37.403747 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5765b4fd9b-7qrrs" event={"ID":"b1772b4d-283d-4758-8b7b-6fed24e80b8c","Type":"ContainerDied","Data":"2c89f05639536485a5c8cc66ed9dce830ddbc1f426fdcc9c6c40a252958f0e1a"} Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.006886 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5765b4fd9b-7qrrs" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.46:9311/healthcheck\": read tcp 10.217.0.2:52312->10.217.1.46:9311: read: connection reset by peer" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.006963 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5765b4fd9b-7qrrs" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.46:9311/healthcheck\": read tcp 10.217.0.2:52324->10.217.1.46:9311: read: connection reset by peer" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.444776 4771 generic.go:334] "Generic (PLEG): container finished" podID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerID="5ba5571d15c3804ee666f4819817b9d9b1c59ad3fc8f6ae79033e153a6be9c42" exitCode=0 Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.449420 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5765b4fd9b-7qrrs" 
event={"ID":"b1772b4d-283d-4758-8b7b-6fed24e80b8c","Type":"ContainerDied","Data":"5ba5571d15c3804ee666f4819817b9d9b1c59ad3fc8f6ae79033e153a6be9c42"} Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.604289 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.703207 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1772b4d-283d-4758-8b7b-6fed24e80b8c-logs\") pod \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.703313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data\") pod \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.703377 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-combined-ca-bundle\") pod \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.703473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data-custom\") pod \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.703514 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hdq9\" (UniqueName: 
\"kubernetes.io/projected/b1772b4d-283d-4758-8b7b-6fed24e80b8c-kube-api-access-5hdq9\") pod \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\" (UID: \"b1772b4d-283d-4758-8b7b-6fed24e80b8c\") " Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.705521 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1772b4d-283d-4758-8b7b-6fed24e80b8c-logs" (OuterVolumeSpecName: "logs") pod "b1772b4d-283d-4758-8b7b-6fed24e80b8c" (UID: "b1772b4d-283d-4758-8b7b-6fed24e80b8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.712073 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1772b4d-283d-4758-8b7b-6fed24e80b8c" (UID: "b1772b4d-283d-4758-8b7b-6fed24e80b8c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.729392 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1772b4d-283d-4758-8b7b-6fed24e80b8c-kube-api-access-5hdq9" (OuterVolumeSpecName: "kube-api-access-5hdq9") pod "b1772b4d-283d-4758-8b7b-6fed24e80b8c" (UID: "b1772b4d-283d-4758-8b7b-6fed24e80b8c"). InnerVolumeSpecName "kube-api-access-5hdq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.755201 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1772b4d-283d-4758-8b7b-6fed24e80b8c" (UID: "b1772b4d-283d-4758-8b7b-6fed24e80b8c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.805573 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.805616 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hdq9\" (UniqueName: \"kubernetes.io/projected/b1772b4d-283d-4758-8b7b-6fed24e80b8c-kube-api-access-5hdq9\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.805638 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1772b4d-283d-4758-8b7b-6fed24e80b8c-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.805654 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.808775 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data" (OuterVolumeSpecName: "config-data") pod "b1772b4d-283d-4758-8b7b-6fed24e80b8c" (UID: "b1772b4d-283d-4758-8b7b-6fed24e80b8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:40 crc kubenswrapper[4771]: I0219 23:01:40.907217 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1772b4d-283d-4758-8b7b-6fed24e80b8c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:41 crc kubenswrapper[4771]: I0219 23:01:41.437847 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 23:01:41 crc kubenswrapper[4771]: E0219 23:01:41.438511 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:01:41 crc kubenswrapper[4771]: I0219 23:01:41.456484 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5765b4fd9b-7qrrs" event={"ID":"b1772b4d-283d-4758-8b7b-6fed24e80b8c","Type":"ContainerDied","Data":"74bf5d869beb0db77652fafe1a4052940c4f134853b1a2407e7655e02b0c0b24"} Feb 19 23:01:41 crc kubenswrapper[4771]: I0219 23:01:41.456834 4771 scope.go:117] "RemoveContainer" containerID="5ba5571d15c3804ee666f4819817b9d9b1c59ad3fc8f6ae79033e153a6be9c42" Feb 19 23:01:41 crc kubenswrapper[4771]: I0219 23:01:41.456560 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5765b4fd9b-7qrrs" Feb 19 23:01:41 crc kubenswrapper[4771]: I0219 23:01:41.486638 4771 scope.go:117] "RemoveContainer" containerID="2c89f05639536485a5c8cc66ed9dce830ddbc1f426fdcc9c6c40a252958f0e1a" Feb 19 23:01:41 crc kubenswrapper[4771]: I0219 23:01:41.498567 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5765b4fd9b-7qrrs"] Feb 19 23:01:41 crc kubenswrapper[4771]: I0219 23:01:41.505735 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5765b4fd9b-7qrrs"] Feb 19 23:01:42 crc kubenswrapper[4771]: I0219 23:01:42.455983 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" path="/var/lib/kubelet/pods/b1772b4d-283d-4758-8b7b-6fed24e80b8c/volumes" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.268931 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-ww2tk"] Feb 19 23:01:44 crc kubenswrapper[4771]: E0219 23:01:44.269538 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="registry-server" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269551 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="registry-server" Feb 19 23:01:44 crc kubenswrapper[4771]: E0219 23:01:44.269566 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api-log" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269575 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api-log" Feb 19 23:01:44 crc kubenswrapper[4771]: E0219 23:01:44.269586 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api" Feb 19 23:01:44 
crc kubenswrapper[4771]: I0219 23:01:44.269592 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api" Feb 19 23:01:44 crc kubenswrapper[4771]: E0219 23:01:44.269599 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" containerName="dnsmasq-dns" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269605 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" containerName="dnsmasq-dns" Feb 19 23:01:44 crc kubenswrapper[4771]: E0219 23:01:44.269626 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="extract-content" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269631 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="extract-content" Feb 19 23:01:44 crc kubenswrapper[4771]: E0219 23:01:44.269641 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="extract-utilities" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269647 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="extract-utilities" Feb 19 23:01:44 crc kubenswrapper[4771]: E0219 23:01:44.269657 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" containerName="init" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269662 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" containerName="init" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269810 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api-log" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 
23:01:44.269830 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b632835-f9cf-43b8-a530-099b7257c60c" containerName="registry-server" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269838 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba2c23b-cb37-462b-8fc6-f4c1e8bf641f" containerName="dnsmasq-dns" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.269846 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1772b4d-283d-4758-8b7b-6fed24e80b8c" containerName="barbican-api" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.270412 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.288312 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ww2tk"] Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.359869 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3b4b-account-create-update-2fm94"] Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.360838 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.363183 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.372714 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3b4b-account-create-update-2fm94"] Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.387105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3337aa8-71a7-4f33-a69a-c46d3204007a-operator-scripts\") pod \"neutron-db-create-ww2tk\" (UID: \"c3337aa8-71a7-4f33-a69a-c46d3204007a\") " pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.387249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48kv\" (UniqueName: \"kubernetes.io/projected/c3337aa8-71a7-4f33-a69a-c46d3204007a-kube-api-access-q48kv\") pod \"neutron-db-create-ww2tk\" (UID: \"c3337aa8-71a7-4f33-a69a-c46d3204007a\") " pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.488558 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q48kv\" (UniqueName: \"kubernetes.io/projected/c3337aa8-71a7-4f33-a69a-c46d3204007a-kube-api-access-q48kv\") pod \"neutron-db-create-ww2tk\" (UID: \"c3337aa8-71a7-4f33-a69a-c46d3204007a\") " pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.488852 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9t9\" (UniqueName: \"kubernetes.io/projected/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-kube-api-access-jt9t9\") pod \"neutron-3b4b-account-create-update-2fm94\" (UID: 
\"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\") " pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.488936 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3337aa8-71a7-4f33-a69a-c46d3204007a-operator-scripts\") pod \"neutron-db-create-ww2tk\" (UID: \"c3337aa8-71a7-4f33-a69a-c46d3204007a\") " pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.488980 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-operator-scripts\") pod \"neutron-3b4b-account-create-update-2fm94\" (UID: \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\") " pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.489945 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3337aa8-71a7-4f33-a69a-c46d3204007a-operator-scripts\") pod \"neutron-db-create-ww2tk\" (UID: \"c3337aa8-71a7-4f33-a69a-c46d3204007a\") " pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.537333 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48kv\" (UniqueName: \"kubernetes.io/projected/c3337aa8-71a7-4f33-a69a-c46d3204007a-kube-api-access-q48kv\") pod \"neutron-db-create-ww2tk\" (UID: \"c3337aa8-71a7-4f33-a69a-c46d3204007a\") " pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.586159 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.590440 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-operator-scripts\") pod \"neutron-3b4b-account-create-update-2fm94\" (UID: \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\") " pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.590537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9t9\" (UniqueName: \"kubernetes.io/projected/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-kube-api-access-jt9t9\") pod \"neutron-3b4b-account-create-update-2fm94\" (UID: \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\") " pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.591430 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-operator-scripts\") pod \"neutron-3b4b-account-create-update-2fm94\" (UID: \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\") " pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.609743 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt9t9\" (UniqueName: \"kubernetes.io/projected/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-kube-api-access-jt9t9\") pod \"neutron-3b4b-account-create-update-2fm94\" (UID: \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\") " pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:44 crc kubenswrapper[4771]: I0219 23:01:44.679386 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:45 crc kubenswrapper[4771]: I0219 23:01:45.110950 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ww2tk"] Feb 19 23:01:45 crc kubenswrapper[4771]: I0219 23:01:45.190343 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3b4b-account-create-update-2fm94"] Feb 19 23:01:45 crc kubenswrapper[4771]: W0219 23:01:45.198066 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a51bd2d_0856_4bbb_b4d5_f9915c1e6cbf.slice/crio-b7c8dece087d4e8013f61762bfeecb4e6156d1c939b6c8ad030725d5cca3cf72 WatchSource:0}: Error finding container b7c8dece087d4e8013f61762bfeecb4e6156d1c939b6c8ad030725d5cca3cf72: Status 404 returned error can't find the container with id b7c8dece087d4e8013f61762bfeecb4e6156d1c939b6c8ad030725d5cca3cf72 Feb 19 23:01:45 crc kubenswrapper[4771]: I0219 23:01:45.502670 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ww2tk" event={"ID":"c3337aa8-71a7-4f33-a69a-c46d3204007a","Type":"ContainerStarted","Data":"5d7cdd7a93d149e393813df4dbcc8ff10ecd9b2dde28e26d870319c355a60911"} Feb 19 23:01:45 crc kubenswrapper[4771]: I0219 23:01:45.503054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ww2tk" event={"ID":"c3337aa8-71a7-4f33-a69a-c46d3204007a","Type":"ContainerStarted","Data":"0ae6d3ce62c3df6b9251f7dee486f44b0813ea2a8cc6cc824958a1fa425c22fb"} Feb 19 23:01:45 crc kubenswrapper[4771]: I0219 23:01:45.504291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3b4b-account-create-update-2fm94" event={"ID":"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf","Type":"ContainerStarted","Data":"0e3f909bb24d5c1718653a43e2504ffcec274d377440092ff761f9985d7387c4"} Feb 19 23:01:45 crc kubenswrapper[4771]: I0219 23:01:45.504344 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-3b4b-account-create-update-2fm94" event={"ID":"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf","Type":"ContainerStarted","Data":"b7c8dece087d4e8013f61762bfeecb4e6156d1c939b6c8ad030725d5cca3cf72"} Feb 19 23:01:45 crc kubenswrapper[4771]: I0219 23:01:45.533105 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-ww2tk" podStartSLOduration=1.533077741 podStartE2EDuration="1.533077741s" podCreationTimestamp="2026-02-19 23:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:45.528942051 +0000 UTC m=+5605.800384521" watchObservedRunningTime="2026-02-19 23:01:45.533077741 +0000 UTC m=+5605.804520241" Feb 19 23:01:45 crc kubenswrapper[4771]: I0219 23:01:45.553574 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-3b4b-account-create-update-2fm94" podStartSLOduration=1.5535595180000001 podStartE2EDuration="1.553559518s" podCreationTimestamp="2026-02-19 23:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:45.549993763 +0000 UTC m=+5605.821436233" watchObservedRunningTime="2026-02-19 23:01:45.553559518 +0000 UTC m=+5605.825001988" Feb 19 23:01:46 crc kubenswrapper[4771]: I0219 23:01:46.517232 4771 generic.go:334] "Generic (PLEG): container finished" podID="4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf" containerID="0e3f909bb24d5c1718653a43e2504ffcec274d377440092ff761f9985d7387c4" exitCode=0 Feb 19 23:01:46 crc kubenswrapper[4771]: I0219 23:01:46.517354 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3b4b-account-create-update-2fm94" event={"ID":"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf","Type":"ContainerDied","Data":"0e3f909bb24d5c1718653a43e2504ffcec274d377440092ff761f9985d7387c4"} Feb 19 23:01:46 crc 
kubenswrapper[4771]: I0219 23:01:46.519534 4771 generic.go:334] "Generic (PLEG): container finished" podID="c3337aa8-71a7-4f33-a69a-c46d3204007a" containerID="5d7cdd7a93d149e393813df4dbcc8ff10ecd9b2dde28e26d870319c355a60911" exitCode=0 Feb 19 23:01:46 crc kubenswrapper[4771]: I0219 23:01:46.519576 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ww2tk" event={"ID":"c3337aa8-71a7-4f33-a69a-c46d3204007a","Type":"ContainerDied","Data":"5d7cdd7a93d149e393813df4dbcc8ff10ecd9b2dde28e26d870319c355a60911"} Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.046836 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.073686 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.164296 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-operator-scripts\") pod \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\" (UID: \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\") " Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.164640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3337aa8-71a7-4f33-a69a-c46d3204007a-operator-scripts\") pod \"c3337aa8-71a7-4f33-a69a-c46d3204007a\" (UID: \"c3337aa8-71a7-4f33-a69a-c46d3204007a\") " Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.164804 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q48kv\" (UniqueName: \"kubernetes.io/projected/c3337aa8-71a7-4f33-a69a-c46d3204007a-kube-api-access-q48kv\") pod \"c3337aa8-71a7-4f33-a69a-c46d3204007a\" (UID: 
\"c3337aa8-71a7-4f33-a69a-c46d3204007a\") " Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.164831 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf" (UID: "4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.165540 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt9t9\" (UniqueName: \"kubernetes.io/projected/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-kube-api-access-jt9t9\") pod \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\" (UID: \"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf\") " Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.165992 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3337aa8-71a7-4f33-a69a-c46d3204007a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3337aa8-71a7-4f33-a69a-c46d3204007a" (UID: "c3337aa8-71a7-4f33-a69a-c46d3204007a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.167375 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.167777 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3337aa8-71a7-4f33-a69a-c46d3204007a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.171769 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-kube-api-access-jt9t9" (OuterVolumeSpecName: "kube-api-access-jt9t9") pod "4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf" (UID: "4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf"). InnerVolumeSpecName "kube-api-access-jt9t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.172130 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3337aa8-71a7-4f33-a69a-c46d3204007a-kube-api-access-q48kv" (OuterVolumeSpecName: "kube-api-access-q48kv") pod "c3337aa8-71a7-4f33-a69a-c46d3204007a" (UID: "c3337aa8-71a7-4f33-a69a-c46d3204007a"). InnerVolumeSpecName "kube-api-access-q48kv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.270079 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt9t9\" (UniqueName: \"kubernetes.io/projected/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf-kube-api-access-jt9t9\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.270132 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q48kv\" (UniqueName: \"kubernetes.io/projected/c3337aa8-71a7-4f33-a69a-c46d3204007a-kube-api-access-q48kv\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.558790 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3b4b-account-create-update-2fm94" event={"ID":"4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf","Type":"ContainerDied","Data":"b7c8dece087d4e8013f61762bfeecb4e6156d1c939b6c8ad030725d5cca3cf72"} Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.558857 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c8dece087d4e8013f61762bfeecb4e6156d1c939b6c8ad030725d5cca3cf72" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.558934 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3b4b-account-create-update-2fm94" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.564660 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ww2tk" event={"ID":"c3337aa8-71a7-4f33-a69a-c46d3204007a","Type":"ContainerDied","Data":"0ae6d3ce62c3df6b9251f7dee486f44b0813ea2a8cc6cc824958a1fa425c22fb"} Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.564722 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae6d3ce62c3df6b9251f7dee486f44b0813ea2a8cc6cc824958a1fa425c22fb" Feb 19 23:01:48 crc kubenswrapper[4771]: I0219 23:01:48.564805 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ww2tk" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.658575 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hmnhd"] Feb 19 23:01:49 crc kubenswrapper[4771]: E0219 23:01:49.659113 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3337aa8-71a7-4f33-a69a-c46d3204007a" containerName="mariadb-database-create" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.659130 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3337aa8-71a7-4f33-a69a-c46d3204007a" containerName="mariadb-database-create" Feb 19 23:01:49 crc kubenswrapper[4771]: E0219 23:01:49.659147 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf" containerName="mariadb-account-create-update" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.659154 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf" containerName="mariadb-account-create-update" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.659343 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3337aa8-71a7-4f33-a69a-c46d3204007a" containerName="mariadb-database-create" Feb 
19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.659356 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf" containerName="mariadb-account-create-update" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.659887 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.661708 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2b42l" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.662209 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.662779 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.672338 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hmnhd"] Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.799994 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-config\") pod \"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.800054 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-combined-ca-bundle\") pod \"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.800150 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hgn6r\" (UniqueName: \"kubernetes.io/projected/43635992-5285-44e4-802b-43eaa8310d68-kube-api-access-hgn6r\") pod \"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.902482 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgn6r\" (UniqueName: \"kubernetes.io/projected/43635992-5285-44e4-802b-43eaa8310d68-kube-api-access-hgn6r\") pod \"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.902646 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-config\") pod \"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.902688 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-combined-ca-bundle\") pod \"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.911429 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-config\") pod \"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.913753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-combined-ca-bundle\") pod 
\"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.925187 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgn6r\" (UniqueName: \"kubernetes.io/projected/43635992-5285-44e4-802b-43eaa8310d68-kube-api-access-hgn6r\") pod \"neutron-db-sync-hmnhd\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:49 crc kubenswrapper[4771]: I0219 23:01:49.975581 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:50 crc kubenswrapper[4771]: I0219 23:01:50.522477 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hmnhd"] Feb 19 23:01:50 crc kubenswrapper[4771]: W0219 23:01:50.532711 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43635992_5285_44e4_802b_43eaa8310d68.slice/crio-5b4ad3d7324fcde01cdc2b5726527603de923ddad95e38414b90a620a68b40a3 WatchSource:0}: Error finding container 5b4ad3d7324fcde01cdc2b5726527603de923ddad95e38414b90a620a68b40a3: Status 404 returned error can't find the container with id 5b4ad3d7324fcde01cdc2b5726527603de923ddad95e38414b90a620a68b40a3 Feb 19 23:01:50 crc kubenswrapper[4771]: I0219 23:01:50.584101 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hmnhd" event={"ID":"43635992-5285-44e4-802b-43eaa8310d68","Type":"ContainerStarted","Data":"5b4ad3d7324fcde01cdc2b5726527603de923ddad95e38414b90a620a68b40a3"} Feb 19 23:01:51 crc kubenswrapper[4771]: I0219 23:01:51.597135 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hmnhd" event={"ID":"43635992-5285-44e4-802b-43eaa8310d68","Type":"ContainerStarted","Data":"5ed32cd77ab23cb02780e49be7657b442d9bf822d05d7d1a501ba6a7b0c888ad"} Feb 19 
23:01:54 crc kubenswrapper[4771]: I0219 23:01:54.627171 4771 generic.go:334] "Generic (PLEG): container finished" podID="43635992-5285-44e4-802b-43eaa8310d68" containerID="5ed32cd77ab23cb02780e49be7657b442d9bf822d05d7d1a501ba6a7b0c888ad" exitCode=0 Feb 19 23:01:54 crc kubenswrapper[4771]: I0219 23:01:54.627541 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hmnhd" event={"ID":"43635992-5285-44e4-802b-43eaa8310d68","Type":"ContainerDied","Data":"5ed32cd77ab23cb02780e49be7657b442d9bf822d05d7d1a501ba6a7b0c888ad"} Feb 19 23:01:55 crc kubenswrapper[4771]: I0219 23:01:55.438255 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.057697 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hmnhd" Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.123308 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgn6r\" (UniqueName: \"kubernetes.io/projected/43635992-5285-44e4-802b-43eaa8310d68-kube-api-access-hgn6r\") pod \"43635992-5285-44e4-802b-43eaa8310d68\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.123348 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-combined-ca-bundle\") pod \"43635992-5285-44e4-802b-43eaa8310d68\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.123471 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-config\") pod \"43635992-5285-44e4-802b-43eaa8310d68\" (UID: \"43635992-5285-44e4-802b-43eaa8310d68\") " Feb 19 23:01:56 crc 
kubenswrapper[4771]: I0219 23:01:56.137358 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43635992-5285-44e4-802b-43eaa8310d68-kube-api-access-hgn6r" (OuterVolumeSpecName: "kube-api-access-hgn6r") pod "43635992-5285-44e4-802b-43eaa8310d68" (UID: "43635992-5285-44e4-802b-43eaa8310d68"). InnerVolumeSpecName "kube-api-access-hgn6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.157185 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-config" (OuterVolumeSpecName: "config") pod "43635992-5285-44e4-802b-43eaa8310d68" (UID: "43635992-5285-44e4-802b-43eaa8310d68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.180686 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43635992-5285-44e4-802b-43eaa8310d68" (UID: "43635992-5285-44e4-802b-43eaa8310d68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.225387 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-config\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.225409 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgn6r\" (UniqueName: \"kubernetes.io/projected/43635992-5285-44e4-802b-43eaa8310d68-kube-api-access-hgn6r\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.225420 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43635992-5285-44e4-802b-43eaa8310d68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.651594 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hmnhd"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.651619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hmnhd" event={"ID":"43635992-5285-44e4-802b-43eaa8310d68","Type":"ContainerDied","Data":"5b4ad3d7324fcde01cdc2b5726527603de923ddad95e38414b90a620a68b40a3"}
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.652221 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4ad3d7324fcde01cdc2b5726527603de923ddad95e38414b90a620a68b40a3"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.656390 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"c2b2a4906ade6d26077f231564fac49a46292a6f1358314d170979c9e439e6b3"}
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.818477 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568bf9c765-ks2m8"]
Feb 19 23:01:56 crc kubenswrapper[4771]: E0219 23:01:56.818803 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43635992-5285-44e4-802b-43eaa8310d68" containerName="neutron-db-sync"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.818817 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="43635992-5285-44e4-802b-43eaa8310d68" containerName="neutron-db-sync"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.818960 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="43635992-5285-44e4-802b-43eaa8310d68" containerName="neutron-db-sync"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.819731 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.868215 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568bf9c765-ks2m8"]
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.919144 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84546b7c56-srr9l"]
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.920549 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.923276 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.923419 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2b42l"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.923525 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.923625 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.924685 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84546b7c56-srr9l"]
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.945726 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74qhk\" (UniqueName: \"kubernetes.io/projected/e927281e-63c9-4de2-a111-12d3c975aac5-kube-api-access-74qhk\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.945767 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-dns-svc\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.945838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-config\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.945895 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-nb\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:56 crc kubenswrapper[4771]: I0219 23:01:56.946031 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-sb\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.047767 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g952j\" (UniqueName: \"kubernetes.io/projected/69ff9d36-e15e-412f-a04f-a16180e6361f-kube-api-access-g952j\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.047821 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-httpd-config\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.047848 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-ovndb-tls-certs\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.047880 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-config\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.047906 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74qhk\" (UniqueName: \"kubernetes.io/projected/e927281e-63c9-4de2-a111-12d3c975aac5-kube-api-access-74qhk\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.048088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-dns-svc\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.048136 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-config\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.048166 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-nb\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.048258 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-combined-ca-bundle\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.048328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-sb\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.048924 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-config\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.048921 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-dns-svc\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.049360 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-sb\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.049625 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-nb\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.069760 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74qhk\" (UniqueName: \"kubernetes.io/projected/e927281e-63c9-4de2-a111-12d3c975aac5-kube-api-access-74qhk\") pod \"dnsmasq-dns-568bf9c765-ks2m8\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.138975 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.149270 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-httpd-config\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.149305 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-ovndb-tls-certs\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.149344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-config\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.149382 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-combined-ca-bundle\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.149457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g952j\" (UniqueName: \"kubernetes.io/projected/69ff9d36-e15e-412f-a04f-a16180e6361f-kube-api-access-g952j\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.152841 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-ovndb-tls-certs\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.153319 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-httpd-config\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.153702 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-config\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.156590 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-combined-ca-bundle\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.185109 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g952j\" (UniqueName: \"kubernetes.io/projected/69ff9d36-e15e-412f-a04f-a16180e6361f-kube-api-access-g952j\") pod \"neutron-84546b7c56-srr9l\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.278929 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.651979 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84546b7c56-srr9l"]
Feb 19 23:01:57 crc kubenswrapper[4771]: I0219 23:01:57.675939 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568bf9c765-ks2m8"]
Feb 19 23:01:57 crc kubenswrapper[4771]: W0219 23:01:57.676312 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode927281e_63c9_4de2_a111_12d3c975aac5.slice/crio-d15e0b63996236c5a0806d86667e3b62729ca717cfb0da174049833b64578cef WatchSource:0}: Error finding container d15e0b63996236c5a0806d86667e3b62729ca717cfb0da174049833b64578cef: Status 404 returned error can't find the container with id d15e0b63996236c5a0806d86667e3b62729ca717cfb0da174049833b64578cef
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.683404 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84546b7c56-srr9l" event={"ID":"69ff9d36-e15e-412f-a04f-a16180e6361f","Type":"ContainerStarted","Data":"90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22"}
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.683959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84546b7c56-srr9l" event={"ID":"69ff9d36-e15e-412f-a04f-a16180e6361f","Type":"ContainerStarted","Data":"bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce"}
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.683968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84546b7c56-srr9l" event={"ID":"69ff9d36-e15e-412f-a04f-a16180e6361f","Type":"ContainerStarted","Data":"9f29c45059ac00ff03397ed7203a29a09d19115afc3dda1f00277d0c5bf5b53c"}
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.683981 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84546b7c56-srr9l"
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.684856 4771 generic.go:334] "Generic (PLEG): container finished" podID="e927281e-63c9-4de2-a111-12d3c975aac5" containerID="28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292" exitCode=0
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.684884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" event={"ID":"e927281e-63c9-4de2-a111-12d3c975aac5","Type":"ContainerDied","Data":"28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292"}
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.684912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" event={"ID":"e927281e-63c9-4de2-a111-12d3c975aac5","Type":"ContainerStarted","Data":"d15e0b63996236c5a0806d86667e3b62729ca717cfb0da174049833b64578cef"}
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.714429 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84546b7c56-srr9l" podStartSLOduration=2.714407627 podStartE2EDuration="2.714407627s" podCreationTimestamp="2026-02-19 23:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:58.700607699 +0000 UTC m=+5618.972050179" watchObservedRunningTime="2026-02-19 23:01:58.714407627 +0000 UTC m=+5618.985850277"
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.888426 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-686cc5555f-67pr9"]
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.890157 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.892975 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.897783 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 19 23:01:58 crc kubenswrapper[4771]: I0219 23:01:58.922863 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-686cc5555f-67pr9"]
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.010535 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hqm\" (UniqueName: \"kubernetes.io/projected/061e4898-95bd-4626-9869-811dd8055f42-kube-api-access-l5hqm\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.010805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-public-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.010906 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-httpd-config\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.010981 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-internal-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.011000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-combined-ca-bundle\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.011105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-config\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.011187 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-ovndb-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.113098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-httpd-config\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.113150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-internal-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.113174 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-combined-ca-bundle\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.113189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-config\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.113211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-ovndb-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.113237 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5hqm\" (UniqueName: \"kubernetes.io/projected/061e4898-95bd-4626-9869-811dd8055f42-kube-api-access-l5hqm\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.113270 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-public-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.118460 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-public-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.121370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-httpd-config\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.123786 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-config\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.124272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-combined-ca-bundle\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.125953 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-internal-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.128863 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/061e4898-95bd-4626-9869-811dd8055f42-ovndb-tls-certs\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.136203 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5hqm\" (UniqueName: \"kubernetes.io/projected/061e4898-95bd-4626-9869-811dd8055f42-kube-api-access-l5hqm\") pod \"neutron-686cc5555f-67pr9\" (UID: \"061e4898-95bd-4626-9869-811dd8055f42\") " pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.205130 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.705619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" event={"ID":"e927281e-63c9-4de2-a111-12d3c975aac5","Type":"ContainerStarted","Data":"f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02"}
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.706044 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.730953 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-686cc5555f-67pr9"]
Feb 19 23:01:59 crc kubenswrapper[4771]: I0219 23:01:59.732178 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" podStartSLOduration=3.732161658 podStartE2EDuration="3.732161658s" podCreationTimestamp="2026-02-19 23:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:59.728532832 +0000 UTC m=+5619.999975312" watchObservedRunningTime="2026-02-19 23:01:59.732161658 +0000 UTC m=+5620.003604128"
Feb 19 23:02:00 crc kubenswrapper[4771]: I0219 23:02:00.713205 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686cc5555f-67pr9" event={"ID":"061e4898-95bd-4626-9869-811dd8055f42","Type":"ContainerStarted","Data":"5d9a0ae283403b480c21e5a093e87a5ab7d9cf2e9955dc2dbf43e2c87cabbdc1"}
Feb 19 23:02:00 crc kubenswrapper[4771]: I0219 23:02:00.713678 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686cc5555f-67pr9" event={"ID":"061e4898-95bd-4626-9869-811dd8055f42","Type":"ContainerStarted","Data":"3cd3290759854a344d0033b28f5c72d80aebfca7412f79ddc55eb4a4c4b9cebc"}
Feb 19 23:02:00 crc kubenswrapper[4771]: I0219 23:02:00.713689 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686cc5555f-67pr9" event={"ID":"061e4898-95bd-4626-9869-811dd8055f42","Type":"ContainerStarted","Data":"1fb9eb5de8ad58d475f57a3ae90e08a5c71a9f4df7a0dd38f43d3df4c495f086"}
Feb 19 23:02:00 crc kubenswrapper[4771]: I0219 23:02:00.713746 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-686cc5555f-67pr9"
Feb 19 23:02:00 crc kubenswrapper[4771]: I0219 23:02:00.739077 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-686cc5555f-67pr9" podStartSLOduration=2.73906127 podStartE2EDuration="2.73906127s" podCreationTimestamp="2026-02-19 23:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:00.738589817 +0000 UTC m=+5621.010032287" watchObservedRunningTime="2026-02-19 23:02:00.73906127 +0000 UTC m=+5621.010503740"
Feb 19 23:02:01 crc kubenswrapper[4771]: I0219 23:02:01.061076 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ltm4c"]
Feb 19 23:02:01 crc kubenswrapper[4771]: I0219 23:02:01.067878 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ltm4c"]
Feb 19 23:02:02 crc kubenswrapper[4771]: I0219 23:02:02.455632 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bacb27-3fc0-4e14-960f-ca5058407555" path="/var/lib/kubelet/pods/d0bacb27-3fc0-4e14-960f-ca5058407555/volumes"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.141260 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.238182 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78d466d845-vhxrn"]
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.238638 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" podUID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" containerName="dnsmasq-dns" containerID="cri-o://b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3" gracePeriod=10
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.724647 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d466d845-vhxrn"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.783573 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" containerID="b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3" exitCode=0
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.783622 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" event={"ID":"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd","Type":"ContainerDied","Data":"b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3"}
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.783648 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d466d845-vhxrn" event={"ID":"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd","Type":"ContainerDied","Data":"00a28616791d31f0285deed69c696acba090690a22478adc27d376baf1b49979"}
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.783667 4771 scope.go:117] "RemoveContainer" containerID="b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.783797 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d466d845-vhxrn"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.803386 4771 scope.go:117] "RemoveContainer" containerID="01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.821844 4771 scope.go:117] "RemoveContainer" containerID="b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3"
Feb 19 23:02:07 crc kubenswrapper[4771]: E0219 23:02:07.822192 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3\": container with ID starting with b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3 not found: ID does not exist" containerID="b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.822224 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3"} err="failed to get container status \"b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3\": rpc error: code = NotFound desc = could not find container \"b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3\": container with ID starting with b4f76714d34ec743cebfe1cb1aafb4436d94102f162ec5601e4ce8228246b6d3 not found: ID does not exist"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.822246 4771 scope.go:117] "RemoveContainer" containerID="01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068"
Feb 19 23:02:07 crc kubenswrapper[4771]: E0219 23:02:07.822559 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068\": container with ID starting with 01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068 not found: ID does not exist" containerID="01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.822600 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068"} err="failed to get container status \"01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068\": rpc error: code = NotFound desc = could not find container \"01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068\": container with ID starting with 01d22a2e851b0bb5fd294c1b090ea1258558f2a8d666ce6a82629fab6d466068 not found: ID does not exist"
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.921653 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-sb\") pod \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") "
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.921714 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf7wl\" (UniqueName: \"kubernetes.io/projected/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-kube-api-access-bf7wl\") pod \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") "
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.921744 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-nb\") pod \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") "
Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.921785 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\"
(UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-config\") pod \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.921860 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-dns-svc\") pod \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\" (UID: \"2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd\") " Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.927307 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-kube-api-access-bf7wl" (OuterVolumeSpecName: "kube-api-access-bf7wl") pod "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" (UID: "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd"). InnerVolumeSpecName "kube-api-access-bf7wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.977698 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" (UID: "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.977877 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" (UID: "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.978118 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-config" (OuterVolumeSpecName: "config") pod "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" (UID: "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:07 crc kubenswrapper[4771]: I0219 23:02:07.979740 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" (UID: "2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:08 crc kubenswrapper[4771]: I0219 23:02:08.023913 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:08 crc kubenswrapper[4771]: I0219 23:02:08.023954 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:08 crc kubenswrapper[4771]: I0219 23:02:08.023973 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf7wl\" (UniqueName: \"kubernetes.io/projected/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-kube-api-access-bf7wl\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:08 crc kubenswrapper[4771]: I0219 23:02:08.023984 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:08 crc 
kubenswrapper[4771]: I0219 23:02:08.023996 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:08 crc kubenswrapper[4771]: I0219 23:02:08.121220 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78d466d845-vhxrn"] Feb 19 23:02:08 crc kubenswrapper[4771]: I0219 23:02:08.153374 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78d466d845-vhxrn"] Feb 19 23:02:08 crc kubenswrapper[4771]: I0219 23:02:08.453168 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" path="/var/lib/kubelet/pods/2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd/volumes" Feb 19 23:02:27 crc kubenswrapper[4771]: I0219 23:02:27.292355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84546b7c56-srr9l" Feb 19 23:02:29 crc kubenswrapper[4771]: I0219 23:02:29.236626 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-686cc5555f-67pr9" Feb 19 23:02:29 crc kubenswrapper[4771]: I0219 23:02:29.319955 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84546b7c56-srr9l"] Feb 19 23:02:29 crc kubenswrapper[4771]: I0219 23:02:29.320656 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84546b7c56-srr9l" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerName="neutron-api" containerID="cri-o://bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce" gracePeriod=30 Feb 19 23:02:29 crc kubenswrapper[4771]: I0219 23:02:29.321354 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84546b7c56-srr9l" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerName="neutron-httpd" 
containerID="cri-o://90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22" gracePeriod=30 Feb 19 23:02:30 crc kubenswrapper[4771]: I0219 23:02:30.057621 4771 generic.go:334] "Generic (PLEG): container finished" podID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerID="90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22" exitCode=0 Feb 19 23:02:30 crc kubenswrapper[4771]: I0219 23:02:30.057713 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84546b7c56-srr9l" event={"ID":"69ff9d36-e15e-412f-a04f-a16180e6361f","Type":"ContainerDied","Data":"90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22"} Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.042683 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84546b7c56-srr9l" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.094909 4771 generic.go:334] "Generic (PLEG): container finished" podID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerID="bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce" exitCode=0 Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.094947 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84546b7c56-srr9l" event={"ID":"69ff9d36-e15e-412f-a04f-a16180e6361f","Type":"ContainerDied","Data":"bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce"} Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.094972 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84546b7c56-srr9l" event={"ID":"69ff9d36-e15e-412f-a04f-a16180e6361f","Type":"ContainerDied","Data":"9f29c45059ac00ff03397ed7203a29a09d19115afc3dda1f00277d0c5bf5b53c"} Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.094990 4771 scope.go:117] "RemoveContainer" containerID="90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.095126 4771 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84546b7c56-srr9l" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.123369 4771 scope.go:117] "RemoveContainer" containerID="bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.160704 4771 scope.go:117] "RemoveContainer" containerID="90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22" Feb 19 23:02:33 crc kubenswrapper[4771]: E0219 23:02:33.161948 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22\": container with ID starting with 90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22 not found: ID does not exist" containerID="90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.162040 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22"} err="failed to get container status \"90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22\": rpc error: code = NotFound desc = could not find container \"90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22\": container with ID starting with 90557df946c9075eda85bf4340747ee3b4e78e75503dd6e690ea0739e8fe5d22 not found: ID does not exist" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.162089 4771 scope.go:117] "RemoveContainer" containerID="bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce" Feb 19 23:02:33 crc kubenswrapper[4771]: E0219 23:02:33.162561 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce\": container with ID starting with 
bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce not found: ID does not exist" containerID="bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.162606 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce"} err="failed to get container status \"bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce\": rpc error: code = NotFound desc = could not find container \"bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce\": container with ID starting with bde0b5154ec64b3440d03813d4318e0dcdbcd46bd479b04666efc07dede7c6ce not found: ID does not exist" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.216835 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-httpd-config\") pod \"69ff9d36-e15e-412f-a04f-a16180e6361f\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.217163 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-config\") pod \"69ff9d36-e15e-412f-a04f-a16180e6361f\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.217470 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-combined-ca-bundle\") pod \"69ff9d36-e15e-412f-a04f-a16180e6361f\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.217594 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g952j\" (UniqueName: 
\"kubernetes.io/projected/69ff9d36-e15e-412f-a04f-a16180e6361f-kube-api-access-g952j\") pod \"69ff9d36-e15e-412f-a04f-a16180e6361f\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.217616 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-ovndb-tls-certs\") pod \"69ff9d36-e15e-412f-a04f-a16180e6361f\" (UID: \"69ff9d36-e15e-412f-a04f-a16180e6361f\") " Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.222814 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "69ff9d36-e15e-412f-a04f-a16180e6361f" (UID: "69ff9d36-e15e-412f-a04f-a16180e6361f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.223075 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ff9d36-e15e-412f-a04f-a16180e6361f-kube-api-access-g952j" (OuterVolumeSpecName: "kube-api-access-g952j") pod "69ff9d36-e15e-412f-a04f-a16180e6361f" (UID: "69ff9d36-e15e-412f-a04f-a16180e6361f"). InnerVolumeSpecName "kube-api-access-g952j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.272572 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-config" (OuterVolumeSpecName: "config") pod "69ff9d36-e15e-412f-a04f-a16180e6361f" (UID: "69ff9d36-e15e-412f-a04f-a16180e6361f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.278900 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69ff9d36-e15e-412f-a04f-a16180e6361f" (UID: "69ff9d36-e15e-412f-a04f-a16180e6361f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.313266 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "69ff9d36-e15e-412f-a04f-a16180e6361f" (UID: "69ff9d36-e15e-412f-a04f-a16180e6361f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.320216 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.320252 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g952j\" (UniqueName: \"kubernetes.io/projected/69ff9d36-e15e-412f-a04f-a16180e6361f-kube-api-access-g952j\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.320265 4771 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.320301 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-httpd-config\") on node 
\"crc\" DevicePath \"\"" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.320314 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/69ff9d36-e15e-412f-a04f-a16180e6361f-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.422705 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84546b7c56-srr9l"] Feb 19 23:02:33 crc kubenswrapper[4771]: I0219 23:02:33.433145 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84546b7c56-srr9l"] Feb 19 23:02:34 crc kubenswrapper[4771]: I0219 23:02:34.463000 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" path="/var/lib/kubelet/pods/69ff9d36-e15e-412f-a04f-a16180e6361f/volumes" Feb 19 23:02:35 crc kubenswrapper[4771]: I0219 23:02:35.347544 4771 scope.go:117] "RemoveContainer" containerID="047d9d94cd4852842bbcc5eb0558e71cc4f58600037cf5514d944ecee4f134d1" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.884471 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-x6j2q"] Feb 19 23:02:38 crc kubenswrapper[4771]: E0219 23:02:38.885509 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" containerName="init" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.885524 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" containerName="init" Feb 19 23:02:38 crc kubenswrapper[4771]: E0219 23:02:38.885555 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerName="neutron-httpd" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.885563 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerName="neutron-httpd" Feb 19 23:02:38 crc kubenswrapper[4771]: E0219 
23:02:38.885578 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerName="neutron-api" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.885586 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerName="neutron-api" Feb 19 23:02:38 crc kubenswrapper[4771]: E0219 23:02:38.885602 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" containerName="dnsmasq-dns" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.885609 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" containerName="dnsmasq-dns" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.885791 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f14ae3d-67aa-496d-85a0-ec37cbc7a9bd" containerName="dnsmasq-dns" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.885816 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerName="neutron-api" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.885829 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ff9d36-e15e-412f-a04f-a16180e6361f" containerName="neutron-httpd" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.886498 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.895573 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.895626 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.895668 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.895830 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6fztx" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.895949 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 19 23:02:38 crc kubenswrapper[4771]: I0219 23:02:38.931476 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-x6j2q"] Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.012493 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64c8b56b79-llhfr"] Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.013797 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.023291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnwf\" (UniqueName: \"kubernetes.io/projected/bf923325-9073-4b69-91b2-573bcdfbfdb0-kube-api-access-svnwf\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.023331 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-swiftconf\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.023358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-dispersionconf\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.023414 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-scripts\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.023450 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-combined-ca-bundle\") pod \"swift-ring-rebalance-x6j2q\" (UID: 
\"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.023500 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-ring-data-devices\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.023535 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf923325-9073-4b69-91b2-573bcdfbfdb0-etc-swift\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.032684 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64c8b56b79-llhfr"] Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb77p\" (UniqueName: \"kubernetes.io/projected/a9f7a722-eb78-420a-a27b-77c3cafbfa26-kube-api-access-gb77p\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-scripts\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125225 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-sb\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125263 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-combined-ca-bundle\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125322 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-dns-svc\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125345 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-ring-data-devices\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125384 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-nb\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125413 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/bf923325-9073-4b69-91b2-573bcdfbfdb0-etc-swift\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-config\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125481 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svnwf\" (UniqueName: \"kubernetes.io/projected/bf923325-9073-4b69-91b2-573bcdfbfdb0-kube-api-access-svnwf\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125500 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-swiftconf\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.125520 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-dispersionconf\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.126243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/bf923325-9073-4b69-91b2-573bcdfbfdb0-etc-swift\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.126404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-ring-data-devices\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.126414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-scripts\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.131649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-swiftconf\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.132916 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-combined-ca-bundle\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.146438 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-dispersionconf\") pod \"swift-ring-rebalance-x6j2q\" (UID: 
\"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.147555 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnwf\" (UniqueName: \"kubernetes.io/projected/bf923325-9073-4b69-91b2-573bcdfbfdb0-kube-api-access-svnwf\") pod \"swift-ring-rebalance-x6j2q\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.211542 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.227198 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-nb\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.227268 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-config\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.227316 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb77p\" (UniqueName: \"kubernetes.io/projected/a9f7a722-eb78-420a-a27b-77c3cafbfa26-kube-api-access-gb77p\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.227352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-sb\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.227411 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-dns-svc\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.228182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-nb\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.228216 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-dns-svc\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.228249 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-config\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.228449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-sb\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" 
(UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.264895 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb77p\" (UniqueName: \"kubernetes.io/projected/a9f7a722-eb78-420a-a27b-77c3cafbfa26-kube-api-access-gb77p\") pod \"dnsmasq-dns-64c8b56b79-llhfr\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.334528 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.700998 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-x6j2q"] Feb 19 23:02:39 crc kubenswrapper[4771]: I0219 23:02:39.833294 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64c8b56b79-llhfr"] Feb 19 23:02:40 crc kubenswrapper[4771]: I0219 23:02:40.193161 4771 generic.go:334] "Generic (PLEG): container finished" podID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" containerID="6f7268cdef6e11a203b9e0f5baf520e2c5270023de10ebe589a6d6c50f676dc3" exitCode=0 Feb 19 23:02:40 crc kubenswrapper[4771]: I0219 23:02:40.194307 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" event={"ID":"a9f7a722-eb78-420a-a27b-77c3cafbfa26","Type":"ContainerDied","Data":"6f7268cdef6e11a203b9e0f5baf520e2c5270023de10ebe589a6d6c50f676dc3"} Feb 19 23:02:40 crc kubenswrapper[4771]: I0219 23:02:40.194413 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" event={"ID":"a9f7a722-eb78-420a-a27b-77c3cafbfa26","Type":"ContainerStarted","Data":"b3b4afb7826f4630ab4544a22cac416f85f82762c54abef63ff92ca3251836ac"} Feb 19 23:02:40 crc kubenswrapper[4771]: I0219 23:02:40.202573 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-ring-rebalance-x6j2q" event={"ID":"bf923325-9073-4b69-91b2-573bcdfbfdb0","Type":"ContainerStarted","Data":"fa7638dc82dd9ef8fa44305c6c8e8e9a9a867b5b65fd4a66681599efc2f1057e"} Feb 19 23:02:40 crc kubenswrapper[4771]: I0219 23:02:40.202611 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-x6j2q" event={"ID":"bf923325-9073-4b69-91b2-573bcdfbfdb0","Type":"ContainerStarted","Data":"34186231af5eba1e562b74e450b78a2ba72a65d22ab60ba9b8f95bd77c20677a"} Feb 19 23:02:40 crc kubenswrapper[4771]: I0219 23:02:40.254125 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-x6j2q" podStartSLOduration=2.254105623 podStartE2EDuration="2.254105623s" podCreationTimestamp="2026-02-19 23:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:40.245450722 +0000 UTC m=+5660.516893192" watchObservedRunningTime="2026-02-19 23:02:40.254105623 +0000 UTC m=+5660.525548093" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.191871 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-c987fbfd4-j2jd2"] Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.193363 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.195218 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.202886 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c987fbfd4-j2jd2"] Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.212100 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" event={"ID":"a9f7a722-eb78-420a-a27b-77c3cafbfa26","Type":"ContainerStarted","Data":"2614cee41c4ab74bb7a691f4579c67d5c169ba4567b374a683b192a41a0a7be9"} Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.212138 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.242147 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" podStartSLOduration=3.24213023 podStartE2EDuration="3.24213023s" podCreationTimestamp="2026-02-19 23:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:41.238176916 +0000 UTC m=+5661.509619396" watchObservedRunningTime="2026-02-19 23:02:41.24213023 +0000 UTC m=+5661.513572700" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.268389 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-etc-swift\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.268819 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-log-httpd\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.269012 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-config-data\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.269057 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-run-httpd\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.269077 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-combined-ca-bundle\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.269124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs7xm\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-kube-api-access-zs7xm\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.371871 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-log-httpd\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.371994 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-config-data\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.372040 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-run-httpd\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.372064 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-combined-ca-bundle\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.372098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs7xm\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-kube-api-access-zs7xm\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.372150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-etc-swift\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.372467 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-log-httpd\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.372483 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-run-httpd\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.377118 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-combined-ca-bundle\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.377712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-config-data\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.381158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-etc-swift\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: 
\"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.386429 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs7xm\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-kube-api-access-zs7xm\") pod \"swift-proxy-c987fbfd4-j2jd2\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:41 crc kubenswrapper[4771]: I0219 23:02:41.515404 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:42 crc kubenswrapper[4771]: I0219 23:02:42.172033 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c987fbfd4-j2jd2"] Feb 19 23:02:42 crc kubenswrapper[4771]: W0219 23:02:42.176188 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0f668cc_1d0e_4306_be4b_cd43366d001e.slice/crio-05f7edd80030b9eff018b3dccf91fa3ce2005c5da55e824896622170a9e26714 WatchSource:0}: Error finding container 05f7edd80030b9eff018b3dccf91fa3ce2005c5da55e824896622170a9e26714: Status 404 returned error can't find the container with id 05f7edd80030b9eff018b3dccf91fa3ce2005c5da55e824896622170a9e26714 Feb 19 23:02:42 crc kubenswrapper[4771]: I0219 23:02:42.219771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c987fbfd4-j2jd2" event={"ID":"a0f668cc-1d0e-4306-be4b-cd43366d001e","Type":"ContainerStarted","Data":"05f7edd80030b9eff018b3dccf91fa3ce2005c5da55e824896622170a9e26714"} Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.227417 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c987fbfd4-j2jd2" event={"ID":"a0f668cc-1d0e-4306-be4b-cd43366d001e","Type":"ContainerStarted","Data":"b6239cee456e7a26a81cf9f3f7b7d90388c3547cbe7367af5c1b53301767754d"} Feb 
19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.227688 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.227701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c987fbfd4-j2jd2" event={"ID":"a0f668cc-1d0e-4306-be4b-cd43366d001e","Type":"ContainerStarted","Data":"bc9332c3506a610b2c5ffafd49df0c0ad5d0979432487239af8b9abaa5606a10"} Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.227873 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.264555 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-c987fbfd4-j2jd2" podStartSLOduration=2.264527283 podStartE2EDuration="2.264527283s" podCreationTimestamp="2026-02-19 23:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:43.254155636 +0000 UTC m=+5663.525598116" watchObservedRunningTime="2026-02-19 23:02:43.264527283 +0000 UTC m=+5663.535969763" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.339689 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-df6b66f66-cg8dd"] Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.340943 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.343098 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.343471 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.404838 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-df6b66f66-cg8dd"] Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.407793 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-public-tls-certs\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.407871 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-internal-tls-certs\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.407904 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e92976d-8555-4275-bf37-3d1e2f56aea1-log-httpd\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.407965 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-config-data\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.407994 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-combined-ca-bundle\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.408034 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e92976d-8555-4275-bf37-3d1e2f56aea1-run-httpd\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.408107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgrt\" (UniqueName: \"kubernetes.io/projected/2e92976d-8555-4275-bf37-3d1e2f56aea1-kube-api-access-8kgrt\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.408306 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2e92976d-8555-4275-bf37-3d1e2f56aea1-etc-swift\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.509721 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-public-tls-certs\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.509798 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-internal-tls-certs\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.509837 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e92976d-8555-4275-bf37-3d1e2f56aea1-log-httpd\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.509867 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-config-data\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.509904 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-combined-ca-bundle\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.509924 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2e92976d-8555-4275-bf37-3d1e2f56aea1-run-httpd\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.509941 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgrt\" (UniqueName: \"kubernetes.io/projected/2e92976d-8555-4275-bf37-3d1e2f56aea1-kube-api-access-8kgrt\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.510001 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2e92976d-8555-4275-bf37-3d1e2f56aea1-etc-swift\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.510582 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e92976d-8555-4275-bf37-3d1e2f56aea1-run-httpd\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.511484 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e92976d-8555-4275-bf37-3d1e2f56aea1-log-httpd\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.526812 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-config-data\") pod \"swift-proxy-df6b66f66-cg8dd\" 
(UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.527498 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-public-tls-certs\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.528512 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-internal-tls-certs\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.530946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2e92976d-8555-4275-bf37-3d1e2f56aea1-etc-swift\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.548594 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e92976d-8555-4275-bf37-3d1e2f56aea1-combined-ca-bundle\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.549775 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgrt\" (UniqueName: \"kubernetes.io/projected/2e92976d-8555-4275-bf37-3d1e2f56aea1-kube-api-access-8kgrt\") pod \"swift-proxy-df6b66f66-cg8dd\" (UID: \"2e92976d-8555-4275-bf37-3d1e2f56aea1\") " 
pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:43 crc kubenswrapper[4771]: I0219 23:02:43.655764 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:44 crc kubenswrapper[4771]: I0219 23:02:44.234299 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf923325-9073-4b69-91b2-573bcdfbfdb0" containerID="fa7638dc82dd9ef8fa44305c6c8e8e9a9a867b5b65fd4a66681599efc2f1057e" exitCode=0 Feb 19 23:02:44 crc kubenswrapper[4771]: I0219 23:02:44.234374 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-x6j2q" event={"ID":"bf923325-9073-4b69-91b2-573bcdfbfdb0","Type":"ContainerDied","Data":"fa7638dc82dd9ef8fa44305c6c8e8e9a9a867b5b65fd4a66681599efc2f1057e"} Feb 19 23:02:44 crc kubenswrapper[4771]: I0219 23:02:44.445482 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-df6b66f66-cg8dd"] Feb 19 23:02:44 crc kubenswrapper[4771]: W0219 23:02:44.446947 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e92976d_8555_4275_bf37_3d1e2f56aea1.slice/crio-6570d196c19d0330eec90f7b9490e07cd0df0c80fafd45358c1b6493a6ce1a3e WatchSource:0}: Error finding container 6570d196c19d0330eec90f7b9490e07cd0df0c80fafd45358c1b6493a6ce1a3e: Status 404 returned error can't find the container with id 6570d196c19d0330eec90f7b9490e07cd0df0c80fafd45358c1b6493a6ce1a3e Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.248503 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-df6b66f66-cg8dd" event={"ID":"2e92976d-8555-4275-bf37-3d1e2f56aea1","Type":"ContainerStarted","Data":"df58fd1dfc93d37a1930520907b8e19d59b65a6b51e90ef1953d73806553e17c"} Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.248792 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-df6b66f66-cg8dd" 
event={"ID":"2e92976d-8555-4275-bf37-3d1e2f56aea1","Type":"ContainerStarted","Data":"1d9a3b1f99a744984383dcd945f6b0a22d3e04b977be9548b9abe71d9cb2d47e"} Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.248803 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-df6b66f66-cg8dd" event={"ID":"2e92976d-8555-4275-bf37-3d1e2f56aea1","Type":"ContainerStarted","Data":"6570d196c19d0330eec90f7b9490e07cd0df0c80fafd45358c1b6493a6ce1a3e"} Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.292655 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-df6b66f66-cg8dd" podStartSLOduration=2.292627328 podStartE2EDuration="2.292627328s" podCreationTimestamp="2026-02-19 23:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:45.280207726 +0000 UTC m=+5665.551650216" watchObservedRunningTime="2026-02-19 23:02:45.292627328 +0000 UTC m=+5665.564069808" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.720364 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.781254 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-scripts\") pod \"bf923325-9073-4b69-91b2-573bcdfbfdb0\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.781344 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-dispersionconf\") pod \"bf923325-9073-4b69-91b2-573bcdfbfdb0\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.781402 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-combined-ca-bundle\") pod \"bf923325-9073-4b69-91b2-573bcdfbfdb0\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.781450 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-ring-data-devices\") pod \"bf923325-9073-4b69-91b2-573bcdfbfdb0\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.781478 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svnwf\" (UniqueName: \"kubernetes.io/projected/bf923325-9073-4b69-91b2-573bcdfbfdb0-kube-api-access-svnwf\") pod \"bf923325-9073-4b69-91b2-573bcdfbfdb0\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.781519 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-swiftconf\") pod \"bf923325-9073-4b69-91b2-573bcdfbfdb0\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.781636 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf923325-9073-4b69-91b2-573bcdfbfdb0-etc-swift\") pod \"bf923325-9073-4b69-91b2-573bcdfbfdb0\" (UID: \"bf923325-9073-4b69-91b2-573bcdfbfdb0\") " Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.781988 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bf923325-9073-4b69-91b2-573bcdfbfdb0" (UID: "bf923325-9073-4b69-91b2-573bcdfbfdb0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.782126 4771 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.782626 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf923325-9073-4b69-91b2-573bcdfbfdb0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bf923325-9073-4b69-91b2-573bcdfbfdb0" (UID: "bf923325-9073-4b69-91b2-573bcdfbfdb0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.790261 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf923325-9073-4b69-91b2-573bcdfbfdb0-kube-api-access-svnwf" (OuterVolumeSpecName: "kube-api-access-svnwf") pod "bf923325-9073-4b69-91b2-573bcdfbfdb0" (UID: "bf923325-9073-4b69-91b2-573bcdfbfdb0"). InnerVolumeSpecName "kube-api-access-svnwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.790744 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bf923325-9073-4b69-91b2-573bcdfbfdb0" (UID: "bf923325-9073-4b69-91b2-573bcdfbfdb0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.813734 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf923325-9073-4b69-91b2-573bcdfbfdb0" (UID: "bf923325-9073-4b69-91b2-573bcdfbfdb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.817134 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bf923325-9073-4b69-91b2-573bcdfbfdb0" (UID: "bf923325-9073-4b69-91b2-573bcdfbfdb0"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.823387 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-scripts" (OuterVolumeSpecName: "scripts") pod "bf923325-9073-4b69-91b2-573bcdfbfdb0" (UID: "bf923325-9073-4b69-91b2-573bcdfbfdb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.884086 4771 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.884122 4771 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf923325-9073-4b69-91b2-573bcdfbfdb0-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.884132 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf923325-9073-4b69-91b2-573bcdfbfdb0-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.884141 4771 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.884150 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf923325-9073-4b69-91b2-573bcdfbfdb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:45 crc kubenswrapper[4771]: I0219 23:02:45.884159 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svnwf\" (UniqueName: 
\"kubernetes.io/projected/bf923325-9073-4b69-91b2-573bcdfbfdb0-kube-api-access-svnwf\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:46 crc kubenswrapper[4771]: I0219 23:02:46.278131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-x6j2q" event={"ID":"bf923325-9073-4b69-91b2-573bcdfbfdb0","Type":"ContainerDied","Data":"34186231af5eba1e562b74e450b78a2ba72a65d22ab60ba9b8f95bd77c20677a"} Feb 19 23:02:46 crc kubenswrapper[4771]: I0219 23:02:46.278169 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-x6j2q" Feb 19 23:02:46 crc kubenswrapper[4771]: I0219 23:02:46.278173 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34186231af5eba1e562b74e450b78a2ba72a65d22ab60ba9b8f95bd77c20677a" Feb 19 23:02:46 crc kubenswrapper[4771]: I0219 23:02:46.279145 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:46 crc kubenswrapper[4771]: I0219 23:02:46.279206 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.337228 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.399330 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568bf9c765-ks2m8"] Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.399592 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" podUID="e927281e-63c9-4de2-a111-12d3c975aac5" containerName="dnsmasq-dns" containerID="cri-o://f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02" gracePeriod=10 Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.889951 4771 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.964828 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-dns-svc\") pod \"e927281e-63c9-4de2-a111-12d3c975aac5\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.964908 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74qhk\" (UniqueName: \"kubernetes.io/projected/e927281e-63c9-4de2-a111-12d3c975aac5-kube-api-access-74qhk\") pod \"e927281e-63c9-4de2-a111-12d3c975aac5\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.964943 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-config\") pod \"e927281e-63c9-4de2-a111-12d3c975aac5\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.964974 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-sb\") pod \"e927281e-63c9-4de2-a111-12d3c975aac5\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.965102 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-nb\") pod \"e927281e-63c9-4de2-a111-12d3c975aac5\" (UID: \"e927281e-63c9-4de2-a111-12d3c975aac5\") " Feb 19 23:02:49 crc kubenswrapper[4771]: I0219 23:02:49.970541 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e927281e-63c9-4de2-a111-12d3c975aac5-kube-api-access-74qhk" (OuterVolumeSpecName: "kube-api-access-74qhk") pod "e927281e-63c9-4de2-a111-12d3c975aac5" (UID: "e927281e-63c9-4de2-a111-12d3c975aac5"). InnerVolumeSpecName "kube-api-access-74qhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.012512 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e927281e-63c9-4de2-a111-12d3c975aac5" (UID: "e927281e-63c9-4de2-a111-12d3c975aac5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.017326 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-config" (OuterVolumeSpecName: "config") pod "e927281e-63c9-4de2-a111-12d3c975aac5" (UID: "e927281e-63c9-4de2-a111-12d3c975aac5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.029008 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e927281e-63c9-4de2-a111-12d3c975aac5" (UID: "e927281e-63c9-4de2-a111-12d3c975aac5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.029573 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e927281e-63c9-4de2-a111-12d3c975aac5" (UID: "e927281e-63c9-4de2-a111-12d3c975aac5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.066559 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74qhk\" (UniqueName: \"kubernetes.io/projected/e927281e-63c9-4de2-a111-12d3c975aac5-kube-api-access-74qhk\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.066590 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.066599 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.066607 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.066616 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e927281e-63c9-4de2-a111-12d3c975aac5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.315288 4771 generic.go:334] "Generic (PLEG): container finished" podID="e927281e-63c9-4de2-a111-12d3c975aac5" containerID="f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02" exitCode=0 Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.315357 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.315387 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" event={"ID":"e927281e-63c9-4de2-a111-12d3c975aac5","Type":"ContainerDied","Data":"f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02"} Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.315704 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568bf9c765-ks2m8" event={"ID":"e927281e-63c9-4de2-a111-12d3c975aac5","Type":"ContainerDied","Data":"d15e0b63996236c5a0806d86667e3b62729ca717cfb0da174049833b64578cef"} Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.315755 4771 scope.go:117] "RemoveContainer" containerID="f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.333714 4771 scope.go:117] "RemoveContainer" containerID="28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.343812 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568bf9c765-ks2m8"] Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.351424 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568bf9c765-ks2m8"] Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.368662 4771 scope.go:117] "RemoveContainer" containerID="f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02" Feb 19 23:02:50 crc kubenswrapper[4771]: E0219 23:02:50.369345 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02\": container with ID starting with f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02 not found: ID does not exist" 
containerID="f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.369393 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02"} err="failed to get container status \"f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02\": rpc error: code = NotFound desc = could not find container \"f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02\": container with ID starting with f1a6439bff6424c983146c24e099dddf4f878adf3e92dc849ddf4e257f6ccd02 not found: ID does not exist" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.369421 4771 scope.go:117] "RemoveContainer" containerID="28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292" Feb 19 23:02:50 crc kubenswrapper[4771]: E0219 23:02:50.369894 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292\": container with ID starting with 28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292 not found: ID does not exist" containerID="28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.369952 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292"} err="failed to get container status \"28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292\": rpc error: code = NotFound desc = could not find container \"28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292\": container with ID starting with 28d77348fb0d51e50f0a0ad784ee9532d942ab7a268b3dd3e637cc9f1970c292 not found: ID does not exist" Feb 19 23:02:50 crc kubenswrapper[4771]: I0219 23:02:50.463061 4771 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e927281e-63c9-4de2-a111-12d3c975aac5" path="/var/lib/kubelet/pods/e927281e-63c9-4de2-a111-12d3c975aac5/volumes" Feb 19 23:02:51 crc kubenswrapper[4771]: I0219 23:02:51.519376 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:51 crc kubenswrapper[4771]: I0219 23:02:51.520751 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:53 crc kubenswrapper[4771]: I0219 23:02:53.661985 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:53 crc kubenswrapper[4771]: I0219 23:02:53.663787 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-df6b66f66-cg8dd" Feb 19 23:02:53 crc kubenswrapper[4771]: I0219 23:02:53.773301 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-c987fbfd4-j2jd2"] Feb 19 23:02:53 crc kubenswrapper[4771]: I0219 23:02:53.773532 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-c987fbfd4-j2jd2" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerName="proxy-httpd" containerID="cri-o://bc9332c3506a610b2c5ffafd49df0c0ad5d0979432487239af8b9abaa5606a10" gracePeriod=30 Feb 19 23:02:53 crc kubenswrapper[4771]: I0219 23:02:53.774008 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-c987fbfd4-j2jd2" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerName="proxy-server" containerID="cri-o://b6239cee456e7a26a81cf9f3f7b7d90388c3547cbe7367af5c1b53301767754d" gracePeriod=30 Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.372078 4771 generic.go:334] "Generic (PLEG): container finished" podID="a0f668cc-1d0e-4306-be4b-cd43366d001e" 
containerID="b6239cee456e7a26a81cf9f3f7b7d90388c3547cbe7367af5c1b53301767754d" exitCode=0 Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.372450 4771 generic.go:334] "Generic (PLEG): container finished" podID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerID="bc9332c3506a610b2c5ffafd49df0c0ad5d0979432487239af8b9abaa5606a10" exitCode=0 Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.372173 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c987fbfd4-j2jd2" event={"ID":"a0f668cc-1d0e-4306-be4b-cd43366d001e","Type":"ContainerDied","Data":"b6239cee456e7a26a81cf9f3f7b7d90388c3547cbe7367af5c1b53301767754d"} Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.372514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c987fbfd4-j2jd2" event={"ID":"a0f668cc-1d0e-4306-be4b-cd43366d001e","Type":"ContainerDied","Data":"bc9332c3506a610b2c5ffafd49df0c0ad5d0979432487239af8b9abaa5606a10"} Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.641616 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.661292 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-config-data\") pod \"a0f668cc-1d0e-4306-be4b-cd43366d001e\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.661408 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs7xm\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-kube-api-access-zs7xm\") pod \"a0f668cc-1d0e-4306-be4b-cd43366d001e\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.661594 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-run-httpd\") pod \"a0f668cc-1d0e-4306-be4b-cd43366d001e\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.661638 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-log-httpd\") pod \"a0f668cc-1d0e-4306-be4b-cd43366d001e\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.661694 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-etc-swift\") pod \"a0f668cc-1d0e-4306-be4b-cd43366d001e\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.661736 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-combined-ca-bundle\") pod \"a0f668cc-1d0e-4306-be4b-cd43366d001e\" (UID: \"a0f668cc-1d0e-4306-be4b-cd43366d001e\") " Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.662328 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0f668cc-1d0e-4306-be4b-cd43366d001e" (UID: "a0f668cc-1d0e-4306-be4b-cd43366d001e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.662627 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.662708 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0f668cc-1d0e-4306-be4b-cd43366d001e" (UID: "a0f668cc-1d0e-4306-be4b-cd43366d001e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.681290 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-kube-api-access-zs7xm" (OuterVolumeSpecName: "kube-api-access-zs7xm") pod "a0f668cc-1d0e-4306-be4b-cd43366d001e" (UID: "a0f668cc-1d0e-4306-be4b-cd43366d001e"). InnerVolumeSpecName "kube-api-access-zs7xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.681464 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a0f668cc-1d0e-4306-be4b-cd43366d001e" (UID: "a0f668cc-1d0e-4306-be4b-cd43366d001e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.718871 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0f668cc-1d0e-4306-be4b-cd43366d001e" (UID: "a0f668cc-1d0e-4306-be4b-cd43366d001e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.741271 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-config-data" (OuterVolumeSpecName: "config-data") pod "a0f668cc-1d0e-4306-be4b-cd43366d001e" (UID: "a0f668cc-1d0e-4306-be4b-cd43366d001e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.764959 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0f668cc-1d0e-4306-be4b-cd43366d001e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.764994 4771 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.765002 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.765012 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f668cc-1d0e-4306-be4b-cd43366d001e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:54 crc kubenswrapper[4771]: I0219 23:02:54.765033 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs7xm\" (UniqueName: \"kubernetes.io/projected/a0f668cc-1d0e-4306-be4b-cd43366d001e-kube-api-access-zs7xm\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:55 crc kubenswrapper[4771]: I0219 23:02:55.383946 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c987fbfd4-j2jd2" event={"ID":"a0f668cc-1d0e-4306-be4b-cd43366d001e","Type":"ContainerDied","Data":"05f7edd80030b9eff018b3dccf91fa3ce2005c5da55e824896622170a9e26714"} Feb 19 23:02:55 crc kubenswrapper[4771]: I0219 23:02:55.384051 4771 scope.go:117] "RemoveContainer" containerID="b6239cee456e7a26a81cf9f3f7b7d90388c3547cbe7367af5c1b53301767754d" Feb 19 23:02:55 crc kubenswrapper[4771]: I0219 23:02:55.384094 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-c987fbfd4-j2jd2" Feb 19 23:02:55 crc kubenswrapper[4771]: I0219 23:02:55.414655 4771 scope.go:117] "RemoveContainer" containerID="bc9332c3506a610b2c5ffafd49df0c0ad5d0979432487239af8b9abaa5606a10" Feb 19 23:02:55 crc kubenswrapper[4771]: I0219 23:02:55.440604 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-c987fbfd4-j2jd2"] Feb 19 23:02:55 crc kubenswrapper[4771]: I0219 23:02:55.452451 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-c987fbfd4-j2jd2"] Feb 19 23:02:56 crc kubenswrapper[4771]: I0219 23:02:56.480765 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" path="/var/lib/kubelet/pods/a0f668cc-1d0e-4306-be4b-cd43366d001e/volumes" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.740176 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qs9x4"] Feb 19 23:02:59 crc kubenswrapper[4771]: E0219 23:02:59.740823 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e927281e-63c9-4de2-a111-12d3c975aac5" containerName="init" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.740838 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e927281e-63c9-4de2-a111-12d3c975aac5" containerName="init" Feb 19 23:02:59 crc kubenswrapper[4771]: E0219 23:02:59.740864 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf923325-9073-4b69-91b2-573bcdfbfdb0" containerName="swift-ring-rebalance" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.740872 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf923325-9073-4b69-91b2-573bcdfbfdb0" containerName="swift-ring-rebalance" Feb 19 23:02:59 crc kubenswrapper[4771]: E0219 23:02:59.740896 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e927281e-63c9-4de2-a111-12d3c975aac5" containerName="dnsmasq-dns" Feb 19 23:02:59 crc 
kubenswrapper[4771]: I0219 23:02:59.740904 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e927281e-63c9-4de2-a111-12d3c975aac5" containerName="dnsmasq-dns" Feb 19 23:02:59 crc kubenswrapper[4771]: E0219 23:02:59.740917 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerName="proxy-server" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.740925 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerName="proxy-server" Feb 19 23:02:59 crc kubenswrapper[4771]: E0219 23:02:59.740934 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerName="proxy-httpd" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.740941 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerName="proxy-httpd" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.741200 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerName="proxy-server" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.741219 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf923325-9073-4b69-91b2-573bcdfbfdb0" containerName="swift-ring-rebalance" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.741229 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f668cc-1d0e-4306-be4b-cd43366d001e" containerName="proxy-httpd" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.741248 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e927281e-63c9-4de2-a111-12d3c975aac5" containerName="dnsmasq-dns" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.741872 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qs9x4" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.792324 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qs9x4"] Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.837435 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7af4-account-create-update-m742s"] Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.838565 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.840093 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.844721 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7af4-account-create-update-m742s"] Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.858962 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jdb\" (UniqueName: \"kubernetes.io/projected/bd0523b6-312f-4159-8965-7bdbec2a4604-kube-api-access-j2jdb\") pod \"cinder-db-create-qs9x4\" (UID: \"bd0523b6-312f-4159-8965-7bdbec2a4604\") " pod="openstack/cinder-db-create-qs9x4" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.859092 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd0523b6-312f-4159-8965-7bdbec2a4604-operator-scripts\") pod \"cinder-db-create-qs9x4\" (UID: \"bd0523b6-312f-4159-8965-7bdbec2a4604\") " pod="openstack/cinder-db-create-qs9x4" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.960598 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqzg\" (UniqueName: 
\"kubernetes.io/projected/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-kube-api-access-rgqzg\") pod \"cinder-7af4-account-create-update-m742s\" (UID: \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\") " pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.960777 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd0523b6-312f-4159-8965-7bdbec2a4604-operator-scripts\") pod \"cinder-db-create-qs9x4\" (UID: \"bd0523b6-312f-4159-8965-7bdbec2a4604\") " pod="openstack/cinder-db-create-qs9x4" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.960869 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-operator-scripts\") pod \"cinder-7af4-account-create-update-m742s\" (UID: \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\") " pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.960924 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jdb\" (UniqueName: \"kubernetes.io/projected/bd0523b6-312f-4159-8965-7bdbec2a4604-kube-api-access-j2jdb\") pod \"cinder-db-create-qs9x4\" (UID: \"bd0523b6-312f-4159-8965-7bdbec2a4604\") " pod="openstack/cinder-db-create-qs9x4" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.961630 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd0523b6-312f-4159-8965-7bdbec2a4604-operator-scripts\") pod \"cinder-db-create-qs9x4\" (UID: \"bd0523b6-312f-4159-8965-7bdbec2a4604\") " pod="openstack/cinder-db-create-qs9x4" Feb 19 23:02:59 crc kubenswrapper[4771]: I0219 23:02:59.978749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jdb\" (UniqueName: 
\"kubernetes.io/projected/bd0523b6-312f-4159-8965-7bdbec2a4604-kube-api-access-j2jdb\") pod \"cinder-db-create-qs9x4\" (UID: \"bd0523b6-312f-4159-8965-7bdbec2a4604\") " pod="openstack/cinder-db-create-qs9x4" Feb 19 23:03:00 crc kubenswrapper[4771]: I0219 23:03:00.063258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqzg\" (UniqueName: \"kubernetes.io/projected/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-kube-api-access-rgqzg\") pod \"cinder-7af4-account-create-update-m742s\" (UID: \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\") " pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:03:00 crc kubenswrapper[4771]: I0219 23:03:00.063546 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-operator-scripts\") pod \"cinder-7af4-account-create-update-m742s\" (UID: \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\") " pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:03:00 crc kubenswrapper[4771]: I0219 23:03:00.064824 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-operator-scripts\") pod \"cinder-7af4-account-create-update-m742s\" (UID: \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\") " pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:03:00 crc kubenswrapper[4771]: I0219 23:03:00.082306 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqzg\" (UniqueName: \"kubernetes.io/projected/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-kube-api-access-rgqzg\") pod \"cinder-7af4-account-create-update-m742s\" (UID: \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\") " pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:03:00 crc kubenswrapper[4771]: I0219 23:03:00.113206 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qs9x4" Feb 19 23:03:00 crc kubenswrapper[4771]: I0219 23:03:00.162697 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:03:01 crc kubenswrapper[4771]: I0219 23:03:01.235751 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7af4-account-create-update-m742s"] Feb 19 23:03:01 crc kubenswrapper[4771]: I0219 23:03:01.259400 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qs9x4"] Feb 19 23:03:01 crc kubenswrapper[4771]: I0219 23:03:01.478306 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qs9x4" event={"ID":"bd0523b6-312f-4159-8965-7bdbec2a4604","Type":"ContainerStarted","Data":"fc17221fe803bee1e194b0e359da3e6eaeb16e2439530592e70d797fcc85f785"} Feb 19 23:03:01 crc kubenswrapper[4771]: I0219 23:03:01.479180 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7af4-account-create-update-m742s" event={"ID":"1e14bb44-70cd-4826-a969-3cb4cf02e9b9","Type":"ContainerStarted","Data":"8f3cd1ec0dc67c5de659e433fd600fe0963b726b1a6c82f2e5f54ef7738072e1"} Feb 19 23:03:02 crc kubenswrapper[4771]: I0219 23:03:02.493587 4771 generic.go:334] "Generic (PLEG): container finished" podID="bd0523b6-312f-4159-8965-7bdbec2a4604" containerID="c510b293aab102e9fa4bbbbc25e57eca69a3d792fde7155abea6e09958950d06" exitCode=0 Feb 19 23:03:02 crc kubenswrapper[4771]: I0219 23:03:02.493766 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qs9x4" event={"ID":"bd0523b6-312f-4159-8965-7bdbec2a4604","Type":"ContainerDied","Data":"c510b293aab102e9fa4bbbbc25e57eca69a3d792fde7155abea6e09958950d06"} Feb 19 23:03:02 crc kubenswrapper[4771]: I0219 23:03:02.498174 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e14bb44-70cd-4826-a969-3cb4cf02e9b9" 
containerID="4c7b00cc6e781261fe9e6a4d33c2fdc91d83e5358e2167acd65e58b05c780c58" exitCode=0 Feb 19 23:03:02 crc kubenswrapper[4771]: I0219 23:03:02.498461 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7af4-account-create-update-m742s" event={"ID":"1e14bb44-70cd-4826-a969-3cb4cf02e9b9","Type":"ContainerDied","Data":"4c7b00cc6e781261fe9e6a4d33c2fdc91d83e5358e2167acd65e58b05c780c58"} Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.110236 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qs9x4" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.121469 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.269635 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2jdb\" (UniqueName: \"kubernetes.io/projected/bd0523b6-312f-4159-8965-7bdbec2a4604-kube-api-access-j2jdb\") pod \"bd0523b6-312f-4159-8965-7bdbec2a4604\" (UID: \"bd0523b6-312f-4159-8965-7bdbec2a4604\") " Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.269784 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-operator-scripts\") pod \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\" (UID: \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\") " Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.269818 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgqzg\" (UniqueName: \"kubernetes.io/projected/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-kube-api-access-rgqzg\") pod \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\" (UID: \"1e14bb44-70cd-4826-a969-3cb4cf02e9b9\") " Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.269875 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd0523b6-312f-4159-8965-7bdbec2a4604-operator-scripts\") pod \"bd0523b6-312f-4159-8965-7bdbec2a4604\" (UID: \"bd0523b6-312f-4159-8965-7bdbec2a4604\") " Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.270710 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd0523b6-312f-4159-8965-7bdbec2a4604-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd0523b6-312f-4159-8965-7bdbec2a4604" (UID: "bd0523b6-312f-4159-8965-7bdbec2a4604"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.270918 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e14bb44-70cd-4826-a969-3cb4cf02e9b9" (UID: "1e14bb44-70cd-4826-a969-3cb4cf02e9b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.276317 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-kube-api-access-rgqzg" (OuterVolumeSpecName: "kube-api-access-rgqzg") pod "1e14bb44-70cd-4826-a969-3cb4cf02e9b9" (UID: "1e14bb44-70cd-4826-a969-3cb4cf02e9b9"). InnerVolumeSpecName "kube-api-access-rgqzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.277724 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0523b6-312f-4159-8965-7bdbec2a4604-kube-api-access-j2jdb" (OuterVolumeSpecName: "kube-api-access-j2jdb") pod "bd0523b6-312f-4159-8965-7bdbec2a4604" (UID: "bd0523b6-312f-4159-8965-7bdbec2a4604"). 
InnerVolumeSpecName "kube-api-access-j2jdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.371774 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.371806 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgqzg\" (UniqueName: \"kubernetes.io/projected/1e14bb44-70cd-4826-a969-3cb4cf02e9b9-kube-api-access-rgqzg\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.371816 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd0523b6-312f-4159-8965-7bdbec2a4604-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.371825 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2jdb\" (UniqueName: \"kubernetes.io/projected/bd0523b6-312f-4159-8965-7bdbec2a4604-kube-api-access-j2jdb\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.522967 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qs9x4" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.522958 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qs9x4" event={"ID":"bd0523b6-312f-4159-8965-7bdbec2a4604","Type":"ContainerDied","Data":"fc17221fe803bee1e194b0e359da3e6eaeb16e2439530592e70d797fcc85f785"} Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.523183 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc17221fe803bee1e194b0e359da3e6eaeb16e2439530592e70d797fcc85f785" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.525883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7af4-account-create-update-m742s" event={"ID":"1e14bb44-70cd-4826-a969-3cb4cf02e9b9","Type":"ContainerDied","Data":"8f3cd1ec0dc67c5de659e433fd600fe0963b726b1a6c82f2e5f54ef7738072e1"} Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.525924 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f3cd1ec0dc67c5de659e433fd600fe0963b726b1a6c82f2e5f54ef7738072e1" Feb 19 23:03:04 crc kubenswrapper[4771]: I0219 23:03:04.525949 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7af4-account-create-update-m742s" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.235343 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dhhc8"] Feb 19 23:03:09 crc kubenswrapper[4771]: E0219 23:03:09.236226 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e14bb44-70cd-4826-a969-3cb4cf02e9b9" containerName="mariadb-account-create-update" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.236252 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e14bb44-70cd-4826-a969-3cb4cf02e9b9" containerName="mariadb-account-create-update" Feb 19 23:03:09 crc kubenswrapper[4771]: E0219 23:03:09.236282 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0523b6-312f-4159-8965-7bdbec2a4604" containerName="mariadb-database-create" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.236294 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0523b6-312f-4159-8965-7bdbec2a4604" containerName="mariadb-database-create" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.236596 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e14bb44-70cd-4826-a969-3cb4cf02e9b9" containerName="mariadb-account-create-update" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.236638 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0523b6-312f-4159-8965-7bdbec2a4604" containerName="mariadb-database-create" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.238466 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.253841 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhhc8"] Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.267121 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-catalog-content\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.267175 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-utilities\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.267228 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89kpq\" (UniqueName: \"kubernetes.io/projected/6f4d0a7b-999c-4285-ba3b-220837394af3-kube-api-access-89kpq\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.368251 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-utilities\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.368334 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-89kpq\" (UniqueName: \"kubernetes.io/projected/6f4d0a7b-999c-4285-ba3b-220837394af3-kube-api-access-89kpq\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.368705 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-catalog-content\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.368905 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-utilities\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.368990 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-catalog-content\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.397856 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89kpq\" (UniqueName: \"kubernetes.io/projected/6f4d0a7b-999c-4285-ba3b-220837394af3-kube-api-access-89kpq\") pod \"redhat-marketplace-dhhc8\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:09 crc kubenswrapper[4771]: I0219 23:03:09.573235 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.029769 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhhc8"] Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.113856 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9b2lj"] Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.116411 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.118588 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.119609 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xztbs" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.119626 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.145251 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9b2lj"] Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.187261 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-config-data\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.187306 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-combined-ca-bundle\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " 
pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.187339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-db-sync-config-data\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.187403 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-scripts\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.187429 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ee88110-ba9d-4076-b6e1-412a08c88b2b-etc-machine-id\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.187457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkghj\" (UniqueName: \"kubernetes.io/projected/1ee88110-ba9d-4076-b6e1-412a08c88b2b-kube-api-access-qkghj\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.288896 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-config-data\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc 
kubenswrapper[4771]: I0219 23:03:10.288956 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-combined-ca-bundle\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.288994 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-db-sync-config-data\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.289053 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-scripts\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.289084 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ee88110-ba9d-4076-b6e1-412a08c88b2b-etc-machine-id\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.289115 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkghj\" (UniqueName: \"kubernetes.io/projected/1ee88110-ba9d-4076-b6e1-412a08c88b2b-kube-api-access-qkghj\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.289216 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ee88110-ba9d-4076-b6e1-412a08c88b2b-etc-machine-id\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.294115 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-db-sync-config-data\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.294173 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-scripts\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.294175 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-combined-ca-bundle\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.296070 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-config-data\") pod \"cinder-db-sync-9b2lj\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.306244 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkghj\" (UniqueName: \"kubernetes.io/projected/1ee88110-ba9d-4076-b6e1-412a08c88b2b-kube-api-access-qkghj\") pod \"cinder-db-sync-9b2lj\" (UID: 
\"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.469188 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.618705 4771 generic.go:334] "Generic (PLEG): container finished" podID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerID="1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7" exitCode=0 Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.618748 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhhc8" event={"ID":"6f4d0a7b-999c-4285-ba3b-220837394af3","Type":"ContainerDied","Data":"1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7"} Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.618778 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhhc8" event={"ID":"6f4d0a7b-999c-4285-ba3b-220837394af3","Type":"ContainerStarted","Data":"fd28d0623f3368829053885113e7c71532a1400b9929eca9ced5d405fbfd525d"} Feb 19 23:03:10 crc kubenswrapper[4771]: I0219 23:03:10.931960 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9b2lj"] Feb 19 23:03:10 crc kubenswrapper[4771]: W0219 23:03:10.933283 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee88110_ba9d_4076_b6e1_412a08c88b2b.slice/crio-ac74f90103fca6bbddd7827d18f62f6466b7c1f3370f8435acb554b3950c8063 WatchSource:0}: Error finding container ac74f90103fca6bbddd7827d18f62f6466b7c1f3370f8435acb554b3950c8063: Status 404 returned error can't find the container with id ac74f90103fca6bbddd7827d18f62f6466b7c1f3370f8435acb554b3950c8063 Feb 19 23:03:11 crc kubenswrapper[4771]: I0219 23:03:11.633666 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-9b2lj" event={"ID":"1ee88110-ba9d-4076-b6e1-412a08c88b2b","Type":"ContainerStarted","Data":"ac74f90103fca6bbddd7827d18f62f6466b7c1f3370f8435acb554b3950c8063"} Feb 19 23:03:11 crc kubenswrapper[4771]: I0219 23:03:11.637816 4771 generic.go:334] "Generic (PLEG): container finished" podID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerID="79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244" exitCode=0 Feb 19 23:03:11 crc kubenswrapper[4771]: I0219 23:03:11.637849 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhhc8" event={"ID":"6f4d0a7b-999c-4285-ba3b-220837394af3","Type":"ContainerDied","Data":"79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244"} Feb 19 23:03:12 crc kubenswrapper[4771]: I0219 23:03:12.650897 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9b2lj" event={"ID":"1ee88110-ba9d-4076-b6e1-412a08c88b2b","Type":"ContainerStarted","Data":"169e31fe6ae33f7eff5e60f2e3cc613f3278f50cacfe0ca7d04b383c1c185bd8"} Feb 19 23:03:12 crc kubenswrapper[4771]: I0219 23:03:12.655700 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhhc8" event={"ID":"6f4d0a7b-999c-4285-ba3b-220837394af3","Type":"ContainerStarted","Data":"c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8"} Feb 19 23:03:12 crc kubenswrapper[4771]: I0219 23:03:12.682689 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9b2lj" podStartSLOduration=2.682665587 podStartE2EDuration="2.682665587s" podCreationTimestamp="2026-02-19 23:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:03:12.672171977 +0000 UTC m=+5692.943614467" watchObservedRunningTime="2026-02-19 23:03:12.682665587 +0000 UTC m=+5692.954108077" Feb 19 23:03:12 crc kubenswrapper[4771]: 
I0219 23:03:12.694234 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dhhc8" podStartSLOduration=2.256530687 podStartE2EDuration="3.694210295s" podCreationTimestamp="2026-02-19 23:03:09 +0000 UTC" firstStartedPulling="2026-02-19 23:03:10.62115162 +0000 UTC m=+5690.892594100" lastFinishedPulling="2026-02-19 23:03:12.058831228 +0000 UTC m=+5692.330273708" observedRunningTime="2026-02-19 23:03:12.692744196 +0000 UTC m=+5692.964186716" watchObservedRunningTime="2026-02-19 23:03:12.694210295 +0000 UTC m=+5692.965652775" Feb 19 23:03:14 crc kubenswrapper[4771]: I0219 23:03:14.680213 4771 generic.go:334] "Generic (PLEG): container finished" podID="1ee88110-ba9d-4076-b6e1-412a08c88b2b" containerID="169e31fe6ae33f7eff5e60f2e3cc613f3278f50cacfe0ca7d04b383c1c185bd8" exitCode=0 Feb 19 23:03:14 crc kubenswrapper[4771]: I0219 23:03:14.680353 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9b2lj" event={"ID":"1ee88110-ba9d-4076-b6e1-412a08c88b2b","Type":"ContainerDied","Data":"169e31fe6ae33f7eff5e60f2e3cc613f3278f50cacfe0ca7d04b383c1c185bd8"} Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.125849 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.213759 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-scripts\") pod \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.213958 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkghj\" (UniqueName: \"kubernetes.io/projected/1ee88110-ba9d-4076-b6e1-412a08c88b2b-kube-api-access-qkghj\") pod \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.214004 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ee88110-ba9d-4076-b6e1-412a08c88b2b-etc-machine-id\") pod \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.214081 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-db-sync-config-data\") pod \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.214152 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-config-data\") pod \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.214200 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-combined-ca-bundle\") pod \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\" (UID: \"1ee88110-ba9d-4076-b6e1-412a08c88b2b\") " Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.214244 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ee88110-ba9d-4076-b6e1-412a08c88b2b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1ee88110-ba9d-4076-b6e1-412a08c88b2b" (UID: "1ee88110-ba9d-4076-b6e1-412a08c88b2b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.214766 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ee88110-ba9d-4076-b6e1-412a08c88b2b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.220624 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee88110-ba9d-4076-b6e1-412a08c88b2b-kube-api-access-qkghj" (OuterVolumeSpecName: "kube-api-access-qkghj") pod "1ee88110-ba9d-4076-b6e1-412a08c88b2b" (UID: "1ee88110-ba9d-4076-b6e1-412a08c88b2b"). InnerVolumeSpecName "kube-api-access-qkghj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.220719 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-scripts" (OuterVolumeSpecName: "scripts") pod "1ee88110-ba9d-4076-b6e1-412a08c88b2b" (UID: "1ee88110-ba9d-4076-b6e1-412a08c88b2b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.227318 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1ee88110-ba9d-4076-b6e1-412a08c88b2b" (UID: "1ee88110-ba9d-4076-b6e1-412a08c88b2b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.251323 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ee88110-ba9d-4076-b6e1-412a08c88b2b" (UID: "1ee88110-ba9d-4076-b6e1-412a08c88b2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.300308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-config-data" (OuterVolumeSpecName: "config-data") pod "1ee88110-ba9d-4076-b6e1-412a08c88b2b" (UID: "1ee88110-ba9d-4076-b6e1-412a08c88b2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.316374 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.316414 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.316430 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.316443 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkghj\" (UniqueName: \"kubernetes.io/projected/1ee88110-ba9d-4076-b6e1-412a08c88b2b-kube-api-access-qkghj\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.316455 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ee88110-ba9d-4076-b6e1-412a08c88b2b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.714088 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9b2lj" event={"ID":"1ee88110-ba9d-4076-b6e1-412a08c88b2b","Type":"ContainerDied","Data":"ac74f90103fca6bbddd7827d18f62f6466b7c1f3370f8435acb554b3950c8063"} Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.714164 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac74f90103fca6bbddd7827d18f62f6466b7c1f3370f8435acb554b3950c8063" Feb 19 23:03:16 crc kubenswrapper[4771]: I0219 23:03:16.714218 4771 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9b2lj" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.106234 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fbb99998f-hqnp6"] Feb 19 23:03:17 crc kubenswrapper[4771]: E0219 23:03:17.106525 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee88110-ba9d-4076-b6e1-412a08c88b2b" containerName="cinder-db-sync" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.106541 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee88110-ba9d-4076-b6e1-412a08c88b2b" containerName="cinder-db-sync" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.106698 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee88110-ba9d-4076-b6e1-412a08c88b2b" containerName="cinder-db-sync" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.107493 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.116785 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fbb99998f-hqnp6"] Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.239201 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbzs\" (UniqueName: \"kubernetes.io/projected/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-kube-api-access-2jbzs\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.239296 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-dns-svc\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " 
pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.239316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-config\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.239400 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.239452 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.244223 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.245988 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.250089 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.250284 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.250409 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.250913 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xztbs" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.253068 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341191 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341279 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-dns-svc\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" 
Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341333 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-config\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341349 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-scripts\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341385 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj7bx\" (UniqueName: \"kubernetes.io/projected/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-kube-api-access-tj7bx\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341406 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data-custom\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341423 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341458 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341504 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-logs\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.341543 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbzs\" (UniqueName: \"kubernetes.io/projected/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-kube-api-access-2jbzs\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.342791 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-dns-svc\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.342982 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-config\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.343087 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.343292 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-sb\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.358067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbzs\" (UniqueName: \"kubernetes.io/projected/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-kube-api-access-2jbzs\") pod \"dnsmasq-dns-7fbb99998f-hqnp6\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") " pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.431547 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.443228 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-scripts\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.443604 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj7bx\" (UniqueName: \"kubernetes.io/projected/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-kube-api-access-tj7bx\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.443638 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data-custom\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.443693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.443740 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-logs\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.443804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.443880 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.444218 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.444268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-logs\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.447594 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-scripts\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.447935 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.448008 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.449073 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data-custom\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.471587 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj7bx\" (UniqueName: \"kubernetes.io/projected/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-kube-api-access-tj7bx\") pod \"cinder-api-0\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " pod="openstack/cinder-api-0" Feb 19 23:03:17 crc kubenswrapper[4771]: I0219 23:03:17.564284 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:18 crc kubenswrapper[4771]: I0219 23:03:17.951163 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fbb99998f-hqnp6"] Feb 19 23:03:18 crc kubenswrapper[4771]: I0219 23:03:18.070805 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:18 crc kubenswrapper[4771]: I0219 23:03:18.761887 4771 generic.go:334] "Generic (PLEG): container finished" podID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" containerID="deadf83b521b793eda284f746a9f5b208fa02f047212785f0e1b02edff84924f" exitCode=0 Feb 19 23:03:18 crc kubenswrapper[4771]: I0219 23:03:18.762158 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" event={"ID":"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac","Type":"ContainerDied","Data":"deadf83b521b793eda284f746a9f5b208fa02f047212785f0e1b02edff84924f"} Feb 19 23:03:18 crc kubenswrapper[4771]: I0219 23:03:18.762184 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" event={"ID":"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac","Type":"ContainerStarted","Data":"8d0849cbf20df0111e610b4cf467ce8f7e4c18a710f2a46bc6c21d1a36705324"} Feb 19 23:03:18 crc kubenswrapper[4771]: I0219 23:03:18.764783 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"811b0e0c-b8c8-4e4a-a625-51a6f45629c8","Type":"ContainerStarted","Data":"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e"} Feb 19 23:03:18 crc kubenswrapper[4771]: I0219 23:03:18.764826 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"811b0e0c-b8c8-4e4a-a625-51a6f45629c8","Type":"ContainerStarted","Data":"4cd7caa54579a6ebf3e787107ce59837d65827e22f557673ceba5998bd489125"} Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.573586 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.574138 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.643248 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.774115 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" event={"ID":"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac","Type":"ContainerStarted","Data":"8e18567b67030bce246d306805750f317818321ebf73d735096b436fe802fe49"} Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.774279 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.775549 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"811b0e0c-b8c8-4e4a-a625-51a6f45629c8","Type":"ContainerStarted","Data":"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c"} Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.791141 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" podStartSLOduration=2.791126364 podStartE2EDuration="2.791126364s" podCreationTimestamp="2026-02-19 23:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:03:19.790456375 +0000 UTC m=+5700.061898865" watchObservedRunningTime="2026-02-19 23:03:19.791126364 +0000 UTC m=+5700.062568834" Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.810638 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.810621834 
podStartE2EDuration="2.810621834s" podCreationTimestamp="2026-02-19 23:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:03:19.808191339 +0000 UTC m=+5700.079633829" watchObservedRunningTime="2026-02-19 23:03:19.810621834 +0000 UTC m=+5700.082064294" Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.823860 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:19 crc kubenswrapper[4771]: I0219 23:03:19.883042 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhhc8"] Feb 19 23:03:20 crc kubenswrapper[4771]: I0219 23:03:20.126246 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:20 crc kubenswrapper[4771]: I0219 23:03:20.781925 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 23:03:21 crc kubenswrapper[4771]: I0219 23:03:21.792121 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dhhc8" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerName="registry-server" containerID="cri-o://c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8" gracePeriod=2 Feb 19 23:03:21 crc kubenswrapper[4771]: I0219 23:03:21.792370 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerName="cinder-api" containerID="cri-o://736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c" gracePeriod=30 Feb 19 23:03:21 crc kubenswrapper[4771]: I0219 23:03:21.792340 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerName="cinder-api-log" 
containerID="cri-o://65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e" gracePeriod=30 Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.358501 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.448809 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.475141 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-utilities\") pod \"6f4d0a7b-999c-4285-ba3b-220837394af3\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.475318 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-catalog-content\") pod \"6f4d0a7b-999c-4285-ba3b-220837394af3\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.475499 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89kpq\" (UniqueName: \"kubernetes.io/projected/6f4d0a7b-999c-4285-ba3b-220837394af3-kube-api-access-89kpq\") pod \"6f4d0a7b-999c-4285-ba3b-220837394af3\" (UID: \"6f4d0a7b-999c-4285-ba3b-220837394af3\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.477831 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-utilities" (OuterVolumeSpecName: "utilities") pod "6f4d0a7b-999c-4285-ba3b-220837394af3" (UID: "6f4d0a7b-999c-4285-ba3b-220837394af3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.483273 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4d0a7b-999c-4285-ba3b-220837394af3-kube-api-access-89kpq" (OuterVolumeSpecName: "kube-api-access-89kpq") pod "6f4d0a7b-999c-4285-ba3b-220837394af3" (UID: "6f4d0a7b-999c-4285-ba3b-220837394af3"). InnerVolumeSpecName "kube-api-access-89kpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.514528 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f4d0a7b-999c-4285-ba3b-220837394af3" (UID: "6f4d0a7b-999c-4285-ba3b-220837394af3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.577085 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-scripts\") pod \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.577283 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj7bx\" (UniqueName: \"kubernetes.io/projected/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-kube-api-access-tj7bx\") pod \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.577315 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-etc-machine-id\") pod \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\" (UID: 
\"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.577427 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data\") pod \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.577463 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-logs\") pod \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.577502 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-combined-ca-bundle\") pod \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.577561 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data-custom\") pod \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\" (UID: \"811b0e0c-b8c8-4e4a-a625-51a6f45629c8\") " Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.577829 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "811b0e0c-b8c8-4e4a-a625-51a6f45629c8" (UID: "811b0e0c-b8c8-4e4a-a625-51a6f45629c8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.578123 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-logs" (OuterVolumeSpecName: "logs") pod "811b0e0c-b8c8-4e4a-a625-51a6f45629c8" (UID: "811b0e0c-b8c8-4e4a-a625-51a6f45629c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.578469 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.578493 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.578507 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.578518 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4d0a7b-999c-4285-ba3b-220837394af3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.578531 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89kpq\" (UniqueName: \"kubernetes.io/projected/6f4d0a7b-999c-4285-ba3b-220837394af3-kube-api-access-89kpq\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.580647 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-kube-api-access-tj7bx" (OuterVolumeSpecName: "kube-api-access-tj7bx") pod "811b0e0c-b8c8-4e4a-a625-51a6f45629c8" (UID: "811b0e0c-b8c8-4e4a-a625-51a6f45629c8"). InnerVolumeSpecName "kube-api-access-tj7bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.581266 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "811b0e0c-b8c8-4e4a-a625-51a6f45629c8" (UID: "811b0e0c-b8c8-4e4a-a625-51a6f45629c8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.581909 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-scripts" (OuterVolumeSpecName: "scripts") pod "811b0e0c-b8c8-4e4a-a625-51a6f45629c8" (UID: "811b0e0c-b8c8-4e4a-a625-51a6f45629c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.620389 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data" (OuterVolumeSpecName: "config-data") pod "811b0e0c-b8c8-4e4a-a625-51a6f45629c8" (UID: "811b0e0c-b8c8-4e4a-a625-51a6f45629c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.621802 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "811b0e0c-b8c8-4e4a-a625-51a6f45629c8" (UID: "811b0e0c-b8c8-4e4a-a625-51a6f45629c8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.681171 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.681209 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.681226 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.681237 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.681251 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj7bx\" (UniqueName: \"kubernetes.io/projected/811b0e0c-b8c8-4e4a-a625-51a6f45629c8-kube-api-access-tj7bx\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.809924 4771 generic.go:334] "Generic (PLEG): container finished" podID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerID="c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8" exitCode=0 Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.810150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhhc8" 
event={"ID":"6f4d0a7b-999c-4285-ba3b-220837394af3","Type":"ContainerDied","Data":"c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8"} Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.810193 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhhc8" event={"ID":"6f4d0a7b-999c-4285-ba3b-220837394af3","Type":"ContainerDied","Data":"fd28d0623f3368829053885113e7c71532a1400b9929eca9ced5d405fbfd525d"} Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.810224 4771 scope.go:117] "RemoveContainer" containerID="c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.810395 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhhc8" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.824860 4771 generic.go:334] "Generic (PLEG): container finished" podID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerID="736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c" exitCode=0 Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.824911 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.824943 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"811b0e0c-b8c8-4e4a-a625-51a6f45629c8","Type":"ContainerDied","Data":"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c"} Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.824990 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"811b0e0c-b8c8-4e4a-a625-51a6f45629c8","Type":"ContainerDied","Data":"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e"} Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.824912 4771 generic.go:334] "Generic (PLEG): container finished" podID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerID="65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e" exitCode=143 Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.825045 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"811b0e0c-b8c8-4e4a-a625-51a6f45629c8","Type":"ContainerDied","Data":"4cd7caa54579a6ebf3e787107ce59837d65827e22f557673ceba5998bd489125"} Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.853128 4771 scope.go:117] "RemoveContainer" containerID="79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.881742 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhhc8"] Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.891435 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhhc8"] Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.903689 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.905299 4771 scope.go:117] "RemoveContainer" 
containerID="1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.917592 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940084 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:22 crc kubenswrapper[4771]: E0219 23:03:22.940526 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerName="registry-server" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940543 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerName="registry-server" Feb 19 23:03:22 crc kubenswrapper[4771]: E0219 23:03:22.940556 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerName="cinder-api" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940562 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerName="cinder-api" Feb 19 23:03:22 crc kubenswrapper[4771]: E0219 23:03:22.940577 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerName="cinder-api-log" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940583 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerName="cinder-api-log" Feb 19 23:03:22 crc kubenswrapper[4771]: E0219 23:03:22.940591 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerName="extract-content" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940597 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerName="extract-content" Feb 19 23:03:22 crc kubenswrapper[4771]: E0219 
23:03:22.940610 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerName="extract-utilities" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940616 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerName="extract-utilities" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940789 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerName="cinder-api-log" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940801 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" containerName="registry-server" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.940811 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" containerName="cinder-api" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.941708 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.945906 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.946085 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.946232 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.946348 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.946459 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.946553 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xztbs" Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.968593 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:22 crc kubenswrapper[4771]: I0219 23:03:22.998698 4771 scope.go:117] "RemoveContainer" containerID="c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8" Feb 19 23:03:22 crc kubenswrapper[4771]: E0219 23:03:22.999936 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8\": container with ID starting with c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8 not found: ID does not exist" containerID="c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:22.999985 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8"} err="failed to get container status \"c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8\": rpc error: code = NotFound desc = could not find container \"c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8\": container with ID starting with c12c059bcfa67413f915524434a29c392ae23687a829f181001d215648ef7db8 not found: ID does not exist" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.001034 4771 scope.go:117] "RemoveContainer" containerID="79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244" Feb 19 23:03:23 crc kubenswrapper[4771]: E0219 23:03:23.001361 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244\": container with ID starting with 79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244 not found: ID does not exist" containerID="79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.001398 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244"} err="failed to get container status \"79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244\": rpc error: code = NotFound desc = could not find container \"79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244\": container with ID starting with 79a2e1f32b23d6d11af70d6f3053856b0cec9a36b1a94b2905d270a8851a2244 not found: ID does not exist" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.001431 4771 scope.go:117] "RemoveContainer" containerID="1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7" Feb 19 23:03:23 crc kubenswrapper[4771]: E0219 23:03:23.002000 4771 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7\": container with ID starting with 1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7 not found: ID does not exist" containerID="1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.002066 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7"} err="failed to get container status \"1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7\": rpc error: code = NotFound desc = could not find container \"1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7\": container with ID starting with 1dde7395598717395e02f8cdd53bc55ef14bfbfce143dbeacb6bdc365b8210d7 not found: ID does not exist" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.002100 4771 scope.go:117] "RemoveContainer" containerID="736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.023541 4771 scope.go:117] "RemoveContainer" containerID="65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.040794 4771 scope.go:117] "RemoveContainer" containerID="736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c" Feb 19 23:03:23 crc kubenswrapper[4771]: E0219 23:03:23.041194 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c\": container with ID starting with 736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c not found: ID does not exist" containerID="736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 
23:03:23.041244 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c"} err="failed to get container status \"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c\": rpc error: code = NotFound desc = could not find container \"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c\": container with ID starting with 736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c not found: ID does not exist" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.041279 4771 scope.go:117] "RemoveContainer" containerID="65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e" Feb 19 23:03:23 crc kubenswrapper[4771]: E0219 23:03:23.041858 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e\": container with ID starting with 65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e not found: ID does not exist" containerID="65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.041905 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e"} err="failed to get container status \"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e\": rpc error: code = NotFound desc = could not find container \"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e\": container with ID starting with 65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e not found: ID does not exist" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.041939 4771 scope.go:117] "RemoveContainer" containerID="736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c" Feb 19 23:03:23 crc 
kubenswrapper[4771]: I0219 23:03:23.042397 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c"} err="failed to get container status \"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c\": rpc error: code = NotFound desc = could not find container \"736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c\": container with ID starting with 736fa1185e77b288c6b6ced1c9f3f86af92fc88de217d06dd232b039d2d33d5c not found: ID does not exist" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.042428 4771 scope.go:117] "RemoveContainer" containerID="65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.042724 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e"} err="failed to get container status \"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e\": rpc error: code = NotFound desc = could not find container \"65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e\": container with ID starting with 65c29afcd00e1b395ba65dc1740abcd5b728395ba584e925c436fe825d2eb66e not found: ID does not exist" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.092564 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data-custom\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.092658 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.092693 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.092754 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlpdz\" (UniqueName: \"kubernetes.io/projected/7159c06e-b766-41a7-ac9b-e64c85a3e496-kube-api-access-hlpdz\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.092788 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.092871 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7159c06e-b766-41a7-ac9b-e64c85a3e496-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.092910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7159c06e-b766-41a7-ac9b-e64c85a3e496-logs\") pod \"cinder-api-0\" (UID: 
\"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.092958 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-scripts\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.093184 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.194726 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data-custom\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.194792 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.194818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.194856 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hlpdz\" (UniqueName: \"kubernetes.io/projected/7159c06e-b766-41a7-ac9b-e64c85a3e496-kube-api-access-hlpdz\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.194874 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.194905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7159c06e-b766-41a7-ac9b-e64c85a3e496-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.194926 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7159c06e-b766-41a7-ac9b-e64c85a3e496-logs\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.194953 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-scripts\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.195003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") 
" pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.195529 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7159c06e-b766-41a7-ac9b-e64c85a3e496-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.195581 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7159c06e-b766-41a7-ac9b-e64c85a3e496-logs\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.199414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.199813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-scripts\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.199879 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.203875 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data\") pod 
\"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.207181 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data-custom\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.208252 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.216912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlpdz\" (UniqueName: \"kubernetes.io/projected/7159c06e-b766-41a7-ac9b-e64c85a3e496-kube-api-access-hlpdz\") pod \"cinder-api-0\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.306392 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:23 crc kubenswrapper[4771]: W0219 23:03:23.815741 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7159c06e_b766_41a7_ac9b_e64c85a3e496.slice/crio-68cccb8c44b4a492f51b1eb75db83ac3ffc7ebc6d858f4d696df7b9e5cbc467d WatchSource:0}: Error finding container 68cccb8c44b4a492f51b1eb75db83ac3ffc7ebc6d858f4d696df7b9e5cbc467d: Status 404 returned error can't find the container with id 68cccb8c44b4a492f51b1eb75db83ac3ffc7ebc6d858f4d696df7b9e5cbc467d Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.816261 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:23 crc kubenswrapper[4771]: I0219 23:03:23.842901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7159c06e-b766-41a7-ac9b-e64c85a3e496","Type":"ContainerStarted","Data":"68cccb8c44b4a492f51b1eb75db83ac3ffc7ebc6d858f4d696df7b9e5cbc467d"} Feb 19 23:03:24 crc kubenswrapper[4771]: I0219 23:03:24.449161 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4d0a7b-999c-4285-ba3b-220837394af3" path="/var/lib/kubelet/pods/6f4d0a7b-999c-4285-ba3b-220837394af3/volumes" Feb 19 23:03:24 crc kubenswrapper[4771]: I0219 23:03:24.450970 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811b0e0c-b8c8-4e4a-a625-51a6f45629c8" path="/var/lib/kubelet/pods/811b0e0c-b8c8-4e4a-a625-51a6f45629c8/volumes" Feb 19 23:03:24 crc kubenswrapper[4771]: I0219 23:03:24.857430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7159c06e-b766-41a7-ac9b-e64c85a3e496","Type":"ContainerStarted","Data":"787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928"} Feb 19 23:03:25 crc kubenswrapper[4771]: I0219 23:03:25.870565 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"7159c06e-b766-41a7-ac9b-e64c85a3e496","Type":"ContainerStarted","Data":"2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03"} Feb 19 23:03:25 crc kubenswrapper[4771]: I0219 23:03:25.871343 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 23:03:25 crc kubenswrapper[4771]: I0219 23:03:25.910518 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.910462522 podStartE2EDuration="3.910462522s" podCreationTimestamp="2026-02-19 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:03:25.898878983 +0000 UTC m=+5706.170321493" watchObservedRunningTime="2026-02-19 23:03:25.910462522 +0000 UTC m=+5706.181905032" Feb 19 23:03:27 crc kubenswrapper[4771]: I0219 23:03:27.433280 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" Feb 19 23:03:27 crc kubenswrapper[4771]: I0219 23:03:27.560807 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c8b56b79-llhfr"] Feb 19 23:03:27 crc kubenswrapper[4771]: I0219 23:03:27.561059 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" podUID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" containerName="dnsmasq-dns" containerID="cri-o://2614cee41c4ab74bb7a691f4579c67d5c169ba4567b374a683b192a41a0a7be9" gracePeriod=10 Feb 19 23:03:27 crc kubenswrapper[4771]: I0219 23:03:27.922544 4771 generic.go:334] "Generic (PLEG): container finished" podID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" containerID="2614cee41c4ab74bb7a691f4579c67d5c169ba4567b374a683b192a41a0a7be9" exitCode=0 Feb 19 23:03:27 crc kubenswrapper[4771]: I0219 23:03:27.922834 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" 
event={"ID":"a9f7a722-eb78-420a-a27b-77c3cafbfa26","Type":"ContainerDied","Data":"2614cee41c4ab74bb7a691f4579c67d5c169ba4567b374a683b192a41a0a7be9"} Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.055440 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.204235 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb77p\" (UniqueName: \"kubernetes.io/projected/a9f7a722-eb78-420a-a27b-77c3cafbfa26-kube-api-access-gb77p\") pod \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.204309 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-config\") pod \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.204417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-nb\") pod \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.204521 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-sb\") pod \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.204571 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-dns-svc\") 
pod \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\" (UID: \"a9f7a722-eb78-420a-a27b-77c3cafbfa26\") " Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.211113 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f7a722-eb78-420a-a27b-77c3cafbfa26-kube-api-access-gb77p" (OuterVolumeSpecName: "kube-api-access-gb77p") pod "a9f7a722-eb78-420a-a27b-77c3cafbfa26" (UID: "a9f7a722-eb78-420a-a27b-77c3cafbfa26"). InnerVolumeSpecName "kube-api-access-gb77p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.246863 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9f7a722-eb78-420a-a27b-77c3cafbfa26" (UID: "a9f7a722-eb78-420a-a27b-77c3cafbfa26"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.247266 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a9f7a722-eb78-420a-a27b-77c3cafbfa26" (UID: "a9f7a722-eb78-420a-a27b-77c3cafbfa26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.254260 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-config" (OuterVolumeSpecName: "config") pod "a9f7a722-eb78-420a-a27b-77c3cafbfa26" (UID: "a9f7a722-eb78-420a-a27b-77c3cafbfa26"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.259657 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9f7a722-eb78-420a-a27b-77c3cafbfa26" (UID: "a9f7a722-eb78-420a-a27b-77c3cafbfa26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.308193 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.308233 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.308249 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.308263 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb77p\" (UniqueName: \"kubernetes.io/projected/a9f7a722-eb78-420a-a27b-77c3cafbfa26-kube-api-access-gb77p\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.308278 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f7a722-eb78-420a-a27b-77c3cafbfa26-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.939154 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" 
event={"ID":"a9f7a722-eb78-420a-a27b-77c3cafbfa26","Type":"ContainerDied","Data":"b3b4afb7826f4630ab4544a22cac416f85f82762c54abef63ff92ca3251836ac"} Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.939239 4771 scope.go:117] "RemoveContainer" containerID="2614cee41c4ab74bb7a691f4579c67d5c169ba4567b374a683b192a41a0a7be9" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.939251 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c8b56b79-llhfr" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.979973 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c8b56b79-llhfr"] Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.986641 4771 scope.go:117] "RemoveContainer" containerID="6f7268cdef6e11a203b9e0f5baf520e2c5270023de10ebe589a6d6c50f676dc3" Feb 19 23:03:28 crc kubenswrapper[4771]: I0219 23:03:28.992311 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64c8b56b79-llhfr"] Feb 19 23:03:30 crc kubenswrapper[4771]: I0219 23:03:30.458410 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" path="/var/lib/kubelet/pods/a9f7a722-eb78-420a-a27b-77c3cafbfa26/volumes" Feb 19 23:03:35 crc kubenswrapper[4771]: I0219 23:03:35.041672 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 23:03:35 crc kubenswrapper[4771]: I0219 23:03:35.472795 4771 scope.go:117] "RemoveContainer" containerID="5d3bd9f2eb48ede7ab46d1795ce774caf4c52d267590a311874b28d8a84eb867" Feb 19 23:03:50 crc kubenswrapper[4771]: I0219 23:03:50.888494 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qts46"] Feb 19 23:03:50 crc kubenswrapper[4771]: E0219 23:03:50.889583 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" containerName="init" Feb 19 23:03:50 
crc kubenswrapper[4771]: I0219 23:03:50.889598 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" containerName="init" Feb 19 23:03:50 crc kubenswrapper[4771]: E0219 23:03:50.889616 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" containerName="dnsmasq-dns" Feb 19 23:03:50 crc kubenswrapper[4771]: I0219 23:03:50.889624 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" containerName="dnsmasq-dns" Feb 19 23:03:50 crc kubenswrapper[4771]: I0219 23:03:50.889847 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f7a722-eb78-420a-a27b-77c3cafbfa26" containerName="dnsmasq-dns" Feb 19 23:03:50 crc kubenswrapper[4771]: I0219 23:03:50.891333 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:50 crc kubenswrapper[4771]: I0219 23:03:50.906469 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qts46"] Feb 19 23:03:50 crc kubenswrapper[4771]: I0219 23:03:50.993243 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-catalog-content\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:50 crc kubenswrapper[4771]: I0219 23:03:50.993316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-utilities\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:50 crc kubenswrapper[4771]: I0219 
23:03:50.993347 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v69f4\" (UniqueName: \"kubernetes.io/projected/7517f150-0f31-4e5d-80d9-01c9cace48e9-kube-api-access-v69f4\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.095903 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-catalog-content\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.095988 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-utilities\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.096056 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v69f4\" (UniqueName: \"kubernetes.io/projected/7517f150-0f31-4e5d-80d9-01c9cace48e9-kube-api-access-v69f4\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.096616 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-utilities\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 
23:03:51.096952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-catalog-content\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.120162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v69f4\" (UniqueName: \"kubernetes.io/projected/7517f150-0f31-4e5d-80d9-01c9cace48e9-kube-api-access-v69f4\") pod \"certified-operators-qts46\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.177275 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.179520 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.181302 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.185220 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.208649 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.305000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.305464 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.305663 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.305752 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-scripts\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.305889 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d4dc80d-daac-45b9-8931-8b532ccb31c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " 
pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.305948 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44sbg\" (UniqueName: \"kubernetes.io/projected/1d4dc80d-daac-45b9-8931-8b532ccb31c7-kube-api-access-44sbg\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.409819 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.409896 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.409917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-scripts\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.409968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d4dc80d-daac-45b9-8931-8b532ccb31c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.409991 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-44sbg\" (UniqueName: \"kubernetes.io/projected/1d4dc80d-daac-45b9-8931-8b532ccb31c7-kube-api-access-44sbg\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.410030 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.414251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d4dc80d-daac-45b9-8931-8b532ccb31c7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.417983 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-scripts\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.418854 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.422954 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.430831 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.445953 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44sbg\" (UniqueName: \"kubernetes.io/projected/1d4dc80d-daac-45b9-8931-8b532ccb31c7-kube-api-access-44sbg\") pod \"cinder-scheduler-0\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.508811 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.703218 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qts46"] Feb 19 23:03:51 crc kubenswrapper[4771]: I0219 23:03:51.887960 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:03:52 crc kubenswrapper[4771]: I0219 23:03:52.185178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d4dc80d-daac-45b9-8931-8b532ccb31c7","Type":"ContainerStarted","Data":"224cfdaef4185b08985a1f2803a6ef0970bd54e12911f1f2caecd02b5f0354dc"} Feb 19 23:03:52 crc kubenswrapper[4771]: I0219 23:03:52.191994 4771 generic.go:334] "Generic (PLEG): container finished" podID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerID="ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81" exitCode=0 Feb 19 23:03:52 crc kubenswrapper[4771]: I0219 23:03:52.192083 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-qts46" event={"ID":"7517f150-0f31-4e5d-80d9-01c9cace48e9","Type":"ContainerDied","Data":"ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81"} Feb 19 23:03:52 crc kubenswrapper[4771]: I0219 23:03:52.192112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qts46" event={"ID":"7517f150-0f31-4e5d-80d9-01c9cace48e9","Type":"ContainerStarted","Data":"eadf87333f2d9790b439a054ef4a85f59f6175e2a57676d7b4dd86d9c18bb92b"} Feb 19 23:03:52 crc kubenswrapper[4771]: I0219 23:03:52.195403 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:03:52 crc kubenswrapper[4771]: I0219 23:03:52.284430 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:52 crc kubenswrapper[4771]: I0219 23:03:52.284649 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api-log" containerID="cri-o://787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928" gracePeriod=30 Feb 19 23:03:52 crc kubenswrapper[4771]: I0219 23:03:52.284970 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api" containerID="cri-o://2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03" gracePeriod=30 Feb 19 23:03:53 crc kubenswrapper[4771]: I0219 23:03:53.202184 4771 generic.go:334] "Generic (PLEG): container finished" podID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerID="787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928" exitCode=143 Feb 19 23:03:53 crc kubenswrapper[4771]: I0219 23:03:53.202280 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"7159c06e-b766-41a7-ac9b-e64c85a3e496","Type":"ContainerDied","Data":"787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928"} Feb 19 23:03:53 crc kubenswrapper[4771]: I0219 23:03:53.205315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d4dc80d-daac-45b9-8931-8b532ccb31c7","Type":"ContainerStarted","Data":"b258da95301ad0564a3561e7ba692800dc447e4f1335e5c34a50f09bdf46810d"} Feb 19 23:03:53 crc kubenswrapper[4771]: I0219 23:03:53.205390 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d4dc80d-daac-45b9-8931-8b532ccb31c7","Type":"ContainerStarted","Data":"9c2ddbb4eb06b040baa39f7d75647bcd136ce739a918162e5cfaf5f86566082a"} Feb 19 23:03:53 crc kubenswrapper[4771]: I0219 23:03:53.209434 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qts46" event={"ID":"7517f150-0f31-4e5d-80d9-01c9cace48e9","Type":"ContainerStarted","Data":"0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261"} Feb 19 23:03:53 crc kubenswrapper[4771]: I0219 23:03:53.235185 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.235165748 podStartE2EDuration="2.235165748s" podCreationTimestamp="2026-02-19 23:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:03:53.229533228 +0000 UTC m=+5733.500975698" watchObservedRunningTime="2026-02-19 23:03:53.235165748 +0000 UTC m=+5733.506608218" Feb 19 23:03:54 crc kubenswrapper[4771]: I0219 23:03:54.232504 4771 generic.go:334] "Generic (PLEG): container finished" podID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerID="0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261" exitCode=0 Feb 19 23:03:54 crc kubenswrapper[4771]: I0219 23:03:54.232699 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-qts46" event={"ID":"7517f150-0f31-4e5d-80d9-01c9cace48e9","Type":"ContainerDied","Data":"0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261"} Feb 19 23:03:55 crc kubenswrapper[4771]: I0219 23:03:55.249983 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qts46" event={"ID":"7517f150-0f31-4e5d-80d9-01c9cace48e9","Type":"ContainerStarted","Data":"c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e"} Feb 19 23:03:55 crc kubenswrapper[4771]: I0219 23:03:55.286782 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qts46" podStartSLOduration=2.841676487 podStartE2EDuration="5.286753969s" podCreationTimestamp="2026-02-19 23:03:50 +0000 UTC" firstStartedPulling="2026-02-19 23:03:52.194920527 +0000 UTC m=+5732.466363037" lastFinishedPulling="2026-02-19 23:03:54.639998049 +0000 UTC m=+5734.911440519" observedRunningTime="2026-02-19 23:03:55.277571303 +0000 UTC m=+5735.549013873" watchObservedRunningTime="2026-02-19 23:03:55.286753969 +0000 UTC m=+5735.558196479" Feb 19 23:03:55 crc kubenswrapper[4771]: I0219 23:03:55.440185 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.64:8776/healthcheck\": read tcp 10.217.0.2:39980->10.217.1.64:8776: read: connection reset by peer" Feb 19 23:03:55 crc kubenswrapper[4771]: I0219 23:03:55.943603 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.115762 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data-custom\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.115835 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7159c06e-b766-41a7-ac9b-e64c85a3e496-logs\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.115890 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.115908 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7159c06e-b766-41a7-ac9b-e64c85a3e496-etc-machine-id\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.115934 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-scripts\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.116044 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-internal-tls-certs\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.116083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-combined-ca-bundle\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.116142 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlpdz\" (UniqueName: \"kubernetes.io/projected/7159c06e-b766-41a7-ac9b-e64c85a3e496-kube-api-access-hlpdz\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.116191 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-public-tls-certs\") pod \"7159c06e-b766-41a7-ac9b-e64c85a3e496\" (UID: \"7159c06e-b766-41a7-ac9b-e64c85a3e496\") " Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.116352 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7159c06e-b766-41a7-ac9b-e64c85a3e496-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.116678 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7159c06e-b766-41a7-ac9b-e64c85a3e496-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.116799 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7159c06e-b766-41a7-ac9b-e64c85a3e496-logs" (OuterVolumeSpecName: "logs") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.123264 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.128140 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-scripts" (OuterVolumeSpecName: "scripts") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.143010 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7159c06e-b766-41a7-ac9b-e64c85a3e496-kube-api-access-hlpdz" (OuterVolumeSpecName: "kube-api-access-hlpdz") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). 
InnerVolumeSpecName "kube-api-access-hlpdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.165122 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.184415 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.192180 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data" (OuterVolumeSpecName: "config-data") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.198084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7159c06e-b766-41a7-ac9b-e64c85a3e496" (UID: "7159c06e-b766-41a7-ac9b-e64c85a3e496"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.218763 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.218817 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.218829 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7159c06e-b766-41a7-ac9b-e64c85a3e496-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.218839 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.218847 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.218855 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.218878 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7159c06e-b766-41a7-ac9b-e64c85a3e496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.218887 4771 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-hlpdz\" (UniqueName: \"kubernetes.io/projected/7159c06e-b766-41a7-ac9b-e64c85a3e496-kube-api-access-hlpdz\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.261046 4771 generic.go:334] "Generic (PLEG): container finished" podID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerID="2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03" exitCode=0 Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.261844 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.262078 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7159c06e-b766-41a7-ac9b-e64c85a3e496","Type":"ContainerDied","Data":"2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03"} Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.262145 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7159c06e-b766-41a7-ac9b-e64c85a3e496","Type":"ContainerDied","Data":"68cccb8c44b4a492f51b1eb75db83ac3ffc7ebc6d858f4d696df7b9e5cbc467d"} Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.262163 4771 scope.go:117] "RemoveContainer" containerID="2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.298949 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.300657 4771 scope.go:117] "RemoveContainer" containerID="787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.305519 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.327637 4771 scope.go:117] "RemoveContainer" 
containerID="2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.327784 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:56 crc kubenswrapper[4771]: E0219 23:03:56.328141 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api-log" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.328157 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api-log" Feb 19 23:03:56 crc kubenswrapper[4771]: E0219 23:03:56.328189 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.328196 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.328380 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.328395 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" containerName="cinder-api-log" Feb 19 23:03:56 crc kubenswrapper[4771]: E0219 23:03:56.328610 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03\": container with ID starting with 2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03 not found: ID does not exist" containerID="2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.328638 4771 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03"} err="failed to get container status \"2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03\": rpc error: code = NotFound desc = could not find container \"2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03\": container with ID starting with 2df40e0b0b6d976d6a7ebddac566af6915644de2b1470075e3b89e80f843ac03 not found: ID does not exist" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.328657 4771 scope.go:117] "RemoveContainer" containerID="787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928" Feb 19 23:03:56 crc kubenswrapper[4771]: E0219 23:03:56.329281 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928\": container with ID starting with 787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928 not found: ID does not exist" containerID="787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.329309 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928"} err="failed to get container status \"787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928\": rpc error: code = NotFound desc = could not find container \"787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928\": container with ID starting with 787b5e65da55381308f2a8095db589bc799d0065f781cdb79f3aa48eaf4c8928 not found: ID does not exist" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.329615 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.331607 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.331812 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.331941 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.335466 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.421688 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3d47d2-991d-42bc-9789-4a65bfee348f-logs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.421738 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-config-data-custom\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.421794 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-config-data\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.421834 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.421892 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e3d47d2-991d-42bc-9789-4a65bfee348f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.421919 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-scripts\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.421958 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.421983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.422000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzptk\" (UniqueName: \"kubernetes.io/projected/0e3d47d2-991d-42bc-9789-4a65bfee348f-kube-api-access-pzptk\") pod \"cinder-api-0\" 
(UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.447289 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7159c06e-b766-41a7-ac9b-e64c85a3e496" path="/var/lib/kubelet/pods/7159c06e-b766-41a7-ac9b-e64c85a3e496/volumes" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.508971 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523589 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e3d47d2-991d-42bc-9789-4a65bfee348f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-scripts\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523725 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523773 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523776 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e3d47d2-991d-42bc-9789-4a65bfee348f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523809 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzptk\" (UniqueName: \"kubernetes.io/projected/0e3d47d2-991d-42bc-9789-4a65bfee348f-kube-api-access-pzptk\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3d47d2-991d-42bc-9789-4a65bfee348f-logs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-config-data-custom\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.523979 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-config-data\") pod \"cinder-api-0\" (UID: 
\"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.524527 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3d47d2-991d-42bc-9789-4a65bfee348f-logs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.527238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-scripts\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.530245 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-config-data\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.530574 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.530597 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.530843 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-config-data-custom\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.531142 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d47d2-991d-42bc-9789-4a65bfee348f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.547184 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzptk\" (UniqueName: \"kubernetes.io/projected/0e3d47d2-991d-42bc-9789-4a65bfee348f-kube-api-access-pzptk\") pod \"cinder-api-0\" (UID: \"0e3d47d2-991d-42bc-9789-4a65bfee348f\") " pod="openstack/cinder-api-0" Feb 19 23:03:56 crc kubenswrapper[4771]: I0219 23:03:56.643829 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:03:57 crc kubenswrapper[4771]: I0219 23:03:57.185063 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:03:57 crc kubenswrapper[4771]: W0219 23:03:57.189953 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e3d47d2_991d_42bc_9789_4a65bfee348f.slice/crio-9d270120007feda4c3f1096f770416be7b408437bb07fd8a50e0c8446bf17452 WatchSource:0}: Error finding container 9d270120007feda4c3f1096f770416be7b408437bb07fd8a50e0c8446bf17452: Status 404 returned error can't find the container with id 9d270120007feda4c3f1096f770416be7b408437bb07fd8a50e0c8446bf17452 Feb 19 23:03:57 crc kubenswrapper[4771]: I0219 23:03:57.273372 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e3d47d2-991d-42bc-9789-4a65bfee348f","Type":"ContainerStarted","Data":"9d270120007feda4c3f1096f770416be7b408437bb07fd8a50e0c8446bf17452"} Feb 19 23:03:58 crc kubenswrapper[4771]: I0219 23:03:58.301782 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e3d47d2-991d-42bc-9789-4a65bfee348f","Type":"ContainerStarted","Data":"77a8eb3906a87e5c427dbd64030e45742cbf2083fbd8b8444765b8416f07665c"} Feb 19 23:03:59 crc kubenswrapper[4771]: I0219 23:03:59.316099 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e3d47d2-991d-42bc-9789-4a65bfee348f","Type":"ContainerStarted","Data":"f14c4b27d996c18caf1f538ce76bc62c37acd4bc1f498e922cb00ba92f25043a"} Feb 19 23:03:59 crc kubenswrapper[4771]: I0219 23:03:59.316271 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 23:03:59 crc kubenswrapper[4771]: I0219 23:03:59.369268 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.36924069 
podStartE2EDuration="3.36924069s" podCreationTimestamp="2026-02-19 23:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:03:59.342951638 +0000 UTC m=+5739.614394138" watchObservedRunningTime="2026-02-19 23:03:59.36924069 +0000 UTC m=+5739.640683200" Feb 19 23:04:01 crc kubenswrapper[4771]: I0219 23:04:01.209497 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:04:01 crc kubenswrapper[4771]: I0219 23:04:01.209568 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:04:01 crc kubenswrapper[4771]: I0219 23:04:01.293317 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:04:01 crc kubenswrapper[4771]: I0219 23:04:01.415351 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:04:01 crc kubenswrapper[4771]: I0219 23:04:01.565262 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qts46"] Feb 19 23:04:01 crc kubenswrapper[4771]: I0219 23:04:01.735955 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 23:04:01 crc kubenswrapper[4771]: I0219 23:04:01.799558 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:04:02 crc kubenswrapper[4771]: I0219 23:04:02.358553 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerName="cinder-scheduler" containerID="cri-o://9c2ddbb4eb06b040baa39f7d75647bcd136ce739a918162e5cfaf5f86566082a" gracePeriod=30 Feb 19 23:04:02 crc 
kubenswrapper[4771]: I0219 23:04:02.361290 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerName="probe" containerID="cri-o://b258da95301ad0564a3561e7ba692800dc447e4f1335e5c34a50f09bdf46810d" gracePeriod=30 Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.369704 4771 generic.go:334] "Generic (PLEG): container finished" podID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerID="b258da95301ad0564a3561e7ba692800dc447e4f1335e5c34a50f09bdf46810d" exitCode=0 Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.369976 4771 generic.go:334] "Generic (PLEG): container finished" podID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerID="9c2ddbb4eb06b040baa39f7d75647bcd136ce739a918162e5cfaf5f86566082a" exitCode=0 Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.370263 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qts46" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerName="registry-server" containerID="cri-o://c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e" gracePeriod=2 Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.370697 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d4dc80d-daac-45b9-8931-8b532ccb31c7","Type":"ContainerDied","Data":"b258da95301ad0564a3561e7ba692800dc447e4f1335e5c34a50f09bdf46810d"} Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.370751 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d4dc80d-daac-45b9-8931-8b532ccb31c7","Type":"ContainerDied","Data":"9c2ddbb4eb06b040baa39f7d75647bcd136ce739a918162e5cfaf5f86566082a"} Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.742773 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.790343 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44sbg\" (UniqueName: \"kubernetes.io/projected/1d4dc80d-daac-45b9-8931-8b532ccb31c7-kube-api-access-44sbg\") pod \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.790399 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-scripts\") pod \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.790418 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data-custom\") pod \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.790459 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-combined-ca-bundle\") pod \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.790490 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d4dc80d-daac-45b9-8931-8b532ccb31c7-etc-machine-id\") pod \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.790533 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data\") pod \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\" (UID: \"1d4dc80d-daac-45b9-8931-8b532ccb31c7\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.792082 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d4dc80d-daac-45b9-8931-8b532ccb31c7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1d4dc80d-daac-45b9-8931-8b532ccb31c7" (UID: "1d4dc80d-daac-45b9-8931-8b532ccb31c7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.797483 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-scripts" (OuterVolumeSpecName: "scripts") pod "1d4dc80d-daac-45b9-8931-8b532ccb31c7" (UID: "1d4dc80d-daac-45b9-8931-8b532ccb31c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.798368 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d4dc80d-daac-45b9-8931-8b532ccb31c7" (UID: "1d4dc80d-daac-45b9-8931-8b532ccb31c7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.799221 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d4dc80d-daac-45b9-8931-8b532ccb31c7-kube-api-access-44sbg" (OuterVolumeSpecName: "kube-api-access-44sbg") pod "1d4dc80d-daac-45b9-8931-8b532ccb31c7" (UID: "1d4dc80d-daac-45b9-8931-8b532ccb31c7"). InnerVolumeSpecName "kube-api-access-44sbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.813121 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.852687 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d4dc80d-daac-45b9-8931-8b532ccb31c7" (UID: "1d4dc80d-daac-45b9-8931-8b532ccb31c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.891909 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-utilities\") pod \"7517f150-0f31-4e5d-80d9-01c9cace48e9\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.892242 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-catalog-content\") pod \"7517f150-0f31-4e5d-80d9-01c9cace48e9\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.893442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-utilities" (OuterVolumeSpecName: "utilities") pod "7517f150-0f31-4e5d-80d9-01c9cace48e9" (UID: "7517f150-0f31-4e5d-80d9-01c9cace48e9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.892283 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v69f4\" (UniqueName: \"kubernetes.io/projected/7517f150-0f31-4e5d-80d9-01c9cace48e9-kube-api-access-v69f4\") pod \"7517f150-0f31-4e5d-80d9-01c9cace48e9\" (UID: \"7517f150-0f31-4e5d-80d9-01c9cace48e9\") " Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.894609 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44sbg\" (UniqueName: \"kubernetes.io/projected/1d4dc80d-daac-45b9-8931-8b532ccb31c7-kube-api-access-44sbg\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.894634 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.894714 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.894729 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.894741 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d4dc80d-daac-45b9-8931-8b532ccb31c7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.894754 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-utilities\") on 
node \"crc\" DevicePath \"\"" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.895690 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7517f150-0f31-4e5d-80d9-01c9cace48e9-kube-api-access-v69f4" (OuterVolumeSpecName: "kube-api-access-v69f4") pod "7517f150-0f31-4e5d-80d9-01c9cace48e9" (UID: "7517f150-0f31-4e5d-80d9-01c9cace48e9"). InnerVolumeSpecName "kube-api-access-v69f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.896538 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data" (OuterVolumeSpecName: "config-data") pod "1d4dc80d-daac-45b9-8931-8b532ccb31c7" (UID: "1d4dc80d-daac-45b9-8931-8b532ccb31c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.943503 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7517f150-0f31-4e5d-80d9-01c9cace48e9" (UID: "7517f150-0f31-4e5d-80d9-01c9cace48e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.996950 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7517f150-0f31-4e5d-80d9-01c9cace48e9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.996984 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v69f4\" (UniqueName: \"kubernetes.io/projected/7517f150-0f31-4e5d-80d9-01c9cace48e9-kube-api-access-v69f4\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:03 crc kubenswrapper[4771]: I0219 23:04:03.996993 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d4dc80d-daac-45b9-8931-8b532ccb31c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.385605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d4dc80d-daac-45b9-8931-8b532ccb31c7","Type":"ContainerDied","Data":"224cfdaef4185b08985a1f2803a6ef0970bd54e12911f1f2caecd02b5f0354dc"} Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.385680 4771 scope.go:117] "RemoveContainer" containerID="b258da95301ad0564a3561e7ba692800dc447e4f1335e5c34a50f09bdf46810d" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.385808 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.396576 4771 generic.go:334] "Generic (PLEG): container finished" podID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerID="c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e" exitCode=0 Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.396629 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qts46" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.396636 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qts46" event={"ID":"7517f150-0f31-4e5d-80d9-01c9cace48e9","Type":"ContainerDied","Data":"c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e"} Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.396798 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qts46" event={"ID":"7517f150-0f31-4e5d-80d9-01c9cace48e9","Type":"ContainerDied","Data":"eadf87333f2d9790b439a054ef4a85f59f6175e2a57676d7b4dd86d9c18bb92b"} Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.434046 4771 scope.go:117] "RemoveContainer" containerID="9c2ddbb4eb06b040baa39f7d75647bcd136ce739a918162e5cfaf5f86566082a" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.473631 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.481126 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.497738 4771 scope.go:117] "RemoveContainer" containerID="c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.504384 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qts46"] Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.513886 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:04:04 crc kubenswrapper[4771]: E0219 23:04:04.514265 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerName="probe" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.514285 4771 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerName="probe" Feb 19 23:04:04 crc kubenswrapper[4771]: E0219 23:04:04.514300 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerName="registry-server" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.514306 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerName="registry-server" Feb 19 23:04:04 crc kubenswrapper[4771]: E0219 23:04:04.514316 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerName="extract-content" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.514322 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerName="extract-content" Feb 19 23:04:04 crc kubenswrapper[4771]: E0219 23:04:04.514338 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerName="extract-utilities" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.514344 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerName="extract-utilities" Feb 19 23:04:04 crc kubenswrapper[4771]: E0219 23:04:04.514364 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerName="cinder-scheduler" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.514370 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerName="cinder-scheduler" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.514514 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" containerName="registry-server" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.514534 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerName="probe" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.514547 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" containerName="cinder-scheduler" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.515365 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.526597 4771 scope.go:117] "RemoveContainer" containerID="0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.527865 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qts46"] Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.528050 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.547153 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.563639 4771 scope.go:117] "RemoveContainer" containerID="ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.604915 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.605099 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00ef58df-5158-4cd3-9280-7e5bb9a054cf-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.605231 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.605355 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.605411 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.605443 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt5dm\" (UniqueName: \"kubernetes.io/projected/00ef58df-5158-4cd3-9280-7e5bb9a054cf-kube-api-access-pt5dm\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.612635 4771 scope.go:117] "RemoveContainer" containerID="c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e" Feb 19 23:04:04 crc kubenswrapper[4771]: E0219 23:04:04.613089 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e\": container with ID starting with c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e not found: ID does not exist" containerID="c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.613128 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e"} err="failed to get container status \"c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e\": rpc error: code = NotFound desc = could not find container \"c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e\": container with ID starting with c97c3b7538d1c94f5c88683675ed05029c033d8b52ecd9b00de6dab9fa32dd7e not found: ID does not exist" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.613153 4771 scope.go:117] "RemoveContainer" containerID="0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261" Feb 19 23:04:04 crc kubenswrapper[4771]: E0219 23:04:04.613902 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261\": container with ID starting with 0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261 not found: ID does not exist" containerID="0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.613988 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261"} err="failed to get container status \"0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261\": rpc error: code = NotFound desc = could not find container 
\"0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261\": container with ID starting with 0bf4a3940212d34583acbda495f1ab3165ad5268974b35905f1e805079ae6261 not found: ID does not exist" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.614072 4771 scope.go:117] "RemoveContainer" containerID="ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81" Feb 19 23:04:04 crc kubenswrapper[4771]: E0219 23:04:04.614519 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81\": container with ID starting with ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81 not found: ID does not exist" containerID="ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.614551 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81"} err="failed to get container status \"ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81\": rpc error: code = NotFound desc = could not find container \"ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81\": container with ID starting with ef146f96c7e680cff6a4b790ed2e3a1f909087accb06294cd4d92a336be33c81 not found: ID does not exist" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.707197 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.707287 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/00ef58df-5158-4cd3-9280-7e5bb9a054cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.707376 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.707516 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00ef58df-5158-4cd3-9280-7e5bb9a054cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.707414 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.708642 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.708712 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt5dm\" (UniqueName: \"kubernetes.io/projected/00ef58df-5158-4cd3-9280-7e5bb9a054cf-kube-api-access-pt5dm\") pod \"cinder-scheduler-0\" (UID: 
\"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.711959 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.714266 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.716464 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.718479 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ef58df-5158-4cd3-9280-7e5bb9a054cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.725802 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt5dm\" (UniqueName: \"kubernetes.io/projected/00ef58df-5158-4cd3-9280-7e5bb9a054cf-kube-api-access-pt5dm\") pod \"cinder-scheduler-0\" (UID: \"00ef58df-5158-4cd3-9280-7e5bb9a054cf\") " pod="openstack/cinder-scheduler-0" Feb 19 23:04:04 crc kubenswrapper[4771]: I0219 23:04:04.841500 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:04:05 crc kubenswrapper[4771]: I0219 23:04:05.140879 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:04:05 crc kubenswrapper[4771]: W0219 23:04:05.149503 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00ef58df_5158_4cd3_9280_7e5bb9a054cf.slice/crio-3ffaa7980d02361980970fa9b81a408b1a67aac3af6f7232511ef3fb85a9956e WatchSource:0}: Error finding container 3ffaa7980d02361980970fa9b81a408b1a67aac3af6f7232511ef3fb85a9956e: Status 404 returned error can't find the container with id 3ffaa7980d02361980970fa9b81a408b1a67aac3af6f7232511ef3fb85a9956e Feb 19 23:04:05 crc kubenswrapper[4771]: I0219 23:04:05.407861 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00ef58df-5158-4cd3-9280-7e5bb9a054cf","Type":"ContainerStarted","Data":"3ffaa7980d02361980970fa9b81a408b1a67aac3af6f7232511ef3fb85a9956e"} Feb 19 23:04:06 crc kubenswrapper[4771]: I0219 23:04:06.425013 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00ef58df-5158-4cd3-9280-7e5bb9a054cf","Type":"ContainerStarted","Data":"5b10935c68cd846667b709a95908a7fe6539532835f7bb12dc36880e8c06a58f"} Feb 19 23:04:06 crc kubenswrapper[4771]: I0219 23:04:06.449418 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d4dc80d-daac-45b9-8931-8b532ccb31c7" path="/var/lib/kubelet/pods/1d4dc80d-daac-45b9-8931-8b532ccb31c7/volumes" Feb 19 23:04:06 crc kubenswrapper[4771]: I0219 23:04:06.451460 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7517f150-0f31-4e5d-80d9-01c9cace48e9" path="/var/lib/kubelet/pods/7517f150-0f31-4e5d-80d9-01c9cace48e9/volumes" Feb 19 23:04:07 crc kubenswrapper[4771]: I0219 23:04:07.439593 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"00ef58df-5158-4cd3-9280-7e5bb9a054cf","Type":"ContainerStarted","Data":"75028965b514e752759c3c123c6a12bf95c3ef625b6a6b82f1c3051dcb224c34"} Feb 19 23:04:07 crc kubenswrapper[4771]: I0219 23:04:07.488146 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.487999148 podStartE2EDuration="3.487999148s" podCreationTimestamp="2026-02-19 23:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:07.460969417 +0000 UTC m=+5747.732411907" watchObservedRunningTime="2026-02-19 23:04:07.487999148 +0000 UTC m=+5747.759441658" Feb 19 23:04:08 crc kubenswrapper[4771]: I0219 23:04:08.348477 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 23:04:09 crc kubenswrapper[4771]: I0219 23:04:09.841935 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 23:04:12 crc kubenswrapper[4771]: I0219 23:04:12.956506 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:04:12 crc kubenswrapper[4771]: I0219 23:04:12.957208 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:04:15 crc kubenswrapper[4771]: I0219 23:04:15.065674 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.756388 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kkr59"]
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.758429 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.772112 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kkr59"]
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.856288 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2da2-account-create-update-rjbps"]
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.857503 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.860000 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.866564 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2da2-account-create-update-rjbps"]
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.893457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367ee05c-cc9f-4e74-b756-4192639e94a9-operator-scripts\") pod \"glance-db-create-kkr59\" (UID: \"367ee05c-cc9f-4e74-b756-4192639e94a9\") " pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.893561 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j72s5\" (UniqueName: \"kubernetes.io/projected/367ee05c-cc9f-4e74-b756-4192639e94a9-kube-api-access-j72s5\") pod \"glance-db-create-kkr59\" (UID: \"367ee05c-cc9f-4e74-b756-4192639e94a9\") " pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.995290 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-operator-scripts\") pod \"glance-2da2-account-create-update-rjbps\" (UID: \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\") " pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.995390 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367ee05c-cc9f-4e74-b756-4192639e94a9-operator-scripts\") pod \"glance-db-create-kkr59\" (UID: \"367ee05c-cc9f-4e74-b756-4192639e94a9\") " pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.995430 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4v5\" (UniqueName: \"kubernetes.io/projected/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-kube-api-access-vp4v5\") pod \"glance-2da2-account-create-update-rjbps\" (UID: \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\") " pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.995521 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j72s5\" (UniqueName: \"kubernetes.io/projected/367ee05c-cc9f-4e74-b756-4192639e94a9-kube-api-access-j72s5\") pod \"glance-db-create-kkr59\" (UID: \"367ee05c-cc9f-4e74-b756-4192639e94a9\") " pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:17 crc kubenswrapper[4771]: I0219 23:04:17.996896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367ee05c-cc9f-4e74-b756-4192639e94a9-operator-scripts\") pod \"glance-db-create-kkr59\" (UID: \"367ee05c-cc9f-4e74-b756-4192639e94a9\") " pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.015901 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j72s5\" (UniqueName: \"kubernetes.io/projected/367ee05c-cc9f-4e74-b756-4192639e94a9-kube-api-access-j72s5\") pod \"glance-db-create-kkr59\" (UID: \"367ee05c-cc9f-4e74-b756-4192639e94a9\") " pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.079983 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.097542 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-operator-scripts\") pod \"glance-2da2-account-create-update-rjbps\" (UID: \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\") " pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.097610 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4v5\" (UniqueName: \"kubernetes.io/projected/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-kube-api-access-vp4v5\") pod \"glance-2da2-account-create-update-rjbps\" (UID: \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\") " pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.098769 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-operator-scripts\") pod \"glance-2da2-account-create-update-rjbps\" (UID: \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\") " pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.131595 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4v5\" (UniqueName: \"kubernetes.io/projected/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-kube-api-access-vp4v5\") pod \"glance-2da2-account-create-update-rjbps\" (UID: \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\") " pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.178567 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:18 crc kubenswrapper[4771]: W0219 23:04:18.615764 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod367ee05c_cc9f_4e74_b756_4192639e94a9.slice/crio-c6963ea9f53661e02a0a610f2f76f1040301796a50266498683ccaa8be9aab73 WatchSource:0}: Error finding container c6963ea9f53661e02a0a610f2f76f1040301796a50266498683ccaa8be9aab73: Status 404 returned error can't find the container with id c6963ea9f53661e02a0a610f2f76f1040301796a50266498683ccaa8be9aab73
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.618914 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kkr59"]
Feb 19 23:04:18 crc kubenswrapper[4771]: I0219 23:04:18.707178 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2da2-account-create-update-rjbps"]
Feb 19 23:04:18 crc kubenswrapper[4771]: W0219 23:04:18.713152 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45d0887b_38d5_4c3d_8ade_150f8c40e3e1.slice/crio-91fff2d727e152772deda241d7fd0ece843bd7b1fbebdb88d59703486a0261ab WatchSource:0}: Error finding container 91fff2d727e152772deda241d7fd0ece843bd7b1fbebdb88d59703486a0261ab: Status 404 returned error can't find the container with id 91fff2d727e152772deda241d7fd0ece843bd7b1fbebdb88d59703486a0261ab
Feb 19 23:04:19 crc kubenswrapper[4771]: I0219 23:04:19.563177 4771 generic.go:334] "Generic (PLEG): container finished" podID="367ee05c-cc9f-4e74-b756-4192639e94a9" containerID="921dea9be940b67f1f05a4e7409d4893f28e8bdb13c4d62fe5e88341b2ee3244" exitCode=0
Feb 19 23:04:19 crc kubenswrapper[4771]: I0219 23:04:19.563273 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kkr59" event={"ID":"367ee05c-cc9f-4e74-b756-4192639e94a9","Type":"ContainerDied","Data":"921dea9be940b67f1f05a4e7409d4893f28e8bdb13c4d62fe5e88341b2ee3244"}
Feb 19 23:04:19 crc kubenswrapper[4771]: I0219 23:04:19.563729 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kkr59" event={"ID":"367ee05c-cc9f-4e74-b756-4192639e94a9","Type":"ContainerStarted","Data":"c6963ea9f53661e02a0a610f2f76f1040301796a50266498683ccaa8be9aab73"}
Feb 19 23:04:19 crc kubenswrapper[4771]: I0219 23:04:19.567299 4771 generic.go:334] "Generic (PLEG): container finished" podID="45d0887b-38d5-4c3d-8ade-150f8c40e3e1" containerID="4e5a179519a014ab779faf8f4789a6e0e9f877abdf333ebf92736f211faa5dae" exitCode=0
Feb 19 23:04:19 crc kubenswrapper[4771]: I0219 23:04:19.567356 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2da2-account-create-update-rjbps" event={"ID":"45d0887b-38d5-4c3d-8ade-150f8c40e3e1","Type":"ContainerDied","Data":"4e5a179519a014ab779faf8f4789a6e0e9f877abdf333ebf92736f211faa5dae"}
Feb 19 23:04:19 crc kubenswrapper[4771]: I0219 23:04:19.567421 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2da2-account-create-update-rjbps" event={"ID":"45d0887b-38d5-4c3d-8ade-150f8c40e3e1","Type":"ContainerStarted","Data":"91fff2d727e152772deda241d7fd0ece843bd7b1fbebdb88d59703486a0261ab"}
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.043330 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.049529 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.167160 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367ee05c-cc9f-4e74-b756-4192639e94a9-operator-scripts\") pod \"367ee05c-cc9f-4e74-b756-4192639e94a9\" (UID: \"367ee05c-cc9f-4e74-b756-4192639e94a9\") "
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.167878 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/367ee05c-cc9f-4e74-b756-4192639e94a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "367ee05c-cc9f-4e74-b756-4192639e94a9" (UID: "367ee05c-cc9f-4e74-b756-4192639e94a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.168345 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp4v5\" (UniqueName: \"kubernetes.io/projected/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-kube-api-access-vp4v5\") pod \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\" (UID: \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\") "
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.168404 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j72s5\" (UniqueName: \"kubernetes.io/projected/367ee05c-cc9f-4e74-b756-4192639e94a9-kube-api-access-j72s5\") pod \"367ee05c-cc9f-4e74-b756-4192639e94a9\" (UID: \"367ee05c-cc9f-4e74-b756-4192639e94a9\") "
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.169174 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45d0887b-38d5-4c3d-8ade-150f8c40e3e1" (UID: "45d0887b-38d5-4c3d-8ade-150f8c40e3e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.169388 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-operator-scripts\") pod \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\" (UID: \"45d0887b-38d5-4c3d-8ade-150f8c40e3e1\") "
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.170133 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.170169 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367ee05c-cc9f-4e74-b756-4192639e94a9-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.173698 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-kube-api-access-vp4v5" (OuterVolumeSpecName: "kube-api-access-vp4v5") pod "45d0887b-38d5-4c3d-8ade-150f8c40e3e1" (UID: "45d0887b-38d5-4c3d-8ade-150f8c40e3e1"). InnerVolumeSpecName "kube-api-access-vp4v5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.173971 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367ee05c-cc9f-4e74-b756-4192639e94a9-kube-api-access-j72s5" (OuterVolumeSpecName: "kube-api-access-j72s5") pod "367ee05c-cc9f-4e74-b756-4192639e94a9" (UID: "367ee05c-cc9f-4e74-b756-4192639e94a9"). InnerVolumeSpecName "kube-api-access-j72s5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.272437 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp4v5\" (UniqueName: \"kubernetes.io/projected/45d0887b-38d5-4c3d-8ade-150f8c40e3e1-kube-api-access-vp4v5\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.272487 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j72s5\" (UniqueName: \"kubernetes.io/projected/367ee05c-cc9f-4e74-b756-4192639e94a9-kube-api-access-j72s5\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.592360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kkr59" event={"ID":"367ee05c-cc9f-4e74-b756-4192639e94a9","Type":"ContainerDied","Data":"c6963ea9f53661e02a0a610f2f76f1040301796a50266498683ccaa8be9aab73"}
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.592773 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6963ea9f53661e02a0a610f2f76f1040301796a50266498683ccaa8be9aab73"
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.592852 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kkr59"
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.595788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2da2-account-create-update-rjbps" event={"ID":"45d0887b-38d5-4c3d-8ade-150f8c40e3e1","Type":"ContainerDied","Data":"91fff2d727e152772deda241d7fd0ece843bd7b1fbebdb88d59703486a0261ab"}
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.595850 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91fff2d727e152772deda241d7fd0ece843bd7b1fbebdb88d59703486a0261ab"
Feb 19 23:04:21 crc kubenswrapper[4771]: I0219 23:04:21.595856 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2da2-account-create-update-rjbps"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.090306 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tg6g8"]
Feb 19 23:04:23 crc kubenswrapper[4771]: E0219 23:04:23.090695 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d0887b-38d5-4c3d-8ade-150f8c40e3e1" containerName="mariadb-account-create-update"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.090711 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d0887b-38d5-4c3d-8ade-150f8c40e3e1" containerName="mariadb-account-create-update"
Feb 19 23:04:23 crc kubenswrapper[4771]: E0219 23:04:23.090729 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367ee05c-cc9f-4e74-b756-4192639e94a9" containerName="mariadb-database-create"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.090737 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="367ee05c-cc9f-4e74-b756-4192639e94a9" containerName="mariadb-database-create"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.090949 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="367ee05c-cc9f-4e74-b756-4192639e94a9" containerName="mariadb-database-create"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.090970 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d0887b-38d5-4c3d-8ade-150f8c40e3e1" containerName="mariadb-account-create-update"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.091653 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.093874 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2qwhw"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.093937 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.113341 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tg6g8"]
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.219968 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-config-data\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.220031 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-combined-ca-bundle\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.220378 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-db-sync-config-data\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.220508 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhs69\" (UniqueName: \"kubernetes.io/projected/0fe0c371-c296-414d-958a-7db5115e6f69-kube-api-access-lhs69\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.322003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-config-data\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.322320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-combined-ca-bundle\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.322360 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-db-sync-config-data\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.322425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhs69\" (UniqueName: \"kubernetes.io/projected/0fe0c371-c296-414d-958a-7db5115e6f69-kube-api-access-lhs69\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.327957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-db-sync-config-data\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.328491 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-config-data\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.340767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-combined-ca-bundle\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.345537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhs69\" (UniqueName: \"kubernetes.io/projected/0fe0c371-c296-414d-958a-7db5115e6f69-kube-api-access-lhs69\") pod \"glance-db-sync-tg6g8\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") " pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.420612 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:23 crc kubenswrapper[4771]: I0219 23:04:23.978610 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tg6g8"]
Feb 19 23:04:23 crc kubenswrapper[4771]: W0219 23:04:23.982619 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fe0c371_c296_414d_958a_7db5115e6f69.slice/crio-2e94af540974798f4d159c3441ab3c8f24d732d48c1d2a1b4501af82ee5fca59 WatchSource:0}: Error finding container 2e94af540974798f4d159c3441ab3c8f24d732d48c1d2a1b4501af82ee5fca59: Status 404 returned error can't find the container with id 2e94af540974798f4d159c3441ab3c8f24d732d48c1d2a1b4501af82ee5fca59
Feb 19 23:04:24 crc kubenswrapper[4771]: I0219 23:04:24.649355 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tg6g8" event={"ID":"0fe0c371-c296-414d-958a-7db5115e6f69","Type":"ContainerStarted","Data":"547c91ba61d2195d904a75511570c54b6870927108472ba5802873f1e9f07143"}
Feb 19 23:04:24 crc kubenswrapper[4771]: I0219 23:04:24.649605 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tg6g8" event={"ID":"0fe0c371-c296-414d-958a-7db5115e6f69","Type":"ContainerStarted","Data":"2e94af540974798f4d159c3441ab3c8f24d732d48c1d2a1b4501af82ee5fca59"}
Feb 19 23:04:24 crc kubenswrapper[4771]: I0219 23:04:24.678519 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tg6g8" podStartSLOduration=1.678494958 podStartE2EDuration="1.678494958s" podCreationTimestamp="2026-02-19 23:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:24.670069513 +0000 UTC m=+5764.941512023" watchObservedRunningTime="2026-02-19 23:04:24.678494958 +0000 UTC m=+5764.949937458"
Feb 19 23:04:28 crc kubenswrapper[4771]: I0219 23:04:28.699579 4771 generic.go:334] "Generic (PLEG): container finished" podID="0fe0c371-c296-414d-958a-7db5115e6f69" containerID="547c91ba61d2195d904a75511570c54b6870927108472ba5802873f1e9f07143" exitCode=0
Feb 19 23:04:28 crc kubenswrapper[4771]: I0219 23:04:28.699769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tg6g8" event={"ID":"0fe0c371-c296-414d-958a-7db5115e6f69","Type":"ContainerDied","Data":"547c91ba61d2195d904a75511570c54b6870927108472ba5802873f1e9f07143"}
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.218749 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.368826 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-config-data\") pod \"0fe0c371-c296-414d-958a-7db5115e6f69\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") "
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.369006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhs69\" (UniqueName: \"kubernetes.io/projected/0fe0c371-c296-414d-958a-7db5115e6f69-kube-api-access-lhs69\") pod \"0fe0c371-c296-414d-958a-7db5115e6f69\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") "
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.369138 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-combined-ca-bundle\") pod \"0fe0c371-c296-414d-958a-7db5115e6f69\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") "
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.369178 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-db-sync-config-data\") pod \"0fe0c371-c296-414d-958a-7db5115e6f69\" (UID: \"0fe0c371-c296-414d-958a-7db5115e6f69\") "
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.374662 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0fe0c371-c296-414d-958a-7db5115e6f69" (UID: "0fe0c371-c296-414d-958a-7db5115e6f69"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.374892 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fe0c371-c296-414d-958a-7db5115e6f69-kube-api-access-lhs69" (OuterVolumeSpecName: "kube-api-access-lhs69") pod "0fe0c371-c296-414d-958a-7db5115e6f69" (UID: "0fe0c371-c296-414d-958a-7db5115e6f69"). InnerVolumeSpecName "kube-api-access-lhs69". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.411140 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fe0c371-c296-414d-958a-7db5115e6f69" (UID: "0fe0c371-c296-414d-958a-7db5115e6f69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.423378 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-config-data" (OuterVolumeSpecName: "config-data") pod "0fe0c371-c296-414d-958a-7db5115e6f69" (UID: "0fe0c371-c296-414d-958a-7db5115e6f69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.471550 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.471606 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhs69\" (UniqueName: \"kubernetes.io/projected/0fe0c371-c296-414d-958a-7db5115e6f69-kube-api-access-lhs69\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.471627 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.471647 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0c371-c296-414d-958a-7db5115e6f69-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.726871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tg6g8" event={"ID":"0fe0c371-c296-414d-958a-7db5115e6f69","Type":"ContainerDied","Data":"2e94af540974798f4d159c3441ab3c8f24d732d48c1d2a1b4501af82ee5fca59"}
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.727472 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e94af540974798f4d159c3441ab3c8f24d732d48c1d2a1b4501af82ee5fca59"
Feb 19 23:04:30 crc kubenswrapper[4771]: I0219 23:04:30.727122 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tg6g8"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.163396 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c9b69b789-pchxj"]
Feb 19 23:04:31 crc kubenswrapper[4771]: E0219 23:04:31.163948 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe0c371-c296-414d-958a-7db5115e6f69" containerName="glance-db-sync"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.164529 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe0c371-c296-414d-958a-7db5115e6f69" containerName="glance-db-sync"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.164791 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe0c371-c296-414d-958a-7db5115e6f69" containerName="glance-db-sync"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.170491 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.178746 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c9b69b789-pchxj"]
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.193275 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-nb\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.193357 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-sb\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.193417 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-config\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.193488 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-dns-svc\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.193516 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxtdj\" (UniqueName: \"kubernetes.io/projected/b96de645-4069-45d8-957c-b89c558a1ff8-kube-api-access-gxtdj\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.235414 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.237929 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.240605 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2qwhw"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.240831 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.243158 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.247741 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.281049 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.283488 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.286076 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.312429 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-scripts\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.313457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-config\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.313609 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m62ld\" (UniqueName: \"kubernetes.io/projected/e9ef09fe-bb9b-4e82-a050-be17851d8974-kube-api-access-m62ld\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.313721 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.314007 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.314161 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ksk\" (UniqueName: \"kubernetes.io/projected/31a01b38-21b5-469a-be1f-3c51841ee129-kube-api-access-j2ksk\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.314267 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.314469 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-dns-svc\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.314596 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.314704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-gxtdj\" (UniqueName: \"kubernetes.io/projected/b96de645-4069-45d8-957c-b89c558a1ff8-kube-api-access-gxtdj\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.314945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.315099 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.315229 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-config-data\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.315346 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-nb\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.314988 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-config\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.316526 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-nb\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.315530 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.316732 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-logs\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.316861 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-sb\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.317801 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-sb\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.319909 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-dns-svc\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.322938 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.349649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxtdj\" (UniqueName: \"kubernetes.io/projected/b96de645-4069-45d8-957c-b89c558a1ff8-kube-api-access-gxtdj\") pod \"dnsmasq-dns-c9b69b789-pchxj\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-config-data\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418516 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418534 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-logs\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418572 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-scripts\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418606 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m62ld\" (UniqueName: \"kubernetes.io/projected/e9ef09fe-bb9b-4e82-a050-be17851d8974-kube-api-access-m62ld\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418628 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j2ksk\" (UniqueName: \"kubernetes.io/projected/31a01b38-21b5-469a-be1f-3c51841ee129-kube-api-access-j2ksk\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418685 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418713 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418733 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.418759 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.419187 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.420438 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-logs\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.420618 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.421260 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.423297 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-config-data\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.423537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.424918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.425317 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.427571 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.428106 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.436254 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m62ld\" (UniqueName: \"kubernetes.io/projected/e9ef09fe-bb9b-4e82-a050-be17851d8974-kube-api-access-m62ld\") pod \"glance-default-internal-api-0\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.437360 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ksk\" (UniqueName: \"kubernetes.io/projected/31a01b38-21b5-469a-be1f-3c51841ee129-kube-api-access-j2ksk\") pod \"glance-default-external-api-0\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.491109 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.585380 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:31 crc kubenswrapper[4771]: I0219 23:04:31.623412 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:32 crc kubenswrapper[4771]: I0219 23:04:31.998045 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:32 crc kubenswrapper[4771]: I0219 23:04:32.016890 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c9b69b789-pchxj"] Feb 19 23:04:32 crc kubenswrapper[4771]: I0219 23:04:32.209302 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:32 crc kubenswrapper[4771]: W0219 23:04:32.209554 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31a01b38_21b5_469a_be1f_3c51841ee129.slice/crio-1fb8d26e451a3e701b5e9b54b1d921bc8aa43c5bdcb2d0fb65f89ef46eb8c520 WatchSource:0}: Error finding container 1fb8d26e451a3e701b5e9b54b1d921bc8aa43c5bdcb2d0fb65f89ef46eb8c520: Status 404 returned error can't find the container with id 
1fb8d26e451a3e701b5e9b54b1d921bc8aa43c5bdcb2d0fb65f89ef46eb8c520 Feb 19 23:04:32 crc kubenswrapper[4771]: I0219 23:04:32.752570 4771 generic.go:334] "Generic (PLEG): container finished" podID="b96de645-4069-45d8-957c-b89c558a1ff8" containerID="024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e" exitCode=0 Feb 19 23:04:32 crc kubenswrapper[4771]: I0219 23:04:32.752771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" event={"ID":"b96de645-4069-45d8-957c-b89c558a1ff8","Type":"ContainerDied","Data":"024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e"} Feb 19 23:04:32 crc kubenswrapper[4771]: I0219 23:04:32.752797 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" event={"ID":"b96de645-4069-45d8-957c-b89c558a1ff8","Type":"ContainerStarted","Data":"3713c712973a411fe5d0d0fcc4812912fdd120be6f04c7e72b380690e121c9f2"} Feb 19 23:04:32 crc kubenswrapper[4771]: I0219 23:04:32.759164 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a01b38-21b5-469a-be1f-3c51841ee129","Type":"ContainerStarted","Data":"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627"} Feb 19 23:04:32 crc kubenswrapper[4771]: I0219 23:04:32.759207 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a01b38-21b5-469a-be1f-3c51841ee129","Type":"ContainerStarted","Data":"1fb8d26e451a3e701b5e9b54b1d921bc8aa43c5bdcb2d0fb65f89ef46eb8c520"} Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.315527 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.371947 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.769303 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a01b38-21b5-469a-be1f-3c51841ee129","Type":"ContainerStarted","Data":"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e"} Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.769369 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" containerName="glance-log" containerID="cri-o://4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627" gracePeriod=30 Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.769408 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" containerName="glance-httpd" containerID="cri-o://806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e" gracePeriod=30 Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.772833 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" event={"ID":"b96de645-4069-45d8-957c-b89c558a1ff8","Type":"ContainerStarted","Data":"1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d"} Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.773034 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.779582 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9ef09fe-bb9b-4e82-a050-be17851d8974","Type":"ContainerStarted","Data":"b35e0e3b999ba76a0ff2b64452a4224439ab7f46903d21b3609994240c211258"} Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.789203 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.789184107 
podStartE2EDuration="2.789184107s" podCreationTimestamp="2026-02-19 23:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:33.789086305 +0000 UTC m=+5774.060528795" watchObservedRunningTime="2026-02-19 23:04:33.789184107 +0000 UTC m=+5774.060626577" Feb 19 23:04:33 crc kubenswrapper[4771]: I0219 23:04:33.840345 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" podStartSLOduration=2.840322302 podStartE2EDuration="2.840322302s" podCreationTimestamp="2026-02-19 23:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:33.804672461 +0000 UTC m=+5774.076114951" watchObservedRunningTime="2026-02-19 23:04:33.840322302 +0000 UTC m=+5774.111764772" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.294398 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.381978 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-scripts\") pod \"31a01b38-21b5-469a-be1f-3c51841ee129\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.382069 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-config-data\") pod \"31a01b38-21b5-469a-be1f-3c51841ee129\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.382157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-httpd-run\") pod \"31a01b38-21b5-469a-be1f-3c51841ee129\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.382205 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-combined-ca-bundle\") pod \"31a01b38-21b5-469a-be1f-3c51841ee129\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.382331 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2ksk\" (UniqueName: \"kubernetes.io/projected/31a01b38-21b5-469a-be1f-3c51841ee129-kube-api-access-j2ksk\") pod \"31a01b38-21b5-469a-be1f-3c51841ee129\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.382397 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-logs\") pod \"31a01b38-21b5-469a-be1f-3c51841ee129\" (UID: \"31a01b38-21b5-469a-be1f-3c51841ee129\") " Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.382842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "31a01b38-21b5-469a-be1f-3c51841ee129" (UID: "31a01b38-21b5-469a-be1f-3c51841ee129"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.383532 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-logs" (OuterVolumeSpecName: "logs") pod "31a01b38-21b5-469a-be1f-3c51841ee129" (UID: "31a01b38-21b5-469a-be1f-3c51841ee129"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.383643 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.383665 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31a01b38-21b5-469a-be1f-3c51841ee129-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.388813 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-scripts" (OuterVolumeSpecName: "scripts") pod "31a01b38-21b5-469a-be1f-3c51841ee129" (UID: "31a01b38-21b5-469a-be1f-3c51841ee129"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.389355 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a01b38-21b5-469a-be1f-3c51841ee129-kube-api-access-j2ksk" (OuterVolumeSpecName: "kube-api-access-j2ksk") pod "31a01b38-21b5-469a-be1f-3c51841ee129" (UID: "31a01b38-21b5-469a-be1f-3c51841ee129"). InnerVolumeSpecName "kube-api-access-j2ksk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.415803 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31a01b38-21b5-469a-be1f-3c51841ee129" (UID: "31a01b38-21b5-469a-be1f-3c51841ee129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.442478 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-config-data" (OuterVolumeSpecName: "config-data") pod "31a01b38-21b5-469a-be1f-3c51841ee129" (UID: "31a01b38-21b5-469a-be1f-3c51841ee129"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.485567 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.485764 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.485880 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a01b38-21b5-469a-be1f-3c51841ee129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.485961 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2ksk\" (UniqueName: \"kubernetes.io/projected/31a01b38-21b5-469a-be1f-3c51841ee129-kube-api-access-j2ksk\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.788603 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9ef09fe-bb9b-4e82-a050-be17851d8974","Type":"ContainerStarted","Data":"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec"} Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.788771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9ef09fe-bb9b-4e82-a050-be17851d8974","Type":"ContainerStarted","Data":"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544"} Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.788748 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" 
containerName="glance-httpd" containerID="cri-o://aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec" gracePeriod=30 Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.788664 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerName="glance-log" containerID="cri-o://27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544" gracePeriod=30 Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.796096 4771 generic.go:334] "Generic (PLEG): container finished" podID="31a01b38-21b5-469a-be1f-3c51841ee129" containerID="806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e" exitCode=0 Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.796135 4771 generic.go:334] "Generic (PLEG): container finished" podID="31a01b38-21b5-469a-be1f-3c51841ee129" containerID="4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627" exitCode=143 Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.797134 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.797245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a01b38-21b5-469a-be1f-3c51841ee129","Type":"ContainerDied","Data":"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e"} Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.797285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a01b38-21b5-469a-be1f-3c51841ee129","Type":"ContainerDied","Data":"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627"} Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.797301 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a01b38-21b5-469a-be1f-3c51841ee129","Type":"ContainerDied","Data":"1fb8d26e451a3e701b5e9b54b1d921bc8aa43c5bdcb2d0fb65f89ef46eb8c520"} Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.797324 4771 scope.go:117] "RemoveContainer" containerID="806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.814002 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.8139849569999997 podStartE2EDuration="3.813984957s" podCreationTimestamp="2026-02-19 23:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:34.810613666 +0000 UTC m=+5775.082056156" watchObservedRunningTime="2026-02-19 23:04:34.813984957 +0000 UTC m=+5775.085427427" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.822422 4771 scope.go:117] "RemoveContainer" containerID="4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 
23:04:34.833775 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.842613 4771 scope.go:117] "RemoveContainer" containerID="806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.843348 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:34 crc kubenswrapper[4771]: E0219 23:04:34.843549 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e\": container with ID starting with 806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e not found: ID does not exist" containerID="806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.843637 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e"} err="failed to get container status \"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e\": rpc error: code = NotFound desc = could not find container \"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e\": container with ID starting with 806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e not found: ID does not exist" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.843740 4771 scope.go:117] "RemoveContainer" containerID="4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627" Feb 19 23:04:34 crc kubenswrapper[4771]: E0219 23:04:34.845036 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627\": container with ID starting with 
4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627 not found: ID does not exist" containerID="4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.845130 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627"} err="failed to get container status \"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627\": rpc error: code = NotFound desc = could not find container \"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627\": container with ID starting with 4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627 not found: ID does not exist" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.845193 4771 scope.go:117] "RemoveContainer" containerID="806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.847983 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e"} err="failed to get container status \"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e\": rpc error: code = NotFound desc = could not find container \"806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e\": container with ID starting with 806bec4e220ec4dbb94b5e63ef3ff8628877cb8e6d512d37a228e58b9d70619e not found: ID does not exist" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.848364 4771 scope.go:117] "RemoveContainer" containerID="4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.850596 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627"} err="failed to get container status 
\"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627\": rpc error: code = NotFound desc = could not find container \"4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627\": container with ID starting with 4edfe5417db1de27e1867575a4b849d6af1e0c3ff26f99cf158c00e281dd2627 not found: ID does not exist" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.878792 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:34 crc kubenswrapper[4771]: E0219 23:04:34.879255 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" containerName="glance-log" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.879273 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" containerName="glance-log" Feb 19 23:04:34 crc kubenswrapper[4771]: E0219 23:04:34.879300 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" containerName="glance-httpd" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.879308 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" containerName="glance-httpd" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.879507 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" containerName="glance-httpd" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.879524 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" containerName="glance-log" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.880464 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.883281 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.883296 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.900518 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.999683 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.999780 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:34 crc kubenswrapper[4771]: I0219 23:04:34.999912 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.000080 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.000143 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-logs\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.000226 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.000255 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rbsw\" (UniqueName: \"kubernetes.io/projected/8ed1dabf-d278-4999-bff5-2782672d6781-kube-api-access-6rbsw\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.102143 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.102211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.102246 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.102298 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.102324 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-logs\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.102355 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.102377 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rbsw\" (UniqueName: 
\"kubernetes.io/projected/8ed1dabf-d278-4999-bff5-2782672d6781-kube-api-access-6rbsw\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.103053 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-logs\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.103081 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.106378 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.106534 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.106732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-public-tls-certs\") pod \"glance-default-external-api-0\" 
(UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.108057 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.120530 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rbsw\" (UniqueName: \"kubernetes.io/projected/8ed1dabf-d278-4999-bff5-2782672d6781-kube-api-access-6rbsw\") pod \"glance-default-external-api-0\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.271885 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.619139 4771 scope.go:117] "RemoveContainer" containerID="633c83fac7433f2ea1100a6b7b4b2e5bfc437c8f1ada1ef96e319423d17f483e" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.651226 4771 scope.go:117] "RemoveContainer" containerID="3d408ada512836840a64b8a7aa22de182693e97e18fcca96b6d44ca0465c1829" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.801957 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.825584 4771 generic.go:334] "Generic (PLEG): container finished" podID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerID="aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec" exitCode=0 Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.825618 4771 generic.go:334] "Generic (PLEG): container finished" podID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerID="27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544" exitCode=143 Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.825680 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9ef09fe-bb9b-4e82-a050-be17851d8974","Type":"ContainerDied","Data":"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec"} Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.825712 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9ef09fe-bb9b-4e82-a050-be17851d8974","Type":"ContainerDied","Data":"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544"} Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.825726 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9ef09fe-bb9b-4e82-a050-be17851d8974","Type":"ContainerDied","Data":"b35e0e3b999ba76a0ff2b64452a4224439ab7f46903d21b3609994240c211258"} Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.825733 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.825746 4771 scope.go:117] "RemoveContainer" containerID="aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.852107 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.856553 4771 scope.go:117] "RemoveContainer" containerID="27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.916491 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m62ld\" (UniqueName: \"kubernetes.io/projected/e9ef09fe-bb9b-4e82-a050-be17851d8974-kube-api-access-m62ld\") pod \"e9ef09fe-bb9b-4e82-a050-be17851d8974\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.916613 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-config-data\") pod \"e9ef09fe-bb9b-4e82-a050-be17851d8974\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.916656 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-logs\") pod \"e9ef09fe-bb9b-4e82-a050-be17851d8974\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.916743 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-httpd-run\") pod \"e9ef09fe-bb9b-4e82-a050-be17851d8974\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " Feb 19 
23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.916786 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-combined-ca-bundle\") pod \"e9ef09fe-bb9b-4e82-a050-be17851d8974\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.916849 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-scripts\") pod \"e9ef09fe-bb9b-4e82-a050-be17851d8974\" (UID: \"e9ef09fe-bb9b-4e82-a050-be17851d8974\") " Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.918570 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-logs" (OuterVolumeSpecName: "logs") pod "e9ef09fe-bb9b-4e82-a050-be17851d8974" (UID: "e9ef09fe-bb9b-4e82-a050-be17851d8974"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.918683 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e9ef09fe-bb9b-4e82-a050-be17851d8974" (UID: "e9ef09fe-bb9b-4e82-a050-be17851d8974"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.922506 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-scripts" (OuterVolumeSpecName: "scripts") pod "e9ef09fe-bb9b-4e82-a050-be17851d8974" (UID: "e9ef09fe-bb9b-4e82-a050-be17851d8974"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.923192 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ef09fe-bb9b-4e82-a050-be17851d8974-kube-api-access-m62ld" (OuterVolumeSpecName: "kube-api-access-m62ld") pod "e9ef09fe-bb9b-4e82-a050-be17851d8974" (UID: "e9ef09fe-bb9b-4e82-a050-be17851d8974"). InnerVolumeSpecName "kube-api-access-m62ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.925073 4771 scope.go:117] "RemoveContainer" containerID="aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec" Feb 19 23:04:35 crc kubenswrapper[4771]: E0219 23:04:35.925484 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec\": container with ID starting with aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec not found: ID does not exist" containerID="aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.925525 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec"} err="failed to get container status \"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec\": rpc error: code = NotFound desc = could not find container \"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec\": container with ID starting with aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec not found: ID does not exist" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.925553 4771 scope.go:117] "RemoveContainer" containerID="27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544" Feb 19 23:04:35 crc kubenswrapper[4771]: E0219 23:04:35.926002 4771 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544\": container with ID starting with 27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544 not found: ID does not exist" containerID="27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.926055 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544"} err="failed to get container status \"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544\": rpc error: code = NotFound desc = could not find container \"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544\": container with ID starting with 27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544 not found: ID does not exist" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.926082 4771 scope.go:117] "RemoveContainer" containerID="aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.926305 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec"} err="failed to get container status \"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec\": rpc error: code = NotFound desc = could not find container \"aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec\": container with ID starting with aa6267153af7833dd9a8480201cda0ad659e3bff9380619bb98a0db5524177ec not found: ID does not exist" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.926334 4771 scope.go:117] "RemoveContainer" containerID="27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 
23:04:35.926657 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544"} err="failed to get container status \"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544\": rpc error: code = NotFound desc = could not find container \"27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544\": container with ID starting with 27eb74eb22ebed03ff6ee9f30d72baa5720f4978d69c46f54a7c37fcc66ac544 not found: ID does not exist" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.944673 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9ef09fe-bb9b-4e82-a050-be17851d8974" (UID: "e9ef09fe-bb9b-4e82-a050-be17851d8974"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:35 crc kubenswrapper[4771]: I0219 23:04:35.967165 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-config-data" (OuterVolumeSpecName: "config-data") pod "e9ef09fe-bb9b-4e82-a050-be17851d8974" (UID: "e9ef09fe-bb9b-4e82-a050-be17851d8974"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.019149 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m62ld\" (UniqueName: \"kubernetes.io/projected/e9ef09fe-bb9b-4e82-a050-be17851d8974-kube-api-access-m62ld\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.019183 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.019194 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.019204 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09fe-bb9b-4e82-a050-be17851d8974-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.019214 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.019222 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9ef09fe-bb9b-4e82-a050-be17851d8974-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.163411 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.171002 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:36 crc kubenswrapper[4771]: 
I0219 23:04:36.181163 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:04:36 crc kubenswrapper[4771]: E0219 23:04:36.181474 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerName="glance-httpd"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.181492 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerName="glance-httpd"
Feb 19 23:04:36 crc kubenswrapper[4771]: E0219 23:04:36.181510 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerName="glance-log"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.181517 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerName="glance-log"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.181690 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerName="glance-log"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.181709 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" containerName="glance-httpd"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.182742 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.185046 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.185099 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.199942 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.223928 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.224059 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.224118 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.224179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.224230 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwfct\" (UniqueName: \"kubernetes.io/projected/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-kube-api-access-bwfct\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.224270 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.224303 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.327278 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.327678 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.327724 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-logs\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.327742 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.327930 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.328102 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.328169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.328191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwfct\" (UniqueName: \"kubernetes.io/projected/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-kube-api-access-bwfct\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.328207 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.332920 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.332931 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.334117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.335221 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.343131 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwfct\" (UniqueName: \"kubernetes.io/projected/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-kube-api-access-bwfct\") pod \"glance-default-internal-api-0\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.446722 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a01b38-21b5-469a-be1f-3c51841ee129" path="/var/lib/kubelet/pods/31a01b38-21b5-469a-be1f-3c51841ee129/volumes"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.447669 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ef09fe-bb9b-4e82-a050-be17851d8974" path="/var/lib/kubelet/pods/e9ef09fe-bb9b-4e82-a050-be17851d8974/volumes"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.500419 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.837679 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ed1dabf-d278-4999-bff5-2782672d6781","Type":"ContainerStarted","Data":"6f7d022613a30d88077579c03072b8654bba98d034987ddb1f3088dd192c6207"}
Feb 19 23:04:36 crc kubenswrapper[4771]: I0219 23:04:36.839277 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ed1dabf-d278-4999-bff5-2782672d6781","Type":"ContainerStarted","Data":"bc7dd7f9b9a3ef0f6f46491d4251e3c15ed3acdbac4f7d111befaa87696de1fa"}
Feb 19 23:04:37 crc kubenswrapper[4771]: W0219 23:04:37.046886 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3ff24aa_eeb9_483c_9e20_0fe5949bb10c.slice/crio-ab3854ff8bef1e06d55bbe6ed44a2685abc146b12135e4e8e49bd47b8fac6aa3 WatchSource:0}: Error finding container ab3854ff8bef1e06d55bbe6ed44a2685abc146b12135e4e8e49bd47b8fac6aa3: Status 404 returned error can't find the container with id ab3854ff8bef1e06d55bbe6ed44a2685abc146b12135e4e8e49bd47b8fac6aa3
Feb 19 23:04:37 crc kubenswrapper[4771]: I0219 23:04:37.047986 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:04:37 crc kubenswrapper[4771]: I0219 23:04:37.856107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ed1dabf-d278-4999-bff5-2782672d6781","Type":"ContainerStarted","Data":"593d8695bb5aa300d9674870adefd18c581bed197f907d9490c5f404c479c207"}
Feb 19 23:04:37 crc kubenswrapper[4771]: I0219 23:04:37.861227 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c","Type":"ContainerStarted","Data":"7e020a5d4eebb9762b908dff98fecc39ab9b009269a2b6f1e6276a4bbe8e7d9a"}
Feb 19 23:04:37 crc kubenswrapper[4771]: I0219 23:04:37.861534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c","Type":"ContainerStarted","Data":"ab3854ff8bef1e06d55bbe6ed44a2685abc146b12135e4e8e49bd47b8fac6aa3"}
Feb 19 23:04:37 crc kubenswrapper[4771]: I0219 23:04:37.896264 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.896247044 podStartE2EDuration="3.896247044s" podCreationTimestamp="2026-02-19 23:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:37.892105933 +0000 UTC m=+5778.163548433" watchObservedRunningTime="2026-02-19 23:04:37.896247044 +0000 UTC m=+5778.167689514"
Feb 19 23:04:38 crc kubenswrapper[4771]: I0219 23:04:38.875726 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c","Type":"ContainerStarted","Data":"1858a869a1a9a41d2b43873ff0e0f5f9883fdf7c5781320a478d703445d60790"}
Feb 19 23:04:41 crc kubenswrapper[4771]: I0219 23:04:41.493324 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c9b69b789-pchxj"
Feb 19 23:04:41 crc kubenswrapper[4771]: I0219 23:04:41.535344 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.535310151 podStartE2EDuration="5.535310151s" podCreationTimestamp="2026-02-19 23:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:38.911754006 +0000 UTC m=+5779.183196506" watchObservedRunningTime="2026-02-19 23:04:41.535310151 +0000 UTC m=+5781.806752661"
Feb 19 23:04:41 crc kubenswrapper[4771]: I0219 23:04:41.582658 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fbb99998f-hqnp6"]
Feb 19 23:04:41 crc kubenswrapper[4771]: I0219 23:04:41.586802 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" podUID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" containerName="dnsmasq-dns" containerID="cri-o://8e18567b67030bce246d306805750f317818321ebf73d735096b436fe802fe49" gracePeriod=10
Feb 19 23:04:41 crc kubenswrapper[4771]: I0219 23:04:41.913274 4771 generic.go:334] "Generic (PLEG): container finished" podID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" containerID="8e18567b67030bce246d306805750f317818321ebf73d735096b436fe802fe49" exitCode=0
Feb 19 23:04:41 crc kubenswrapper[4771]: I0219 23:04:41.913343 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" event={"ID":"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac","Type":"ContainerDied","Data":"8e18567b67030bce246d306805750f317818321ebf73d735096b436fe802fe49"}
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.058339 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6"
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.161917 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-sb\") pod \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") "
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.161990 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-dns-svc\") pod \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") "
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.162095 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-nb\") pod \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") "
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.162172 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jbzs\" (UniqueName: \"kubernetes.io/projected/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-kube-api-access-2jbzs\") pod \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") "
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.162261 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-config\") pod \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\" (UID: \"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac\") "
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.178492 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-kube-api-access-2jbzs" (OuterVolumeSpecName: "kube-api-access-2jbzs") pod "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" (UID: "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac"). InnerVolumeSpecName "kube-api-access-2jbzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.213480 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-config" (OuterVolumeSpecName: "config") pod "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" (UID: "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.214441 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" (UID: "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.216384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" (UID: "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.218296 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" (UID: "ff77e00f-9a86-4b4b-b081-588b4ca0a7ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.264349 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.264381 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.264395 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jbzs\" (UniqueName: \"kubernetes.io/projected/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-kube-api-access-2jbzs\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.264405 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-config\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.264414 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.931505 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6" event={"ID":"ff77e00f-9a86-4b4b-b081-588b4ca0a7ac","Type":"ContainerDied","Data":"8d0849cbf20df0111e610b4cf467ce8f7e4c18a710f2a46bc6c21d1a36705324"}
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.931602 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbb99998f-hqnp6"
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.931623 4771 scope.go:117] "RemoveContainer" containerID="8e18567b67030bce246d306805750f317818321ebf73d735096b436fe802fe49"
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.957610 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.957707 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.976213 4771 scope.go:117] "RemoveContainer" containerID="deadf83b521b793eda284f746a9f5b208fa02f047212785f0e1b02edff84924f"
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.976284 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fbb99998f-hqnp6"]
Feb 19 23:04:42 crc kubenswrapper[4771]: I0219 23:04:42.985982 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fbb99998f-hqnp6"]
Feb 19 23:04:44 crc kubenswrapper[4771]: I0219 23:04:44.450668 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" path="/var/lib/kubelet/pods/ff77e00f-9a86-4b4b-b081-588b4ca0a7ac/volumes"
Feb 19 23:04:45 crc kubenswrapper[4771]: I0219 23:04:45.272147 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:45 crc kubenswrapper[4771]: I0219 23:04:45.272197 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:45 crc kubenswrapper[4771]: I0219 23:04:45.332041 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:45 crc kubenswrapper[4771]: I0219 23:04:45.349078 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:45 crc kubenswrapper[4771]: I0219 23:04:45.962900 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:45 crc kubenswrapper[4771]: I0219 23:04:45.962973 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:46 crc kubenswrapper[4771]: I0219 23:04:46.501260 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:46 crc kubenswrapper[4771]: I0219 23:04:46.501331 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:46 crc kubenswrapper[4771]: I0219 23:04:46.554139 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:46 crc kubenswrapper[4771]: I0219 23:04:46.560995 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:46 crc kubenswrapper[4771]: I0219 23:04:46.973857 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:46 crc kubenswrapper[4771]: I0219 23:04:46.974199 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:47 crc kubenswrapper[4771]: I0219 23:04:47.723327 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:47 crc kubenswrapper[4771]: I0219 23:04:47.780314 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 23:04:48 crc kubenswrapper[4771]: I0219 23:04:48.791145 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:48 crc kubenswrapper[4771]: I0219 23:04:48.900875 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.637930 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-cjzcm"]
Feb 19 23:04:55 crc kubenswrapper[4771]: E0219 23:04:55.638749 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" containerName="init"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.638763 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" containerName="init"
Feb 19 23:04:55 crc kubenswrapper[4771]: E0219 23:04:55.638784 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" containerName="dnsmasq-dns"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.638792 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" containerName="dnsmasq-dns"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.638986 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff77e00f-9a86-4b4b-b081-588b4ca0a7ac" containerName="dnsmasq-dns"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.639629 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.654697 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cjzcm"]
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.724804 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-52b9-account-create-update-622pv"]
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.726192 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.728147 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.733448 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-52b9-account-create-update-622pv"]
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.752334 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd72920f-34be-4cde-becc-7d1b0e5a7daf-operator-scripts\") pod \"placement-db-create-cjzcm\" (UID: \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\") " pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.752442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4gs\" (UniqueName: \"kubernetes.io/projected/dd72920f-34be-4cde-becc-7d1b0e5a7daf-kube-api-access-xc4gs\") pod \"placement-db-create-cjzcm\" (UID: \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\") " pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.854063 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd72920f-34be-4cde-becc-7d1b0e5a7daf-operator-scripts\") pod \"placement-db-create-cjzcm\" (UID: \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\") " pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.854107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f7ed81-dd1c-497b-b080-6cc1bef19fca-operator-scripts\") pod \"placement-52b9-account-create-update-622pv\" (UID: \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\") " pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.854163 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4gs\" (UniqueName: \"kubernetes.io/projected/dd72920f-34be-4cde-becc-7d1b0e5a7daf-kube-api-access-xc4gs\") pod \"placement-db-create-cjzcm\" (UID: \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\") " pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.854341 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m25xq\" (UniqueName: \"kubernetes.io/projected/01f7ed81-dd1c-497b-b080-6cc1bef19fca-kube-api-access-m25xq\") pod \"placement-52b9-account-create-update-622pv\" (UID: \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\") " pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.854943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd72920f-34be-4cde-becc-7d1b0e5a7daf-operator-scripts\") pod \"placement-db-create-cjzcm\" (UID: \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\") " pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.878854 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4gs\" (UniqueName: \"kubernetes.io/projected/dd72920f-34be-4cde-becc-7d1b0e5a7daf-kube-api-access-xc4gs\") pod \"placement-db-create-cjzcm\" (UID: \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\") " pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.955824 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m25xq\" (UniqueName: \"kubernetes.io/projected/01f7ed81-dd1c-497b-b080-6cc1bef19fca-kube-api-access-m25xq\") pod \"placement-52b9-account-create-update-622pv\" (UID: \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\") " pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.955962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f7ed81-dd1c-497b-b080-6cc1bef19fca-operator-scripts\") pod \"placement-52b9-account-create-update-622pv\" (UID: \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\") " pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.956719 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f7ed81-dd1c-497b-b080-6cc1bef19fca-operator-scripts\") pod \"placement-52b9-account-create-update-622pv\" (UID: \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\") " pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.969614 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:55 crc kubenswrapper[4771]: I0219 23:04:55.981726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m25xq\" (UniqueName: \"kubernetes.io/projected/01f7ed81-dd1c-497b-b080-6cc1bef19fca-kube-api-access-m25xq\") pod \"placement-52b9-account-create-update-622pv\" (UID: \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\") " pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:56 crc kubenswrapper[4771]: I0219 23:04:56.044473 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:56 crc kubenswrapper[4771]: I0219 23:04:56.436371 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cjzcm"]
Feb 19 23:04:56 crc kubenswrapper[4771]: I0219 23:04:56.565818 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-52b9-account-create-update-622pv"]
Feb 19 23:04:57 crc kubenswrapper[4771]: I0219 23:04:57.089600 4771 generic.go:334] "Generic (PLEG): container finished" podID="dd72920f-34be-4cde-becc-7d1b0e5a7daf" containerID="848ca39b23f9ea56088e48be7a326aaa56da328d26e10d79cf68a933382f1150" exitCode=0
Feb 19 23:04:57 crc kubenswrapper[4771]: I0219 23:04:57.089706 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cjzcm" event={"ID":"dd72920f-34be-4cde-becc-7d1b0e5a7daf","Type":"ContainerDied","Data":"848ca39b23f9ea56088e48be7a326aaa56da328d26e10d79cf68a933382f1150"}
Feb 19 23:04:57 crc kubenswrapper[4771]: I0219 23:04:57.089746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cjzcm" event={"ID":"dd72920f-34be-4cde-becc-7d1b0e5a7daf","Type":"ContainerStarted","Data":"4662ab42e2645dbc758d64aa247d96096bdef67c0be74a132f9bd1856011fe7f"}
Feb 19 23:04:57 crc kubenswrapper[4771]: I0219 23:04:57.091449 4771 generic.go:334] "Generic (PLEG): container finished" podID="01f7ed81-dd1c-497b-b080-6cc1bef19fca" containerID="1bb8904b17a681aafb840c1f8d9a661bf01d4be0966053afd12bf437f1edda6f" exitCode=0
Feb 19 23:04:57 crc kubenswrapper[4771]: I0219 23:04:57.091481 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-52b9-account-create-update-622pv" event={"ID":"01f7ed81-dd1c-497b-b080-6cc1bef19fca","Type":"ContainerDied","Data":"1bb8904b17a681aafb840c1f8d9a661bf01d4be0966053afd12bf437f1edda6f"}
Feb 19 23:04:57 crc kubenswrapper[4771]: I0219 23:04:57.091497 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-52b9-account-create-update-622pv" event={"ID":"01f7ed81-dd1c-497b-b080-6cc1bef19fca","Type":"ContainerStarted","Data":"2016127fa8cea256c1b3693ba48bfd74566f94bea75f02e209a20fedb4bcdfc5"}
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.668553 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-52b9-account-create-update-622pv"
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.674512 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cjzcm"
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.826205 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m25xq\" (UniqueName: \"kubernetes.io/projected/01f7ed81-dd1c-497b-b080-6cc1bef19fca-kube-api-access-m25xq\") pod \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\" (UID: \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\") "
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.826291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc4gs\" (UniqueName: \"kubernetes.io/projected/dd72920f-34be-4cde-becc-7d1b0e5a7daf-kube-api-access-xc4gs\") pod \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\" (UID: \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\") "
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.826431 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f7ed81-dd1c-497b-b080-6cc1bef19fca-operator-scripts\") pod \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\" (UID: \"01f7ed81-dd1c-497b-b080-6cc1bef19fca\") "
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.826560 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd72920f-34be-4cde-becc-7d1b0e5a7daf-operator-scripts\") pod \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\" (UID: \"dd72920f-34be-4cde-becc-7d1b0e5a7daf\") "
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.827372 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd72920f-34be-4cde-becc-7d1b0e5a7daf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd72920f-34be-4cde-becc-7d1b0e5a7daf" (UID: "dd72920f-34be-4cde-becc-7d1b0e5a7daf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.827438 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f7ed81-dd1c-497b-b080-6cc1bef19fca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01f7ed81-dd1c-497b-b080-6cc1bef19fca" (UID: "01f7ed81-dd1c-497b-b080-6cc1bef19fca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.838291 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd72920f-34be-4cde-becc-7d1b0e5a7daf-kube-api-access-xc4gs" (OuterVolumeSpecName: "kube-api-access-xc4gs") pod "dd72920f-34be-4cde-becc-7d1b0e5a7daf" (UID: "dd72920f-34be-4cde-becc-7d1b0e5a7daf"). InnerVolumeSpecName "kube-api-access-xc4gs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.840666 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f7ed81-dd1c-497b-b080-6cc1bef19fca-kube-api-access-m25xq" (OuterVolumeSpecName: "kube-api-access-m25xq") pod "01f7ed81-dd1c-497b-b080-6cc1bef19fca" (UID: "01f7ed81-dd1c-497b-b080-6cc1bef19fca"). InnerVolumeSpecName "kube-api-access-m25xq".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.929957 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd72920f-34be-4cde-becc-7d1b0e5a7daf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.930007 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m25xq\" (UniqueName: \"kubernetes.io/projected/01f7ed81-dd1c-497b-b080-6cc1bef19fca-kube-api-access-m25xq\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.930053 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc4gs\" (UniqueName: \"kubernetes.io/projected/dd72920f-34be-4cde-becc-7d1b0e5a7daf-kube-api-access-xc4gs\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:58 crc kubenswrapper[4771]: I0219 23:04:58.930071 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f7ed81-dd1c-497b-b080-6cc1bef19fca-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:59 crc kubenswrapper[4771]: I0219 23:04:59.115640 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cjzcm" event={"ID":"dd72920f-34be-4cde-becc-7d1b0e5a7daf","Type":"ContainerDied","Data":"4662ab42e2645dbc758d64aa247d96096bdef67c0be74a132f9bd1856011fe7f"} Feb 19 23:04:59 crc kubenswrapper[4771]: I0219 23:04:59.115707 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cjzcm" Feb 19 23:04:59 crc kubenswrapper[4771]: I0219 23:04:59.115722 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4662ab42e2645dbc758d64aa247d96096bdef67c0be74a132f9bd1856011fe7f" Feb 19 23:04:59 crc kubenswrapper[4771]: I0219 23:04:59.118920 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-52b9-account-create-update-622pv" event={"ID":"01f7ed81-dd1c-497b-b080-6cc1bef19fca","Type":"ContainerDied","Data":"2016127fa8cea256c1b3693ba48bfd74566f94bea75f02e209a20fedb4bcdfc5"} Feb 19 23:04:59 crc kubenswrapper[4771]: I0219 23:04:59.118973 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-52b9-account-create-update-622pv" Feb 19 23:04:59 crc kubenswrapper[4771]: I0219 23:04:59.118990 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2016127fa8cea256c1b3693ba48bfd74566f94bea75f02e209a20fedb4bcdfc5" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.044161 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-s2xmq"] Feb 19 23:05:01 crc kubenswrapper[4771]: E0219 23:05:01.044875 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f7ed81-dd1c-497b-b080-6cc1bef19fca" containerName="mariadb-account-create-update" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.044892 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f7ed81-dd1c-497b-b080-6cc1bef19fca" containerName="mariadb-account-create-update" Feb 19 23:05:01 crc kubenswrapper[4771]: E0219 23:05:01.044928 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd72920f-34be-4cde-becc-7d1b0e5a7daf" containerName="mariadb-database-create" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.044937 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd72920f-34be-4cde-becc-7d1b0e5a7daf" 
containerName="mariadb-database-create" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.045164 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f7ed81-dd1c-497b-b080-6cc1bef19fca" containerName="mariadb-account-create-update" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.045191 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd72920f-34be-4cde-becc-7d1b0e5a7daf" containerName="mariadb-database-create" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.045861 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.049783 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.050122 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.051888 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-h2hb9" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.054273 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s2xmq"] Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.067064 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c67b8c4bf-4bp95"] Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.068398 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.075294 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67b8c4bf-4bp95"] Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179580 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-config-data\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179633 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-scripts\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179669 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-config\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179691 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179713 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-569px\" 
(UniqueName: \"kubernetes.io/projected/f86cab8a-89d1-4124-8601-450fff70eeaf-kube-api-access-569px\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179748 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-combined-ca-bundle\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179830 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f86cab8a-89d1-4124-8601-450fff70eeaf-logs\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179851 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179883 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bddnf\" (UniqueName: \"kubernetes.io/projected/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-kube-api-access-bddnf\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.179909 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-dns-svc\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284066 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-config\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284126 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-569px\" (UniqueName: \"kubernetes.io/projected/f86cab8a-89d1-4124-8601-450fff70eeaf-kube-api-access-569px\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284149 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284206 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-combined-ca-bundle\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284300 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f86cab8a-89d1-4124-8601-450fff70eeaf-logs\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284360 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bddnf\" (UniqueName: \"kubernetes.io/projected/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-kube-api-access-bddnf\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284384 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-dns-svc\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-config-data\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.284478 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-scripts\") pod \"placement-db-sync-s2xmq\" (UID: 
\"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.289159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.289460 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f86cab8a-89d1-4124-8601-450fff70eeaf-logs\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.289774 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-dns-svc\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.290591 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-config\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.291092 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.299737 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-scripts\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.300740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-combined-ca-bundle\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.316035 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-config-data\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.321873 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bddnf\" (UniqueName: \"kubernetes.io/projected/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-kube-api-access-bddnf\") pod \"dnsmasq-dns-7c67b8c4bf-4bp95\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.328643 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-569px\" (UniqueName: \"kubernetes.io/projected/f86cab8a-89d1-4124-8601-450fff70eeaf-kube-api-access-569px\") pod \"placement-db-sync-s2xmq\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.372428 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:01 crc kubenswrapper[4771]: I0219 23:05:01.384563 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:02 crc kubenswrapper[4771]: I0219 23:05:02.109277 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s2xmq"] Feb 19 23:05:02 crc kubenswrapper[4771]: I0219 23:05:02.146986 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s2xmq" event={"ID":"f86cab8a-89d1-4124-8601-450fff70eeaf","Type":"ContainerStarted","Data":"a8f633ad7c08d28cdd06902c7773e175f051903a2af132f50ca143c82ff3949f"} Feb 19 23:05:02 crc kubenswrapper[4771]: I0219 23:05:02.236249 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67b8c4bf-4bp95"] Feb 19 23:05:02 crc kubenswrapper[4771]: W0219 23:05:02.244092 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cebe6b2_8fbd_4efe_b58c_90d76b5b23ab.slice/crio-a6f72eb8ca94d463fb681e4c21f2b45c0fd80c488e2f1128926cb51f902f9a66 WatchSource:0}: Error finding container a6f72eb8ca94d463fb681e4c21f2b45c0fd80c488e2f1128926cb51f902f9a66: Status 404 returned error can't find the container with id a6f72eb8ca94d463fb681e4c21f2b45c0fd80c488e2f1128926cb51f902f9a66 Feb 19 23:05:03 crc kubenswrapper[4771]: I0219 23:05:03.161510 4771 generic.go:334] "Generic (PLEG): container finished" podID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" containerID="c7585a4e8145ee332a0b456cd2819d056bbaf97fa085a2ffd4096d213081f4dd" exitCode=0 Feb 19 23:05:03 crc kubenswrapper[4771]: I0219 23:05:03.161589 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" event={"ID":"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab","Type":"ContainerDied","Data":"c7585a4e8145ee332a0b456cd2819d056bbaf97fa085a2ffd4096d213081f4dd"} Feb 19 
23:05:03 crc kubenswrapper[4771]: I0219 23:05:03.161975 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" event={"ID":"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab","Type":"ContainerStarted","Data":"a6f72eb8ca94d463fb681e4c21f2b45c0fd80c488e2f1128926cb51f902f9a66"} Feb 19 23:05:03 crc kubenswrapper[4771]: I0219 23:05:03.185918 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s2xmq" event={"ID":"f86cab8a-89d1-4124-8601-450fff70eeaf","Type":"ContainerStarted","Data":"d3940958b65503525cf815eba047883920fa14b29c7289535d08c48df941c092"} Feb 19 23:05:04 crc kubenswrapper[4771]: I0219 23:05:04.203835 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" event={"ID":"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab","Type":"ContainerStarted","Data":"9c73ee3210d40c8ff9c5ab8857eab106682b16a8894abccfc314ae4a027f0f83"} Feb 19 23:05:04 crc kubenswrapper[4771]: I0219 23:05:04.204253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:04 crc kubenswrapper[4771]: I0219 23:05:04.207881 4771 generic.go:334] "Generic (PLEG): container finished" podID="f86cab8a-89d1-4124-8601-450fff70eeaf" containerID="d3940958b65503525cf815eba047883920fa14b29c7289535d08c48df941c092" exitCode=0 Feb 19 23:05:04 crc kubenswrapper[4771]: I0219 23:05:04.207935 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s2xmq" event={"ID":"f86cab8a-89d1-4124-8601-450fff70eeaf","Type":"ContainerDied","Data":"d3940958b65503525cf815eba047883920fa14b29c7289535d08c48df941c092"} Feb 19 23:05:04 crc kubenswrapper[4771]: I0219 23:05:04.231310 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" podStartSLOduration=3.230561678 podStartE2EDuration="3.230561678s" podCreationTimestamp="2026-02-19 23:05:01 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:05:04.22914957 +0000 UTC m=+5804.500592080" watchObservedRunningTime="2026-02-19 23:05:04.230561678 +0000 UTC m=+5804.502004138" Feb 19 23:05:04 crc kubenswrapper[4771]: I0219 23:05:04.238329 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-s2xmq" podStartSLOduration=3.238313895 podStartE2EDuration="3.238313895s" podCreationTimestamp="2026-02-19 23:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:05:03.226707558 +0000 UTC m=+5803.498150048" watchObservedRunningTime="2026-02-19 23:05:04.238313895 +0000 UTC m=+5804.509756365" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.664073 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.769131 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-combined-ca-bundle\") pod \"f86cab8a-89d1-4124-8601-450fff70eeaf\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.769189 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-config-data\") pod \"f86cab8a-89d1-4124-8601-450fff70eeaf\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.769255 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f86cab8a-89d1-4124-8601-450fff70eeaf-logs\") pod 
\"f86cab8a-89d1-4124-8601-450fff70eeaf\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.769483 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-569px\" (UniqueName: \"kubernetes.io/projected/f86cab8a-89d1-4124-8601-450fff70eeaf-kube-api-access-569px\") pod \"f86cab8a-89d1-4124-8601-450fff70eeaf\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.769511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-scripts\") pod \"f86cab8a-89d1-4124-8601-450fff70eeaf\" (UID: \"f86cab8a-89d1-4124-8601-450fff70eeaf\") " Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.775717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f86cab8a-89d1-4124-8601-450fff70eeaf-logs" (OuterVolumeSpecName: "logs") pod "f86cab8a-89d1-4124-8601-450fff70eeaf" (UID: "f86cab8a-89d1-4124-8601-450fff70eeaf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.779536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86cab8a-89d1-4124-8601-450fff70eeaf-kube-api-access-569px" (OuterVolumeSpecName: "kube-api-access-569px") pod "f86cab8a-89d1-4124-8601-450fff70eeaf" (UID: "f86cab8a-89d1-4124-8601-450fff70eeaf"). InnerVolumeSpecName "kube-api-access-569px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.780369 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-scripts" (OuterVolumeSpecName: "scripts") pod "f86cab8a-89d1-4124-8601-450fff70eeaf" (UID: "f86cab8a-89d1-4124-8601-450fff70eeaf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.807743 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f86cab8a-89d1-4124-8601-450fff70eeaf" (UID: "f86cab8a-89d1-4124-8601-450fff70eeaf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.810578 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-config-data" (OuterVolumeSpecName: "config-data") pod "f86cab8a-89d1-4124-8601-450fff70eeaf" (UID: "f86cab8a-89d1-4124-8601-450fff70eeaf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.871921 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.871958 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f86cab8a-89d1-4124-8601-450fff70eeaf-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.871967 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-569px\" (UniqueName: \"kubernetes.io/projected/f86cab8a-89d1-4124-8601-450fff70eeaf-kube-api-access-569px\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.871977 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:05 crc kubenswrapper[4771]: I0219 23:05:05.871987 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86cab8a-89d1-4124-8601-450fff70eeaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.230231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s2xmq" event={"ID":"f86cab8a-89d1-4124-8601-450fff70eeaf","Type":"ContainerDied","Data":"a8f633ad7c08d28cdd06902c7773e175f051903a2af132f50ca143c82ff3949f"} Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.230649 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8f633ad7c08d28cdd06902c7773e175f051903a2af132f50ca143c82ff3949f" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.230363 4771 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-s2xmq" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.332195 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5ff55dfd66-b9dhc"] Feb 19 23:05:06 crc kubenswrapper[4771]: E0219 23:05:06.332635 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86cab8a-89d1-4124-8601-450fff70eeaf" containerName="placement-db-sync" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.332654 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86cab8a-89d1-4124-8601-450fff70eeaf" containerName="placement-db-sync" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.332938 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86cab8a-89d1-4124-8601-450fff70eeaf" containerName="placement-db-sync" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.334092 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.340546 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.341046 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.341267 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.341495 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-h2hb9" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.341514 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.345653 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-5ff55dfd66-b9dhc"] Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.482119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce1cb23c-e114-4893-8a75-893ea165fd71-logs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.482195 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-internal-tls-certs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.482375 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfsbr\" (UniqueName: \"kubernetes.io/projected/ce1cb23c-e114-4893-8a75-893ea165fd71-kube-api-access-mfsbr\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.482563 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-config-data\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.482716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-scripts\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " 
pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.482895 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-public-tls-certs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.482951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-combined-ca-bundle\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.585708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-public-tls-certs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.587092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-combined-ca-bundle\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.587328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce1cb23c-e114-4893-8a75-893ea165fd71-logs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" 
Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.587507 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-internal-tls-certs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.587616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfsbr\" (UniqueName: \"kubernetes.io/projected/ce1cb23c-e114-4893-8a75-893ea165fd71-kube-api-access-mfsbr\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.587736 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-config-data\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.587825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-scripts\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.589662 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce1cb23c-e114-4893-8a75-893ea165fd71-logs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.590247 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-public-tls-certs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.592465 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-combined-ca-bundle\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.593288 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-internal-tls-certs\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.593327 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-scripts\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.597343 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1cb23c-e114-4893-8a75-893ea165fd71-config-data\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.620830 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfsbr\" (UniqueName: 
\"kubernetes.io/projected/ce1cb23c-e114-4893-8a75-893ea165fd71-kube-api-access-mfsbr\") pod \"placement-5ff55dfd66-b9dhc\" (UID: \"ce1cb23c-e114-4893-8a75-893ea165fd71\") " pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:06 crc kubenswrapper[4771]: I0219 23:05:06.651613 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:07 crc kubenswrapper[4771]: I0219 23:05:07.161849 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5ff55dfd66-b9dhc"] Feb 19 23:05:07 crc kubenswrapper[4771]: W0219 23:05:07.168606 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce1cb23c_e114_4893_8a75_893ea165fd71.slice/crio-ba85d92cc30d627482d543102dcbd6cf104f8a0956c28122d48905695b1cacbf WatchSource:0}: Error finding container ba85d92cc30d627482d543102dcbd6cf104f8a0956c28122d48905695b1cacbf: Status 404 returned error can't find the container with id ba85d92cc30d627482d543102dcbd6cf104f8a0956c28122d48905695b1cacbf Feb 19 23:05:07 crc kubenswrapper[4771]: I0219 23:05:07.248642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ff55dfd66-b9dhc" event={"ID":"ce1cb23c-e114-4893-8a75-893ea165fd71","Type":"ContainerStarted","Data":"ba85d92cc30d627482d543102dcbd6cf104f8a0956c28122d48905695b1cacbf"} Feb 19 23:05:08 crc kubenswrapper[4771]: I0219 23:05:08.263905 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5ff55dfd66-b9dhc" event={"ID":"ce1cb23c-e114-4893-8a75-893ea165fd71","Type":"ContainerStarted","Data":"6689bd0b4be20a8145c1a3feb3dc338d3829b6b3a1c0708dd0feb1cc93e43298"} Feb 19 23:05:08 crc kubenswrapper[4771]: I0219 23:05:08.264459 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:08 crc kubenswrapper[4771]: I0219 23:05:08.264494 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-5ff55dfd66-b9dhc" event={"ID":"ce1cb23c-e114-4893-8a75-893ea165fd71","Type":"ContainerStarted","Data":"d3559ea3aaa298d8578ce4e58560c5e731e908ab9ad62a613cbeb48e5499dcb0"} Feb 19 23:05:08 crc kubenswrapper[4771]: I0219 23:05:08.264525 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:08 crc kubenswrapper[4771]: I0219 23:05:08.295232 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5ff55dfd66-b9dhc" podStartSLOduration=2.295207983 podStartE2EDuration="2.295207983s" podCreationTimestamp="2026-02-19 23:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:05:08.292439399 +0000 UTC m=+5808.563881909" watchObservedRunningTime="2026-02-19 23:05:08.295207983 +0000 UTC m=+5808.566650493" Feb 19 23:05:11 crc kubenswrapper[4771]: I0219 23:05:11.386327 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:05:11 crc kubenswrapper[4771]: I0219 23:05:11.483622 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c9b69b789-pchxj"] Feb 19 23:05:11 crc kubenswrapper[4771]: I0219 23:05:11.483865 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" podUID="b96de645-4069-45d8-957c-b89c558a1ff8" containerName="dnsmasq-dns" containerID="cri-o://1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d" gracePeriod=10 Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.070219 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.232396 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-nb\") pod \"b96de645-4069-45d8-957c-b89c558a1ff8\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.232760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-config\") pod \"b96de645-4069-45d8-957c-b89c558a1ff8\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.232814 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-dns-svc\") pod \"b96de645-4069-45d8-957c-b89c558a1ff8\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.233040 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxtdj\" (UniqueName: \"kubernetes.io/projected/b96de645-4069-45d8-957c-b89c558a1ff8-kube-api-access-gxtdj\") pod \"b96de645-4069-45d8-957c-b89c558a1ff8\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.233155 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-sb\") pod \"b96de645-4069-45d8-957c-b89c558a1ff8\" (UID: \"b96de645-4069-45d8-957c-b89c558a1ff8\") " Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.249375 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b96de645-4069-45d8-957c-b89c558a1ff8-kube-api-access-gxtdj" (OuterVolumeSpecName: "kube-api-access-gxtdj") pod "b96de645-4069-45d8-957c-b89c558a1ff8" (UID: "b96de645-4069-45d8-957c-b89c558a1ff8"). InnerVolumeSpecName "kube-api-access-gxtdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.279845 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b96de645-4069-45d8-957c-b89c558a1ff8" (UID: "b96de645-4069-45d8-957c-b89c558a1ff8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.281574 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-config" (OuterVolumeSpecName: "config") pod "b96de645-4069-45d8-957c-b89c558a1ff8" (UID: "b96de645-4069-45d8-957c-b89c558a1ff8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.299141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b96de645-4069-45d8-957c-b89c558a1ff8" (UID: "b96de645-4069-45d8-957c-b89c558a1ff8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.300928 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b96de645-4069-45d8-957c-b89c558a1ff8" (UID: "b96de645-4069-45d8-957c-b89c558a1ff8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.308163 4771 generic.go:334] "Generic (PLEG): container finished" podID="b96de645-4069-45d8-957c-b89c558a1ff8" containerID="1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d" exitCode=0 Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.308204 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" event={"ID":"b96de645-4069-45d8-957c-b89c558a1ff8","Type":"ContainerDied","Data":"1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d"} Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.308229 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" event={"ID":"b96de645-4069-45d8-957c-b89c558a1ff8","Type":"ContainerDied","Data":"3713c712973a411fe5d0d0fcc4812912fdd120be6f04c7e72b380690e121c9f2"} Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.308245 4771 scope.go:117] "RemoveContainer" containerID="1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.308347 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c9b69b789-pchxj" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.335785 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.335827 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxtdj\" (UniqueName: \"kubernetes.io/projected/b96de645-4069-45d8-957c-b89c558a1ff8-kube-api-access-gxtdj\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.335851 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.335868 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.335884 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b96de645-4069-45d8-957c-b89c558a1ff8-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.337037 4771 scope.go:117] "RemoveContainer" containerID="024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.363700 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c9b69b789-pchxj"] Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.377254 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c9b69b789-pchxj"] Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.379247 4771 scope.go:117] "RemoveContainer" 
containerID="1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d" Feb 19 23:05:12 crc kubenswrapper[4771]: E0219 23:05:12.382467 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d\": container with ID starting with 1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d not found: ID does not exist" containerID="1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.382499 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d"} err="failed to get container status \"1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d\": rpc error: code = NotFound desc = could not find container \"1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d\": container with ID starting with 1c24ead58c9c91301a86a42517aec6cbf18004491fd0cfcf5ea168480c33f22d not found: ID does not exist" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.382522 4771 scope.go:117] "RemoveContainer" containerID="024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e" Feb 19 23:05:12 crc kubenswrapper[4771]: E0219 23:05:12.382826 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e\": container with ID starting with 024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e not found: ID does not exist" containerID="024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.382876 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e"} err="failed to get container status \"024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e\": rpc error: code = NotFound desc = could not find container \"024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e\": container with ID starting with 024a943e606a974aa77c160f456bcde06f921fb72fc7c81c6d251a98b4d1044e not found: ID does not exist" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.451498 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b96de645-4069-45d8-957c-b89c558a1ff8" path="/var/lib/kubelet/pods/b96de645-4069-45d8-957c-b89c558a1ff8/volumes" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.956781 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.957043 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.957088 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.957748 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2b2a4906ade6d26077f231564fac49a46292a6f1358314d170979c9e439e6b3"} 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:05:12 crc kubenswrapper[4771]: I0219 23:05:12.957801 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://c2b2a4906ade6d26077f231564fac49a46292a6f1358314d170979c9e439e6b3" gracePeriod=600 Feb 19 23:05:13 crc kubenswrapper[4771]: I0219 23:05:13.319993 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="c2b2a4906ade6d26077f231564fac49a46292a6f1358314d170979c9e439e6b3" exitCode=0 Feb 19 23:05:13 crc kubenswrapper[4771]: I0219 23:05:13.320172 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"c2b2a4906ade6d26077f231564fac49a46292a6f1358314d170979c9e439e6b3"} Feb 19 23:05:13 crc kubenswrapper[4771]: I0219 23:05:13.320297 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277"} Feb 19 23:05:13 crc kubenswrapper[4771]: I0219 23:05:13.320320 4771 scope.go:117] "RemoveContainer" containerID="c0fdb551203f16b786cf2592f76ac5cef31230efe5bd2be4ed85d12adef11a41" Feb 19 23:05:35 crc kubenswrapper[4771]: I0219 23:05:35.825110 4771 scope.go:117] "RemoveContainer" containerID="321262661da65fb93b64589983526499089a587b4459fb2f5f71dbb0115041aa" Feb 19 23:05:37 crc kubenswrapper[4771]: I0219 23:05:37.607330 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:05:37 crc kubenswrapper[4771]: I0219 23:05:37.609412 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5ff55dfd66-b9dhc" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.529920 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vwf8n"] Feb 19 23:06:02 crc kubenswrapper[4771]: E0219 23:06:02.531388 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96de645-4069-45d8-957c-b89c558a1ff8" containerName="dnsmasq-dns" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.531462 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96de645-4069-45d8-957c-b89c558a1ff8" containerName="dnsmasq-dns" Feb 19 23:06:02 crc kubenswrapper[4771]: E0219 23:06:02.531539 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96de645-4069-45d8-957c-b89c558a1ff8" containerName="init" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.531600 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96de645-4069-45d8-957c-b89c558a1ff8" containerName="init" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.531808 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96de645-4069-45d8-957c-b89c558a1ff8" containerName="dnsmasq-dns" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.532392 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.544562 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vwf8n"] Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.613038 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9711ee8d-1ba8-43b1-ac70-c07eda069724-operator-scripts\") pod \"nova-api-db-create-vwf8n\" (UID: \"9711ee8d-1ba8-43b1-ac70-c07eda069724\") " pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.613276 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxm48\" (UniqueName: \"kubernetes.io/projected/9711ee8d-1ba8-43b1-ac70-c07eda069724-kube-api-access-hxm48\") pod \"nova-api-db-create-vwf8n\" (UID: \"9711ee8d-1ba8-43b1-ac70-c07eda069724\") " pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.619453 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-ccv4j"] Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.620461 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.636165 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ccv4j"] Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.715349 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnsw5\" (UniqueName: \"kubernetes.io/projected/7009bd32-7008-4634-8fb5-650244264fbf-kube-api-access-qnsw5\") pod \"nova-cell0-db-create-ccv4j\" (UID: \"7009bd32-7008-4634-8fb5-650244264fbf\") " pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.715436 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9711ee8d-1ba8-43b1-ac70-c07eda069724-operator-scripts\") pod \"nova-api-db-create-vwf8n\" (UID: \"9711ee8d-1ba8-43b1-ac70-c07eda069724\") " pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.715499 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7009bd32-7008-4634-8fb5-650244264fbf-operator-scripts\") pod \"nova-cell0-db-create-ccv4j\" (UID: \"7009bd32-7008-4634-8fb5-650244264fbf\") " pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.715522 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxm48\" (UniqueName: \"kubernetes.io/projected/9711ee8d-1ba8-43b1-ac70-c07eda069724-kube-api-access-hxm48\") pod \"nova-api-db-create-vwf8n\" (UID: \"9711ee8d-1ba8-43b1-ac70-c07eda069724\") " pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.716278 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9711ee8d-1ba8-43b1-ac70-c07eda069724-operator-scripts\") pod \"nova-api-db-create-vwf8n\" (UID: \"9711ee8d-1ba8-43b1-ac70-c07eda069724\") " pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.735424 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ae92-account-create-update-f86k2"] Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.740077 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.749429 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.760411 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxm48\" (UniqueName: \"kubernetes.io/projected/9711ee8d-1ba8-43b1-ac70-c07eda069724-kube-api-access-hxm48\") pod \"nova-api-db-create-vwf8n\" (UID: \"9711ee8d-1ba8-43b1-ac70-c07eda069724\") " pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.762995 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ae92-account-create-update-f86k2"] Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.818601 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56prq\" (UniqueName: \"kubernetes.io/projected/70523008-772b-455a-9931-8ed6ea6dc6b0-kube-api-access-56prq\") pod \"nova-api-ae92-account-create-update-f86k2\" (UID: \"70523008-772b-455a-9931-8ed6ea6dc6b0\") " pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.818708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7009bd32-7008-4634-8fb5-650244264fbf-operator-scripts\") pod \"nova-cell0-db-create-ccv4j\" (UID: \"7009bd32-7008-4634-8fb5-650244264fbf\") " pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.818746 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70523008-772b-455a-9931-8ed6ea6dc6b0-operator-scripts\") pod \"nova-api-ae92-account-create-update-f86k2\" (UID: \"70523008-772b-455a-9931-8ed6ea6dc6b0\") " pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.818986 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnsw5\" (UniqueName: \"kubernetes.io/projected/7009bd32-7008-4634-8fb5-650244264fbf-kube-api-access-qnsw5\") pod \"nova-cell0-db-create-ccv4j\" (UID: \"7009bd32-7008-4634-8fb5-650244264fbf\") " pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.819607 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7009bd32-7008-4634-8fb5-650244264fbf-operator-scripts\") pod \"nova-cell0-db-create-ccv4j\" (UID: \"7009bd32-7008-4634-8fb5-650244264fbf\") " pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.831604 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bm85p"] Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.832753 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.838407 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnsw5\" (UniqueName: \"kubernetes.io/projected/7009bd32-7008-4634-8fb5-650244264fbf-kube-api-access-qnsw5\") pod \"nova-cell0-db-create-ccv4j\" (UID: \"7009bd32-7008-4634-8fb5-650244264fbf\") " pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.847631 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bm85p"] Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.890477 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.920682 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70523008-772b-455a-9931-8ed6ea6dc6b0-operator-scripts\") pod \"nova-api-ae92-account-create-update-f86k2\" (UID: \"70523008-772b-455a-9931-8ed6ea6dc6b0\") " pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.921128 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn4xb\" (UniqueName: \"kubernetes.io/projected/59024c00-6963-4ade-bff9-6a0b81a98369-kube-api-access-tn4xb\") pod \"nova-cell1-db-create-bm85p\" (UID: \"59024c00-6963-4ade-bff9-6a0b81a98369\") " pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.921194 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56prq\" (UniqueName: \"kubernetes.io/projected/70523008-772b-455a-9931-8ed6ea6dc6b0-kube-api-access-56prq\") pod \"nova-api-ae92-account-create-update-f86k2\" (UID: 
\"70523008-772b-455a-9931-8ed6ea6dc6b0\") " pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.921249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59024c00-6963-4ade-bff9-6a0b81a98369-operator-scripts\") pod \"nova-cell1-db-create-bm85p\" (UID: \"59024c00-6963-4ade-bff9-6a0b81a98369\") " pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.922052 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70523008-772b-455a-9931-8ed6ea6dc6b0-operator-scripts\") pod \"nova-api-ae92-account-create-update-f86k2\" (UID: \"70523008-772b-455a-9931-8ed6ea6dc6b0\") " pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.933632 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2d41-account-create-update-flb7v"] Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.934783 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.936344 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.940243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56prq\" (UniqueName: \"kubernetes.io/projected/70523008-772b-455a-9931-8ed6ea6dc6b0-kube-api-access-56prq\") pod \"nova-api-ae92-account-create-update-f86k2\" (UID: \"70523008-772b-455a-9931-8ed6ea6dc6b0\") " pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.944991 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 23:06:02 crc kubenswrapper[4771]: I0219 23:06:02.947311 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2d41-account-create-update-flb7v"] Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.022724 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-operator-scripts\") pod \"nova-cell0-2d41-account-create-update-flb7v\" (UID: \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\") " pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.022833 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn4xb\" (UniqueName: \"kubernetes.io/projected/59024c00-6963-4ade-bff9-6a0b81a98369-kube-api-access-tn4xb\") pod \"nova-cell1-db-create-bm85p\" (UID: \"59024c00-6963-4ade-bff9-6a0b81a98369\") " pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.022973 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4n6k\" (UniqueName: \"kubernetes.io/projected/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-kube-api-access-f4n6k\") pod 
\"nova-cell0-2d41-account-create-update-flb7v\" (UID: \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\") " pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.023058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59024c00-6963-4ade-bff9-6a0b81a98369-operator-scripts\") pod \"nova-cell1-db-create-bm85p\" (UID: \"59024c00-6963-4ade-bff9-6a0b81a98369\") " pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.024541 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59024c00-6963-4ade-bff9-6a0b81a98369-operator-scripts\") pod \"nova-cell1-db-create-bm85p\" (UID: \"59024c00-6963-4ade-bff9-6a0b81a98369\") " pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.040630 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn4xb\" (UniqueName: \"kubernetes.io/projected/59024c00-6963-4ade-bff9-6a0b81a98369-kube-api-access-tn4xb\") pod \"nova-cell1-db-create-bm85p\" (UID: \"59024c00-6963-4ade-bff9-6a0b81a98369\") " pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.113217 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.124534 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4n6k\" (UniqueName: \"kubernetes.io/projected/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-kube-api-access-f4n6k\") pod \"nova-cell0-2d41-account-create-update-flb7v\" (UID: \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\") " pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.124613 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-operator-scripts\") pod \"nova-cell0-2d41-account-create-update-flb7v\" (UID: \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\") " pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.125365 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-operator-scripts\") pod \"nova-cell0-2d41-account-create-update-flb7v\" (UID: \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\") " pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.154670 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4n6k\" (UniqueName: \"kubernetes.io/projected/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-kube-api-access-f4n6k\") pod \"nova-cell0-2d41-account-create-update-flb7v\" (UID: \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\") " pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.154741 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-03a8-account-create-update-6fqt6"] Feb 19 23:06:03 crc kubenswrapper[4771]: 
I0219 23:06:03.156464 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.159258 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.164765 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-03a8-account-create-update-6fqt6"] Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.185879 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.228072 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65078338-c90f-4ad7-b419-c72272042cde-operator-scripts\") pod \"nova-cell1-03a8-account-create-update-6fqt6\" (UID: \"65078338-c90f-4ad7-b419-c72272042cde\") " pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.228295 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tv9d\" (UniqueName: \"kubernetes.io/projected/65078338-c90f-4ad7-b419-c72272042cde-kube-api-access-4tv9d\") pod \"nova-cell1-03a8-account-create-update-6fqt6\" (UID: \"65078338-c90f-4ad7-b419-c72272042cde\") " pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.330071 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65078338-c90f-4ad7-b419-c72272042cde-operator-scripts\") pod \"nova-cell1-03a8-account-create-update-6fqt6\" (UID: \"65078338-c90f-4ad7-b419-c72272042cde\") " 
pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.330186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tv9d\" (UniqueName: \"kubernetes.io/projected/65078338-c90f-4ad7-b419-c72272042cde-kube-api-access-4tv9d\") pod \"nova-cell1-03a8-account-create-update-6fqt6\" (UID: \"65078338-c90f-4ad7-b419-c72272042cde\") " pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.330858 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65078338-c90f-4ad7-b419-c72272042cde-operator-scripts\") pod \"nova-cell1-03a8-account-create-update-6fqt6\" (UID: \"65078338-c90f-4ad7-b419-c72272042cde\") " pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.339409 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vwf8n"] Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.343875 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.345681 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tv9d\" (UniqueName: \"kubernetes.io/projected/65078338-c90f-4ad7-b419-c72272042cde-kube-api-access-4tv9d\") pod \"nova-cell1-03a8-account-create-update-6fqt6\" (UID: \"65078338-c90f-4ad7-b419-c72272042cde\") " pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.442364 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ccv4j"] Feb 19 23:06:03 crc kubenswrapper[4771]: W0219 23:06:03.460965 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7009bd32_7008_4634_8fb5_650244264fbf.slice/crio-a291e90cf4c3f2f2006426200e2df0938a7d112df7d36e53972787a57fe1b1ce WatchSource:0}: Error finding container a291e90cf4c3f2f2006426200e2df0938a7d112df7d36e53972787a57fe1b1ce: Status 404 returned error can't find the container with id a291e90cf4c3f2f2006426200e2df0938a7d112df7d36e53972787a57fe1b1ce Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.472452 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.640923 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ae92-account-create-update-f86k2"] Feb 19 23:06:03 crc kubenswrapper[4771]: W0219 23:06:03.650218 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70523008_772b_455a_9931_8ed6ea6dc6b0.slice/crio-655d082bc92bfce580bd9ffe368388f5e394f50d7ea256054a83f6abeb996089 WatchSource:0}: Error finding container 655d082bc92bfce580bd9ffe368388f5e394f50d7ea256054a83f6abeb996089: Status 404 returned error can't find the container with id 655d082bc92bfce580bd9ffe368388f5e394f50d7ea256054a83f6abeb996089 Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.689317 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bm85p"] Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.804570 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2d41-account-create-update-flb7v"] Feb 19 23:06:03 crc kubenswrapper[4771]: W0219 23:06:03.818417 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7ca9a86_ec44_44d2_b792_c8e14f43b0cc.slice/crio-9f6682fecd851b03fb3512f112c848f55d0eb054fad0dd0e6cf349c2b211a3a6 WatchSource:0}: Error finding container 9f6682fecd851b03fb3512f112c848f55d0eb054fad0dd0e6cf349c2b211a3a6: Status 404 returned error can't find the container with id 9f6682fecd851b03fb3512f112c848f55d0eb054fad0dd0e6cf349c2b211a3a6 Feb 19 23:06:03 crc kubenswrapper[4771]: I0219 23:06:03.978798 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-03a8-account-create-update-6fqt6"] Feb 19 23:06:03 crc kubenswrapper[4771]: W0219 23:06:03.985072 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65078338_c90f_4ad7_b419_c72272042cde.slice/crio-7d58ee67c138110cc6cdf5984445e1290551ad7143f2bf063a3ddf4d75e65ef7 WatchSource:0}: Error finding container 7d58ee67c138110cc6cdf5984445e1290551ad7143f2bf063a3ddf4d75e65ef7: Status 404 returned error can't find the container with id 7d58ee67c138110cc6cdf5984445e1290551ad7143f2bf063a3ddf4d75e65ef7 Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.156117 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ae92-account-create-update-f86k2" event={"ID":"70523008-772b-455a-9931-8ed6ea6dc6b0","Type":"ContainerStarted","Data":"ff3cac6c7cdb99823f9f21144e816d2f08185e42098aa21aadb58acc71bd559b"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.156164 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ae92-account-create-update-f86k2" event={"ID":"70523008-772b-455a-9931-8ed6ea6dc6b0","Type":"ContainerStarted","Data":"655d082bc92bfce580bd9ffe368388f5e394f50d7ea256054a83f6abeb996089"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.157720 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" event={"ID":"65078338-c90f-4ad7-b419-c72272042cde","Type":"ContainerStarted","Data":"7d58ee67c138110cc6cdf5984445e1290551ad7143f2bf063a3ddf4d75e65ef7"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.159586 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2d41-account-create-update-flb7v" event={"ID":"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc","Type":"ContainerStarted","Data":"9f6682fecd851b03fb3512f112c848f55d0eb054fad0dd0e6cf349c2b211a3a6"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.163115 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bm85p" 
event={"ID":"59024c00-6963-4ade-bff9-6a0b81a98369","Type":"ContainerStarted","Data":"13aac0fa1273c849d61dda255befe006e51a51e3cb45e5a3ce32d2f1fd1345e4"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.163150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bm85p" event={"ID":"59024c00-6963-4ade-bff9-6a0b81a98369","Type":"ContainerStarted","Data":"5b0d62a260c8093a453f98adef5f98da3481d9e3b0c4379aa3c6611398e325c6"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.166661 4771 generic.go:334] "Generic (PLEG): container finished" podID="7009bd32-7008-4634-8fb5-650244264fbf" containerID="ce210e944b894fb5354050822e3053e5657f70859f0203fdc6e4af88ec19a242" exitCode=0 Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.166789 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ccv4j" event={"ID":"7009bd32-7008-4634-8fb5-650244264fbf","Type":"ContainerDied","Data":"ce210e944b894fb5354050822e3053e5657f70859f0203fdc6e4af88ec19a242"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.166964 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ccv4j" event={"ID":"7009bd32-7008-4634-8fb5-650244264fbf","Type":"ContainerStarted","Data":"a291e90cf4c3f2f2006426200e2df0938a7d112df7d36e53972787a57fe1b1ce"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.171593 4771 generic.go:334] "Generic (PLEG): container finished" podID="9711ee8d-1ba8-43b1-ac70-c07eda069724" containerID="6768f57661f1581c1f917fdd40b1309b3e1434312a68016421a19bf86837b3c3" exitCode=0 Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.171636 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vwf8n" event={"ID":"9711ee8d-1ba8-43b1-ac70-c07eda069724","Type":"ContainerDied","Data":"6768f57661f1581c1f917fdd40b1309b3e1434312a68016421a19bf86837b3c3"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.171660 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vwf8n" event={"ID":"9711ee8d-1ba8-43b1-ac70-c07eda069724","Type":"ContainerStarted","Data":"f29df7e1b0d236496a397266da25db3ec1bb664c37e23f39708fac531a838f81"} Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.181369 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-ae92-account-create-update-f86k2" podStartSLOduration=2.181349058 podStartE2EDuration="2.181349058s" podCreationTimestamp="2026-02-19 23:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:04.171159706 +0000 UTC m=+5864.442602196" watchObservedRunningTime="2026-02-19 23:06:04.181349058 +0000 UTC m=+5864.452791528" Feb 19 23:06:04 crc kubenswrapper[4771]: I0219 23:06:04.219317 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-bm85p" podStartSLOduration=2.2192992 podStartE2EDuration="2.2192992s" podCreationTimestamp="2026-02-19 23:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:04.211100671 +0000 UTC m=+5864.482543161" watchObservedRunningTime="2026-02-19 23:06:04.2192992 +0000 UTC m=+5864.490741670" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.192193 4771 generic.go:334] "Generic (PLEG): container finished" podID="65078338-c90f-4ad7-b419-c72272042cde" containerID="a7b4fc49ac89cca91c577f95232c0a2ec49f882f49fc8d7230b7beddf06d5778" exitCode=0 Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.192385 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" event={"ID":"65078338-c90f-4ad7-b419-c72272042cde","Type":"ContainerDied","Data":"a7b4fc49ac89cca91c577f95232c0a2ec49f882f49fc8d7230b7beddf06d5778"} Feb 19 23:06:05 crc kubenswrapper[4771]: 
I0219 23:06:05.195506 4771 generic.go:334] "Generic (PLEG): container finished" podID="c7ca9a86-ec44-44d2-b792-c8e14f43b0cc" containerID="33af11130d84c5f08c7b8badbd17fcea2fcf547807a7391f0de991f5658f71b1" exitCode=0 Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.195689 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2d41-account-create-update-flb7v" event={"ID":"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc","Type":"ContainerDied","Data":"33af11130d84c5f08c7b8badbd17fcea2fcf547807a7391f0de991f5658f71b1"} Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.198825 4771 generic.go:334] "Generic (PLEG): container finished" podID="59024c00-6963-4ade-bff9-6a0b81a98369" containerID="13aac0fa1273c849d61dda255befe006e51a51e3cb45e5a3ce32d2f1fd1345e4" exitCode=0 Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.198976 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bm85p" event={"ID":"59024c00-6963-4ade-bff9-6a0b81a98369","Type":"ContainerDied","Data":"13aac0fa1273c849d61dda255befe006e51a51e3cb45e5a3ce32d2f1fd1345e4"} Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.201746 4771 generic.go:334] "Generic (PLEG): container finished" podID="70523008-772b-455a-9931-8ed6ea6dc6b0" containerID="ff3cac6c7cdb99823f9f21144e816d2f08185e42098aa21aadb58acc71bd559b" exitCode=0 Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.201831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ae92-account-create-update-f86k2" event={"ID":"70523008-772b-455a-9931-8ed6ea6dc6b0","Type":"ContainerDied","Data":"ff3cac6c7cdb99823f9f21144e816d2f08185e42098aa21aadb58acc71bd559b"} Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.693115 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.709011 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.783334 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7009bd32-7008-4634-8fb5-650244264fbf-operator-scripts\") pod \"7009bd32-7008-4634-8fb5-650244264fbf\" (UID: \"7009bd32-7008-4634-8fb5-650244264fbf\") " Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.783392 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxm48\" (UniqueName: \"kubernetes.io/projected/9711ee8d-1ba8-43b1-ac70-c07eda069724-kube-api-access-hxm48\") pod \"9711ee8d-1ba8-43b1-ac70-c07eda069724\" (UID: \"9711ee8d-1ba8-43b1-ac70-c07eda069724\") " Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.783424 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9711ee8d-1ba8-43b1-ac70-c07eda069724-operator-scripts\") pod \"9711ee8d-1ba8-43b1-ac70-c07eda069724\" (UID: \"9711ee8d-1ba8-43b1-ac70-c07eda069724\") " Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.783722 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnsw5\" (UniqueName: \"kubernetes.io/projected/7009bd32-7008-4634-8fb5-650244264fbf-kube-api-access-qnsw5\") pod \"7009bd32-7008-4634-8fb5-650244264fbf\" (UID: \"7009bd32-7008-4634-8fb5-650244264fbf\") " Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.783806 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9711ee8d-1ba8-43b1-ac70-c07eda069724-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9711ee8d-1ba8-43b1-ac70-c07eda069724" (UID: "9711ee8d-1ba8-43b1-ac70-c07eda069724"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.783827 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7009bd32-7008-4634-8fb5-650244264fbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7009bd32-7008-4634-8fb5-650244264fbf" (UID: "7009bd32-7008-4634-8fb5-650244264fbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.784224 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7009bd32-7008-4634-8fb5-650244264fbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.784248 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9711ee8d-1ba8-43b1-ac70-c07eda069724-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.788978 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7009bd32-7008-4634-8fb5-650244264fbf-kube-api-access-qnsw5" (OuterVolumeSpecName: "kube-api-access-qnsw5") pod "7009bd32-7008-4634-8fb5-650244264fbf" (UID: "7009bd32-7008-4634-8fb5-650244264fbf"). InnerVolumeSpecName "kube-api-access-qnsw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.789072 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9711ee8d-1ba8-43b1-ac70-c07eda069724-kube-api-access-hxm48" (OuterVolumeSpecName: "kube-api-access-hxm48") pod "9711ee8d-1ba8-43b1-ac70-c07eda069724" (UID: "9711ee8d-1ba8-43b1-ac70-c07eda069724"). InnerVolumeSpecName "kube-api-access-hxm48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.886240 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxm48\" (UniqueName: \"kubernetes.io/projected/9711ee8d-1ba8-43b1-ac70-c07eda069724-kube-api-access-hxm48\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:05 crc kubenswrapper[4771]: I0219 23:06:05.886292 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnsw5\" (UniqueName: \"kubernetes.io/projected/7009bd32-7008-4634-8fb5-650244264fbf-kube-api-access-qnsw5\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.215361 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vwf8n" event={"ID":"9711ee8d-1ba8-43b1-ac70-c07eda069724","Type":"ContainerDied","Data":"f29df7e1b0d236496a397266da25db3ec1bb664c37e23f39708fac531a838f81"} Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.215409 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f29df7e1b0d236496a397266da25db3ec1bb664c37e23f39708fac531a838f81" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.215434 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vwf8n" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.217692 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ccv4j" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.221376 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ccv4j" event={"ID":"7009bd32-7008-4634-8fb5-650244264fbf","Type":"ContainerDied","Data":"a291e90cf4c3f2f2006426200e2df0938a7d112df7d36e53972787a57fe1b1ce"} Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.221432 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a291e90cf4c3f2f2006426200e2df0938a7d112df7d36e53972787a57fe1b1ce" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.466030 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.602782 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn4xb\" (UniqueName: \"kubernetes.io/projected/59024c00-6963-4ade-bff9-6a0b81a98369-kube-api-access-tn4xb\") pod \"59024c00-6963-4ade-bff9-6a0b81a98369\" (UID: \"59024c00-6963-4ade-bff9-6a0b81a98369\") " Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.602924 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59024c00-6963-4ade-bff9-6a0b81a98369-operator-scripts\") pod \"59024c00-6963-4ade-bff9-6a0b81a98369\" (UID: \"59024c00-6963-4ade-bff9-6a0b81a98369\") " Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.604076 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59024c00-6963-4ade-bff9-6a0b81a98369-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59024c00-6963-4ade-bff9-6a0b81a98369" (UID: "59024c00-6963-4ade-bff9-6a0b81a98369"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.606389 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59024c00-6963-4ade-bff9-6a0b81a98369-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.611754 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59024c00-6963-4ade-bff9-6a0b81a98369-kube-api-access-tn4xb" (OuterVolumeSpecName: "kube-api-access-tn4xb") pod "59024c00-6963-4ade-bff9-6a0b81a98369" (UID: "59024c00-6963-4ade-bff9-6a0b81a98369"). InnerVolumeSpecName "kube-api-access-tn4xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.708235 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn4xb\" (UniqueName: \"kubernetes.io/projected/59024c00-6963-4ade-bff9-6a0b81a98369-kube-api-access-tn4xb\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.711477 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.716766 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.722700 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.911078 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65078338-c90f-4ad7-b419-c72272042cde-operator-scripts\") pod \"65078338-c90f-4ad7-b419-c72272042cde\" (UID: \"65078338-c90f-4ad7-b419-c72272042cde\") " Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.911172 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tv9d\" (UniqueName: \"kubernetes.io/projected/65078338-c90f-4ad7-b419-c72272042cde-kube-api-access-4tv9d\") pod \"65078338-c90f-4ad7-b419-c72272042cde\" (UID: \"65078338-c90f-4ad7-b419-c72272042cde\") " Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.911256 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70523008-772b-455a-9931-8ed6ea6dc6b0-operator-scripts\") pod \"70523008-772b-455a-9931-8ed6ea6dc6b0\" (UID: \"70523008-772b-455a-9931-8ed6ea6dc6b0\") " Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.911280 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-operator-scripts\") pod \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\" (UID: \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\") " Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.911386 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4n6k\" (UniqueName: \"kubernetes.io/projected/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-kube-api-access-f4n6k\") pod \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\" (UID: \"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc\") " Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.911427 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-56prq\" (UniqueName: \"kubernetes.io/projected/70523008-772b-455a-9931-8ed6ea6dc6b0-kube-api-access-56prq\") pod \"70523008-772b-455a-9931-8ed6ea6dc6b0\" (UID: \"70523008-772b-455a-9931-8ed6ea6dc6b0\") " Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.912479 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70523008-772b-455a-9931-8ed6ea6dc6b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70523008-772b-455a-9931-8ed6ea6dc6b0" (UID: "70523008-772b-455a-9931-8ed6ea6dc6b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.912536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7ca9a86-ec44-44d2-b792-c8e14f43b0cc" (UID: "c7ca9a86-ec44-44d2-b792-c8e14f43b0cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.912741 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65078338-c90f-4ad7-b419-c72272042cde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65078338-c90f-4ad7-b419-c72272042cde" (UID: "65078338-c90f-4ad7-b419-c72272042cde"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.914974 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70523008-772b-455a-9931-8ed6ea6dc6b0-kube-api-access-56prq" (OuterVolumeSpecName: "kube-api-access-56prq") pod "70523008-772b-455a-9931-8ed6ea6dc6b0" (UID: "70523008-772b-455a-9931-8ed6ea6dc6b0"). 
InnerVolumeSpecName "kube-api-access-56prq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.916294 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-kube-api-access-f4n6k" (OuterVolumeSpecName: "kube-api-access-f4n6k") pod "c7ca9a86-ec44-44d2-b792-c8e14f43b0cc" (UID: "c7ca9a86-ec44-44d2-b792-c8e14f43b0cc"). InnerVolumeSpecName "kube-api-access-f4n6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:06 crc kubenswrapper[4771]: I0219 23:06:06.917202 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65078338-c90f-4ad7-b419-c72272042cde-kube-api-access-4tv9d" (OuterVolumeSpecName: "kube-api-access-4tv9d") pod "65078338-c90f-4ad7-b419-c72272042cde" (UID: "65078338-c90f-4ad7-b419-c72272042cde"). InnerVolumeSpecName "kube-api-access-4tv9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.015506 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65078338-c90f-4ad7-b419-c72272042cde-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.015562 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tv9d\" (UniqueName: \"kubernetes.io/projected/65078338-c90f-4ad7-b419-c72272042cde-kube-api-access-4tv9d\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.015585 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70523008-772b-455a-9931-8ed6ea6dc6b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.015603 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.015622 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4n6k\" (UniqueName: \"kubernetes.io/projected/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc-kube-api-access-f4n6k\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.015639 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56prq\" (UniqueName: \"kubernetes.io/projected/70523008-772b-455a-9931-8ed6ea6dc6b0-kube-api-access-56prq\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.231895 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.233843 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2d41-account-create-update-flb7v" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.231897 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-03a8-account-create-update-6fqt6" event={"ID":"65078338-c90f-4ad7-b419-c72272042cde","Type":"ContainerDied","Data":"7d58ee67c138110cc6cdf5984445e1290551ad7143f2bf063a3ddf4d75e65ef7"} Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.234322 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d58ee67c138110cc6cdf5984445e1290551ad7143f2bf063a3ddf4d75e65ef7" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.234364 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2d41-account-create-update-flb7v" event={"ID":"c7ca9a86-ec44-44d2-b792-c8e14f43b0cc","Type":"ContainerDied","Data":"9f6682fecd851b03fb3512f112c848f55d0eb054fad0dd0e6cf349c2b211a3a6"} Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.234412 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6682fecd851b03fb3512f112c848f55d0eb054fad0dd0e6cf349c2b211a3a6" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.235956 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bm85p" event={"ID":"59024c00-6963-4ade-bff9-6a0b81a98369","Type":"ContainerDied","Data":"5b0d62a260c8093a453f98adef5f98da3481d9e3b0c4379aa3c6611398e325c6"} Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.236012 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0d62a260c8093a453f98adef5f98da3481d9e3b0c4379aa3c6611398e325c6" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.236099 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bm85p" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.238054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ae92-account-create-update-f86k2" event={"ID":"70523008-772b-455a-9931-8ed6ea6dc6b0","Type":"ContainerDied","Data":"655d082bc92bfce580bd9ffe368388f5e394f50d7ea256054a83f6abeb996089"} Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.238095 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="655d082bc92bfce580bd9ffe368388f5e394f50d7ea256054a83f6abeb996089" Feb 19 23:06:07 crc kubenswrapper[4771]: I0219 23:06:07.238154 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ae92-account-create-update-f86k2" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.173369 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fxqj4"] Feb 19 23:06:08 crc kubenswrapper[4771]: E0219 23:06:08.174248 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59024c00-6963-4ade-bff9-6a0b81a98369" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174274 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="59024c00-6963-4ade-bff9-6a0b81a98369" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: E0219 23:06:08.174291 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70523008-772b-455a-9931-8ed6ea6dc6b0" containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174303 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="70523008-772b-455a-9931-8ed6ea6dc6b0" containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: E0219 23:06:08.174331 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65078338-c90f-4ad7-b419-c72272042cde" 
containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174344 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="65078338-c90f-4ad7-b419-c72272042cde" containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: E0219 23:06:08.174376 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca9a86-ec44-44d2-b792-c8e14f43b0cc" containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174389 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca9a86-ec44-44d2-b792-c8e14f43b0cc" containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: E0219 23:06:08.174408 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9711ee8d-1ba8-43b1-ac70-c07eda069724" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174419 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9711ee8d-1ba8-43b1-ac70-c07eda069724" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: E0219 23:06:08.174432 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7009bd32-7008-4634-8fb5-650244264fbf" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174444 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7009bd32-7008-4634-8fb5-650244264fbf" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174721 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7009bd32-7008-4634-8fb5-650244264fbf" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174761 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca9a86-ec44-44d2-b792-c8e14f43b0cc" containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174788 4771 
memory_manager.go:354] "RemoveStaleState removing state" podUID="59024c00-6963-4ade-bff9-6a0b81a98369" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174809 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9711ee8d-1ba8-43b1-ac70-c07eda069724" containerName="mariadb-database-create" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174827 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="70523008-772b-455a-9931-8ed6ea6dc6b0" containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.174847 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="65078338-c90f-4ad7-b419-c72272042cde" containerName="mariadb-account-create-update" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.175848 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.178466 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wjvhb" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.178918 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.179292 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.180480 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fxqj4"] Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.255061 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: 
\"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.255121 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-scripts\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.255151 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wsp2\" (UniqueName: \"kubernetes.io/projected/71d642f3-ad84-4cc5-b273-e790472f40fe-kube-api-access-8wsp2\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.255193 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-config-data\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.357510 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.357568 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-scripts\") pod 
\"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.357601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wsp2\" (UniqueName: \"kubernetes.io/projected/71d642f3-ad84-4cc5-b273-e790472f40fe-kube-api-access-8wsp2\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.357654 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-config-data\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.364897 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.368254 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-config-data\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.370255 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-scripts\") pod 
\"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.375249 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wsp2\" (UniqueName: \"kubernetes.io/projected/71d642f3-ad84-4cc5-b273-e790472f40fe-kube-api-access-8wsp2\") pod \"nova-cell0-conductor-db-sync-fxqj4\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:08 crc kubenswrapper[4771]: I0219 23:06:08.498285 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:09 crc kubenswrapper[4771]: I0219 23:06:09.023397 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fxqj4"] Feb 19 23:06:09 crc kubenswrapper[4771]: I0219 23:06:09.259339 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" event={"ID":"71d642f3-ad84-4cc5-b273-e790472f40fe","Type":"ContainerStarted","Data":"5776aa983426e70ecaeb5a0ee1155a2d0bb3c9d678b8871b4bb991d51a156be1"} Feb 19 23:06:09 crc kubenswrapper[4771]: I0219 23:06:09.259742 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" event={"ID":"71d642f3-ad84-4cc5-b273-e790472f40fe","Type":"ContainerStarted","Data":"9eb3c8c560c61a139915c5f08d3afd6c936f489cb6b8a3e4d0133d696a218370"} Feb 19 23:06:09 crc kubenswrapper[4771]: I0219 23:06:09.279516 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" podStartSLOduration=1.279491704 podStartE2EDuration="1.279491704s" podCreationTimestamp="2026-02-19 23:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
23:06:09.27525022 +0000 UTC m=+5869.546692700" watchObservedRunningTime="2026-02-19 23:06:09.279491704 +0000 UTC m=+5869.550934174" Feb 19 23:06:14 crc kubenswrapper[4771]: I0219 23:06:14.329694 4771 generic.go:334] "Generic (PLEG): container finished" podID="71d642f3-ad84-4cc5-b273-e790472f40fe" containerID="5776aa983426e70ecaeb5a0ee1155a2d0bb3c9d678b8871b4bb991d51a156be1" exitCode=0 Feb 19 23:06:14 crc kubenswrapper[4771]: I0219 23:06:14.329772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" event={"ID":"71d642f3-ad84-4cc5-b273-e790472f40fe","Type":"ContainerDied","Data":"5776aa983426e70ecaeb5a0ee1155a2d0bb3c9d678b8871b4bb991d51a156be1"} Feb 19 23:06:15 crc kubenswrapper[4771]: I0219 23:06:15.846955 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:15 crc kubenswrapper[4771]: I0219 23:06:15.961878 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-combined-ca-bundle\") pod \"71d642f3-ad84-4cc5-b273-e790472f40fe\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " Feb 19 23:06:15 crc kubenswrapper[4771]: I0219 23:06:15.961960 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wsp2\" (UniqueName: \"kubernetes.io/projected/71d642f3-ad84-4cc5-b273-e790472f40fe-kube-api-access-8wsp2\") pod \"71d642f3-ad84-4cc5-b273-e790472f40fe\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " Feb 19 23:06:15 crc kubenswrapper[4771]: I0219 23:06:15.962217 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-config-data\") pod \"71d642f3-ad84-4cc5-b273-e790472f40fe\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " Feb 19 23:06:15 crc 
kubenswrapper[4771]: I0219 23:06:15.962442 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-scripts\") pod \"71d642f3-ad84-4cc5-b273-e790472f40fe\" (UID: \"71d642f3-ad84-4cc5-b273-e790472f40fe\") " Feb 19 23:06:15 crc kubenswrapper[4771]: I0219 23:06:15.968319 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-scripts" (OuterVolumeSpecName: "scripts") pod "71d642f3-ad84-4cc5-b273-e790472f40fe" (UID: "71d642f3-ad84-4cc5-b273-e790472f40fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:15 crc kubenswrapper[4771]: I0219 23:06:15.993447 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d642f3-ad84-4cc5-b273-e790472f40fe-kube-api-access-8wsp2" (OuterVolumeSpecName: "kube-api-access-8wsp2") pod "71d642f3-ad84-4cc5-b273-e790472f40fe" (UID: "71d642f3-ad84-4cc5-b273-e790472f40fe"). InnerVolumeSpecName "kube-api-access-8wsp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.007991 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-config-data" (OuterVolumeSpecName: "config-data") pod "71d642f3-ad84-4cc5-b273-e790472f40fe" (UID: "71d642f3-ad84-4cc5-b273-e790472f40fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.009650 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71d642f3-ad84-4cc5-b273-e790472f40fe" (UID: "71d642f3-ad84-4cc5-b273-e790472f40fe"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.066754 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.066789 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.066802 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d642f3-ad84-4cc5-b273-e790472f40fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.066816 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wsp2\" (UniqueName: \"kubernetes.io/projected/71d642f3-ad84-4cc5-b273-e790472f40fe-kube-api-access-8wsp2\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.356641 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" event={"ID":"71d642f3-ad84-4cc5-b273-e790472f40fe","Type":"ContainerDied","Data":"9eb3c8c560c61a139915c5f08d3afd6c936f489cb6b8a3e4d0133d696a218370"} Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.356680 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb3c8c560c61a139915c5f08d3afd6c936f489cb6b8a3e4d0133d696a218370" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.356782 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fxqj4" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.464887 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:06:16 crc kubenswrapper[4771]: E0219 23:06:16.465649 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d642f3-ad84-4cc5-b273-e790472f40fe" containerName="nova-cell0-conductor-db-sync" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.465687 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d642f3-ad84-4cc5-b273-e790472f40fe" containerName="nova-cell0-conductor-db-sync" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.466111 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="71d642f3-ad84-4cc5-b273-e790472f40fe" containerName="nova-cell0-conductor-db-sync" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.467330 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.469995 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wjvhb" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.471062 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.478649 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.578647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttt98\" (UniqueName: \"kubernetes.io/projected/cb4fdebe-aba4-4847-ba12-c20a793ca719-kube-api-access-ttt98\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc 
kubenswrapper[4771]: I0219 23:06:16.579049 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.579106 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.681265 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.681739 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttt98\" (UniqueName: \"kubernetes.io/projected/cb4fdebe-aba4-4847-ba12-c20a793ca719-kube-api-access-ttt98\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.681810 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.685612 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.687841 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.698346 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttt98\" (UniqueName: \"kubernetes.io/projected/cb4fdebe-aba4-4847-ba12-c20a793ca719-kube-api-access-ttt98\") pod \"nova-cell0-conductor-0\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:16 crc kubenswrapper[4771]: I0219 23:06:16.803370 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:17 crc kubenswrapper[4771]: I0219 23:06:17.314596 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:06:17 crc kubenswrapper[4771]: I0219 23:06:17.369712 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cb4fdebe-aba4-4847-ba12-c20a793ca719","Type":"ContainerStarted","Data":"2f855b1f474a3afda26bb4b3c15130befb5929b6908ab3f9a55768143388e8ec"} Feb 19 23:06:18 crc kubenswrapper[4771]: I0219 23:06:18.385627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cb4fdebe-aba4-4847-ba12-c20a793ca719","Type":"ContainerStarted","Data":"a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74"} Feb 19 23:06:18 crc kubenswrapper[4771]: I0219 23:06:18.387301 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:18 crc kubenswrapper[4771]: I0219 23:06:18.427451 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.427423278 podStartE2EDuration="2.427423278s" podCreationTimestamp="2026-02-19 23:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:18.408085862 +0000 UTC m=+5878.679528372" watchObservedRunningTime="2026-02-19 23:06:18.427423278 +0000 UTC m=+5878.698865778" Feb 19 23:06:26 crc kubenswrapper[4771]: I0219 23:06:26.849726 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.404843 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mvzsn"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.405939 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.408583 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.412198 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.418857 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvzsn"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.515944 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s76b\" (UniqueName: \"kubernetes.io/projected/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-kube-api-access-8s76b\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.516059 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.516550 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-scripts\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.516703 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-config-data\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.559762 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.560778 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.569413 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.603378 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.618066 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.618148 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-scripts\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.618189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-config-data\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: 
\"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.618275 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s76b\" (UniqueName: \"kubernetes.io/projected/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-kube-api-access-8s76b\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.635830 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-config-data\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.648946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-scripts\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.667167 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.679682 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s76b\" (UniqueName: \"kubernetes.io/projected/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-kube-api-access-8s76b\") pod \"nova-cell0-cell-mapping-mvzsn\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " 
pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.690376 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.694770 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.703309 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.719922 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc9tb\" (UniqueName: \"kubernetes.io/projected/4fc5036c-cb17-4add-aa63-cac9dc2f6835-kube-api-access-fc9tb\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.720032 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.720084 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.723280 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.727480 4771 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.740598 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.742126 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.756597 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.797400 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.838825 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs64t\" (UniqueName: \"kubernetes.io/projected/f83e63be-3e95-41da-9028-37602da9626f-kube-api-access-gs64t\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.838986 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc9tb\" (UniqueName: \"kubernetes.io/projected/4fc5036c-cb17-4add-aa63-cac9dc2f6835-kube-api-access-fc9tb\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.855627 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-config-data\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.855650 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-config-data\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.855787 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfpw9\" (UniqueName: \"kubernetes.io/projected/90d936ae-5756-4ef9-adad-c80bc3e650c6-kube-api-access-lfpw9\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.855828 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.855927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90d936ae-5756-4ef9-adad-c80bc3e650c6-logs\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.855985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.856014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.856089 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.887663 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc9tb\" (UniqueName: \"kubernetes.io/projected/4fc5036c-cb17-4add-aa63-cac9dc2f6835-kube-api-access-fc9tb\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.892694 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.924879 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.926654 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.960063 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90d936ae-5756-4ef9-adad-c80bc3e650c6-logs\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.960151 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.960191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.960238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs64t\" (UniqueName: \"kubernetes.io/projected/f83e63be-3e95-41da-9028-37602da9626f-kube-api-access-gs64t\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.960329 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-config-data\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.960355 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-config-data\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.960410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfpw9\" (UniqueName: \"kubernetes.io/projected/90d936ae-5756-4ef9-adad-c80bc3e650c6-kube-api-access-lfpw9\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.961107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90d936ae-5756-4ef9-adad-c80bc3e650c6-logs\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.976667 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-config-data\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.977203 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-config-data\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.979707 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " 
pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.988679 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.990173 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.990537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs64t\" (UniqueName: \"kubernetes.io/projected/f83e63be-3e95-41da-9028-37602da9626f-kube-api-access-gs64t\") pod \"nova-scheduler-0\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " pod="openstack/nova-scheduler-0" Feb 19 23:06:27 crc kubenswrapper[4771]: I0219 23:06:27.994549 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfpw9\" (UniqueName: \"kubernetes.io/projected/90d936ae-5756-4ef9-adad-c80bc3e650c6-kube-api-access-lfpw9\") pod \"nova-metadata-0\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") " pod="openstack/nova-metadata-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.014800 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.044201 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.051879 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.060082 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.068581 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57b64ddc7-k89m4"] Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.079396 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b64ddc7-k89m4"] Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.079498 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.083732 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-config-data\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.083848 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7lnc\" (UniqueName: \"kubernetes.io/projected/ff0c5ddb-a9df-4215-898c-79bde7eb584a-kube-api-access-j7lnc\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.083938 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0c5ddb-a9df-4215-898c-79bde7eb584a-logs\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc 
kubenswrapper[4771]: I0219 23:06:28.084002 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.185587 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0c5ddb-a9df-4215-898c-79bde7eb584a-logs\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.185659 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-sb\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.185710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.185791 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4wd\" (UniqueName: \"kubernetes.io/projected/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-kube-api-access-pt4wd\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.185868 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-config\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.185920 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-config-data\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.185979 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-nb\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.186009 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-dns-svc\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.186058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7lnc\" (UniqueName: \"kubernetes.io/projected/ff0c5ddb-a9df-4215-898c-79bde7eb584a-kube-api-access-j7lnc\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.187065 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ff0c5ddb-a9df-4215-898c-79bde7eb584a-logs\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.194108 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-config-data\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.195036 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.206798 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7lnc\" (UniqueName: \"kubernetes.io/projected/ff0c5ddb-a9df-4215-898c-79bde7eb584a-kube-api-access-j7lnc\") pod \"nova-api-0\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " pod="openstack/nova-api-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.216696 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.287457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4wd\" (UniqueName: \"kubernetes.io/projected/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-kube-api-access-pt4wd\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.287548 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-config\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.287637 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-nb\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.287690 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-dns-svc\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.287756 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-sb\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc 
kubenswrapper[4771]: I0219 23:06:28.288535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-config\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.288649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-nb\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.289065 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-dns-svc\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.289690 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-sb\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.302585 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4wd\" (UniqueName: \"kubernetes.io/projected/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-kube-api-access-pt4wd\") pod \"dnsmasq-dns-57b64ddc7-k89m4\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.407471 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.451364 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvzsn"]
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.470792 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.495198 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvzsn" event={"ID":"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303","Type":"ContainerStarted","Data":"3211492c140bcb924903767b00c89c43309332acac8ab648c96c971ca4b50638"}
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.564325 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:06:28 crc kubenswrapper[4771]: W0219 23:06:28.613257 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fc5036c_cb17_4add_aa63_cac9dc2f6835.slice/crio-2603458556b2a0ab838a318de562e2d9765a83c0c5d6112d209b825a6fa936f8 WatchSource:0}: Error finding container 2603458556b2a0ab838a318de562e2d9765a83c0c5d6112d209b825a6fa936f8: Status 404 returned error can't find the container with id 2603458556b2a0ab838a318de562e2d9765a83c0c5d6112d209b825a6fa936f8
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.628446 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggftl"]
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.629669 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.633273 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.634491 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.636260 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggftl"]
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.674399 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.703394 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.703578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5nvh\" (UniqueName: \"kubernetes.io/projected/93356608-bac2-4fee-bcf1-e4ec814ddfe4-kube-api-access-b5nvh\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.703831 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-scripts\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.705180 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-config-data\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.750089 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 23:06:28 crc kubenswrapper[4771]: W0219 23:06:28.774312 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90d936ae_5756_4ef9_adad_c80bc3e650c6.slice/crio-7b09a933e2240ecae6508795b3f281601f2585ca1387b652d01d64b245b97e66 WatchSource:0}: Error finding container 7b09a933e2240ecae6508795b3f281601f2585ca1387b652d01d64b245b97e66: Status 404 returned error can't find the container with id 7b09a933e2240ecae6508795b3f281601f2585ca1387b652d01d64b245b97e66
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.808744 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.808799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5nvh\" (UniqueName: \"kubernetes.io/projected/93356608-bac2-4fee-bcf1-e4ec814ddfe4-kube-api-access-b5nvh\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.808897 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-scripts\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.808959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-config-data\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.812721 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-config-data\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.812962 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.813421 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-scripts\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.828468 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5nvh\" (UniqueName: \"kubernetes.io/projected/93356608-bac2-4fee-bcf1-e4ec814ddfe4-kube-api-access-b5nvh\") pod \"nova-cell1-conductor-db-sync-ggftl\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.875503 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ggftl"
Feb 19 23:06:28 crc kubenswrapper[4771]: I0219 23:06:28.913096 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.064132 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b64ddc7-k89m4"]
Feb 19 23:06:29 crc kubenswrapper[4771]: W0219 23:06:29.096840 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc47dc4f8_d8a0_4b02_9f9f_2ab710fdb3ac.slice/crio-93f6bb88960227fede59891ed1432dd567b41b363a1dd5d7912a50772e0590f8 WatchSource:0}: Error finding container 93f6bb88960227fede59891ed1432dd567b41b363a1dd5d7912a50772e0590f8: Status 404 returned error can't find the container with id 93f6bb88960227fede59891ed1432dd567b41b363a1dd5d7912a50772e0590f8
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.434077 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggftl"]
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.518710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvzsn" event={"ID":"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303","Type":"ContainerStarted","Data":"9200c4d3453b5ec6a86e456695e4c48f952e38fc11fc04bf30bc0f07bb9b77fa"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.525586 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4fc5036c-cb17-4add-aa63-cac9dc2f6835","Type":"ContainerStarted","Data":"7374cfa7ed1b6b6525c2ea1625afdc02ffa3905e33686815490a2327de64744f"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.525620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4fc5036c-cb17-4add-aa63-cac9dc2f6835","Type":"ContainerStarted","Data":"2603458556b2a0ab838a318de562e2d9765a83c0c5d6112d209b825a6fa936f8"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.527603 4771 generic.go:334] "Generic (PLEG): container finished" podID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" containerID="7f75ef3ce5e26e818f49f787ada5dc2ae57d2a35f5bf3c17ee280ecb3b4cd859" exitCode=0
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.527644 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" event={"ID":"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac","Type":"ContainerDied","Data":"7f75ef3ce5e26e818f49f787ada5dc2ae57d2a35f5bf3c17ee280ecb3b4cd859"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.527658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" event={"ID":"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac","Type":"ContainerStarted","Data":"93f6bb88960227fede59891ed1432dd567b41b363a1dd5d7912a50772e0590f8"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.534843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f83e63be-3e95-41da-9028-37602da9626f","Type":"ContainerStarted","Data":"42a349932485cdcd94ce9f869f64908efb19df8aaf7e2b9940e35cb36932e98d"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.534871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f83e63be-3e95-41da-9028-37602da9626f","Type":"ContainerStarted","Data":"9a6c3ab94f1e98f9bdc6e19499f3d278ced29110a9806eb559aa2889529952f3"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.539046 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff0c5ddb-a9df-4215-898c-79bde7eb584a","Type":"ContainerStarted","Data":"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.539070 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff0c5ddb-a9df-4215-898c-79bde7eb584a","Type":"ContainerStarted","Data":"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.539080 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff0c5ddb-a9df-4215-898c-79bde7eb584a","Type":"ContainerStarted","Data":"67e4383ece263401aae3fe51f7abf295e29f115f5a8836b0e0d687a2c473acea"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.540220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ggftl" event={"ID":"93356608-bac2-4fee-bcf1-e4ec814ddfe4","Type":"ContainerStarted","Data":"e6809e4ce66416cd5be5df430dc118dbc5b3aa75d96fc36d56c603ddc83dc56f"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.541621 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90d936ae-5756-4ef9-adad-c80bc3e650c6","Type":"ContainerStarted","Data":"71e3d2122f0352e0d6e5d7a3a0c4cf5eee888beaa5845d189b94ecaa3ffe5815"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.541657 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90d936ae-5756-4ef9-adad-c80bc3e650c6","Type":"ContainerStarted","Data":"82668acd3e1e3d85c5ce7476b1a25c4c4a3197e2dc810a7020f7c01737ef0209"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.541671 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90d936ae-5756-4ef9-adad-c80bc3e650c6","Type":"ContainerStarted","Data":"7b09a933e2240ecae6508795b3f281601f2585ca1387b652d01d64b245b97e66"}
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.545136 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mvzsn" podStartSLOduration=2.54511573 podStartE2EDuration="2.54511573s" podCreationTimestamp="2026-02-19 23:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:29.532835692 +0000 UTC m=+5889.804278162" watchObservedRunningTime="2026-02-19 23:06:29.54511573 +0000 UTC m=+5889.816558200"
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.570898 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.570875578 podStartE2EDuration="2.570875578s" podCreationTimestamp="2026-02-19 23:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:29.560273145 +0000 UTC m=+5889.831715625" watchObservedRunningTime="2026-02-19 23:06:29.570875578 +0000 UTC m=+5889.842318048"
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.581238 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.581221754 podStartE2EDuration="2.581221754s" podCreationTimestamp="2026-02-19 23:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:29.577053192 +0000 UTC m=+5889.848495672" watchObservedRunningTime="2026-02-19 23:06:29.581221754 +0000 UTC m=+5889.852664224"
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.599385 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.599370058 podStartE2EDuration="2.599370058s" podCreationTimestamp="2026-02-19 23:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:29.598587027 +0000 UTC m=+5889.870029497" watchObservedRunningTime="2026-02-19 23:06:29.599370058 +0000 UTC m=+5889.870812528"
Feb 19 23:06:29 crc kubenswrapper[4771]: I0219 23:06:29.623610 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.623594244 podStartE2EDuration="2.623594244s" podCreationTimestamp="2026-02-19 23:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:29.620480061 +0000 UTC m=+5889.891922531" watchObservedRunningTime="2026-02-19 23:06:29.623594244 +0000 UTC m=+5889.895036714"
Feb 19 23:06:30 crc kubenswrapper[4771]: I0219 23:06:30.554192 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" event={"ID":"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac","Type":"ContainerStarted","Data":"e60abef7e799ddede668637c96e1d038d8ca96499e37e7b44ef9bd2e9c7c3ee0"}
Feb 19 23:06:30 crc kubenswrapper[4771]: I0219 23:06:30.554428 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4"
Feb 19 23:06:30 crc kubenswrapper[4771]: I0219 23:06:30.556555 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ggftl" event={"ID":"93356608-bac2-4fee-bcf1-e4ec814ddfe4","Type":"ContainerStarted","Data":"7264dd981f72e749a7b9fd57314d94e5de584b191b6b1bb796ddc61772f0220e"}
Feb 19 23:06:30 crc kubenswrapper[4771]: I0219 23:06:30.581768 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" podStartSLOduration=3.581750046 podStartE2EDuration="3.581750046s" podCreationTimestamp="2026-02-19 23:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:30.579267109 +0000 UTC m=+5890.850709589" watchObservedRunningTime="2026-02-19 23:06:30.581750046 +0000 UTC m=+5890.853192526"
Feb 19 23:06:30 crc kubenswrapper[4771]: I0219 23:06:30.602886 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ggftl" podStartSLOduration=2.602868179 podStartE2EDuration="2.602868179s" podCreationTimestamp="2026-02-19 23:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:30.599048737 +0000 UTC m=+5890.870491207" watchObservedRunningTime="2026-02-19 23:06:30.602868179 +0000 UTC m=+5890.874310659"
Feb 19 23:06:31 crc kubenswrapper[4771]: I0219 23:06:31.861458 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 23:06:31 crc kubenswrapper[4771]: I0219 23:06:31.861945 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerName="nova-metadata-log" containerID="cri-o://82668acd3e1e3d85c5ce7476b1a25c4c4a3197e2dc810a7020f7c01737ef0209" gracePeriod=30
Feb 19 23:06:31 crc kubenswrapper[4771]: I0219 23:06:31.862571 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerName="nova-metadata-metadata" containerID="cri-o://71e3d2122f0352e0d6e5d7a3a0c4cf5eee888beaa5845d189b94ecaa3ffe5815" gracePeriod=30
Feb 19 23:06:31 crc kubenswrapper[4771]: I0219 23:06:31.906411 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:06:31 crc kubenswrapper[4771]: I0219 23:06:31.906591 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4fc5036c-cb17-4add-aa63-cac9dc2f6835" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7374cfa7ed1b6b6525c2ea1625afdc02ffa3905e33686815490a2327de64744f" gracePeriod=30
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.578973 4771 generic.go:334] "Generic (PLEG): container finished" podID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerID="71e3d2122f0352e0d6e5d7a3a0c4cf5eee888beaa5845d189b94ecaa3ffe5815" exitCode=0
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.579371 4771 generic.go:334] "Generic (PLEG): container finished" podID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerID="82668acd3e1e3d85c5ce7476b1a25c4c4a3197e2dc810a7020f7c01737ef0209" exitCode=143
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.579046 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90d936ae-5756-4ef9-adad-c80bc3e650c6","Type":"ContainerDied","Data":"71e3d2122f0352e0d6e5d7a3a0c4cf5eee888beaa5845d189b94ecaa3ffe5815"}
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.579519 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90d936ae-5756-4ef9-adad-c80bc3e650c6","Type":"ContainerDied","Data":"82668acd3e1e3d85c5ce7476b1a25c4c4a3197e2dc810a7020f7c01737ef0209"}
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.582286 4771 generic.go:334] "Generic (PLEG): container finished" podID="4fc5036c-cb17-4add-aa63-cac9dc2f6835" containerID="7374cfa7ed1b6b6525c2ea1625afdc02ffa3905e33686815490a2327de64744f" exitCode=0
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.582361 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4fc5036c-cb17-4add-aa63-cac9dc2f6835","Type":"ContainerDied","Data":"7374cfa7ed1b6b6525c2ea1625afdc02ffa3905e33686815490a2327de64744f"}
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.585784 4771 generic.go:334] "Generic (PLEG): container finished" podID="93356608-bac2-4fee-bcf1-e4ec814ddfe4" containerID="7264dd981f72e749a7b9fd57314d94e5de584b191b6b1bb796ddc61772f0220e" exitCode=0
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.585836 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ggftl" event={"ID":"93356608-bac2-4fee-bcf1-e4ec814ddfe4","Type":"ContainerDied","Data":"7264dd981f72e749a7b9fd57314d94e5de584b191b6b1bb796ddc61772f0220e"}
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.769529 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.776917 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.789976 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc9tb\" (UniqueName: \"kubernetes.io/projected/4fc5036c-cb17-4add-aa63-cac9dc2f6835-kube-api-access-fc9tb\") pod \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") "
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.790118 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-combined-ca-bundle\") pod \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") "
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.790153 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-config-data\") pod \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\" (UID: \"4fc5036c-cb17-4add-aa63-cac9dc2f6835\") "
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.796287 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc5036c-cb17-4add-aa63-cac9dc2f6835-kube-api-access-fc9tb" (OuterVolumeSpecName: "kube-api-access-fc9tb") pod "4fc5036c-cb17-4add-aa63-cac9dc2f6835" (UID: "4fc5036c-cb17-4add-aa63-cac9dc2f6835"). InnerVolumeSpecName "kube-api-access-fc9tb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.817851 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-config-data" (OuterVolumeSpecName: "config-data") pod "4fc5036c-cb17-4add-aa63-cac9dc2f6835" (UID: "4fc5036c-cb17-4add-aa63-cac9dc2f6835"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.834188 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fc5036c-cb17-4add-aa63-cac9dc2f6835" (UID: "4fc5036c-cb17-4add-aa63-cac9dc2f6835"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.892294 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-config-data\") pod \"90d936ae-5756-4ef9-adad-c80bc3e650c6\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") "
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.893344 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfpw9\" (UniqueName: \"kubernetes.io/projected/90d936ae-5756-4ef9-adad-c80bc3e650c6-kube-api-access-lfpw9\") pod \"90d936ae-5756-4ef9-adad-c80bc3e650c6\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") "
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.893452 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90d936ae-5756-4ef9-adad-c80bc3e650c6-logs\") pod \"90d936ae-5756-4ef9-adad-c80bc3e650c6\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") "
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.893557 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-combined-ca-bundle\") pod \"90d936ae-5756-4ef9-adad-c80bc3e650c6\" (UID: \"90d936ae-5756-4ef9-adad-c80bc3e650c6\") "
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.893769 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d936ae-5756-4ef9-adad-c80bc3e650c6-logs" (OuterVolumeSpecName: "logs") pod "90d936ae-5756-4ef9-adad-c80bc3e650c6" (UID: "90d936ae-5756-4ef9-adad-c80bc3e650c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.894225 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.894308 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc5036c-cb17-4add-aa63-cac9dc2f6835-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.894376 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90d936ae-5756-4ef9-adad-c80bc3e650c6-logs\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.894485 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc9tb\" (UniqueName: \"kubernetes.io/projected/4fc5036c-cb17-4add-aa63-cac9dc2f6835-kube-api-access-fc9tb\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.896304 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d936ae-5756-4ef9-adad-c80bc3e650c6-kube-api-access-lfpw9" (OuterVolumeSpecName: "kube-api-access-lfpw9") pod "90d936ae-5756-4ef9-adad-c80bc3e650c6" (UID: "90d936ae-5756-4ef9-adad-c80bc3e650c6"). InnerVolumeSpecName "kube-api-access-lfpw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.916893 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-config-data" (OuterVolumeSpecName: "config-data") pod "90d936ae-5756-4ef9-adad-c80bc3e650c6" (UID: "90d936ae-5756-4ef9-adad-c80bc3e650c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.943032 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90d936ae-5756-4ef9-adad-c80bc3e650c6" (UID: "90d936ae-5756-4ef9-adad-c80bc3e650c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.996433 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.996474 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfpw9\" (UniqueName: \"kubernetes.io/projected/90d936ae-5756-4ef9-adad-c80bc3e650c6-kube-api-access-lfpw9\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:32 crc kubenswrapper[4771]: I0219 23:06:32.996489 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d936ae-5756-4ef9-adad-c80bc3e650c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.046196 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.616099 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.616305 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"90d936ae-5756-4ef9-adad-c80bc3e650c6","Type":"ContainerDied","Data":"7b09a933e2240ecae6508795b3f281601f2585ca1387b652d01d64b245b97e66"}
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.616416 4771 scope.go:117] "RemoveContainer" containerID="71e3d2122f0352e0d6e5d7a3a0c4cf5eee888beaa5845d189b94ecaa3ffe5815"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.623656 4771 generic.go:334] "Generic (PLEG): container finished" podID="0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" containerID="9200c4d3453b5ec6a86e456695e4c48f952e38fc11fc04bf30bc0f07bb9b77fa" exitCode=0
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.623786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvzsn" event={"ID":"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303","Type":"ContainerDied","Data":"9200c4d3453b5ec6a86e456695e4c48f952e38fc11fc04bf30bc0f07bb9b77fa"}
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.627809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4fc5036c-cb17-4add-aa63-cac9dc2f6835","Type":"ContainerDied","Data":"2603458556b2a0ab838a318de562e2d9765a83c0c5d6112d209b825a6fa936f8"}
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.627834 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.659863 4771 scope.go:117] "RemoveContainer" containerID="82668acd3e1e3d85c5ce7476b1a25c4c4a3197e2dc810a7020f7c01737ef0209"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.691514 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.722166 4771 scope.go:117] "RemoveContainer" containerID="7374cfa7ed1b6b6525c2ea1625afdc02ffa3905e33686815490a2327de64744f"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.722323 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.747178 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.774582 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.788800 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 23:06:33 crc kubenswrapper[4771]: E0219 23:06:33.789233 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerName="nova-metadata-metadata"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.789248 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerName="nova-metadata-metadata"
Feb 19 23:06:33 crc kubenswrapper[4771]: E0219 23:06:33.789277 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc5036c-cb17-4add-aa63-cac9dc2f6835" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.789286 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc5036c-cb17-4add-aa63-cac9dc2f6835" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 23:06:33 crc kubenswrapper[4771]: E0219 23:06:33.789313 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerName="nova-metadata-log"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.789320 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerName="nova-metadata-log"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.789499 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerName="nova-metadata-metadata"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.789517 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc5036c-cb17-4add-aa63-cac9dc2f6835" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.789541 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" containerName="nova-metadata-log"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.791747 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.793863 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.794435 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.801990 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.803118 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.807660 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.807879 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.807934 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.812621 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.817175 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4531578e-feac-48d4-822a-b340a9bdabfd-logs\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.817308 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.817342 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqmxl\" (UniqueName: \"kubernetes.io/projected/4531578e-feac-48d4-822a-b340a9bdabfd-kube-api-access-gqmxl\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.817731 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-config-data\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.817849 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.822875 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-config-data\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919311 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919332 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4531578e-feac-48d4-822a-b340a9bdabfd-logs\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919397 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919427 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919453 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0"
Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919495 4771
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkwkj\" (UniqueName: \"kubernetes.io/projected/5be949dd-1063-4695-8a38-9b2fb21f9cb4-kube-api-access-tkwkj\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.919512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmxl\" (UniqueName: \"kubernetes.io/projected/4531578e-feac-48d4-822a-b340a9bdabfd-kube-api-access-gqmxl\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0" Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.922157 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4531578e-feac-48d4-822a-b340a9bdabfd-logs\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0" Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.932673 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0" Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.932719 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0" Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.936585 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqmxl\" (UniqueName: 
\"kubernetes.io/projected/4531578e-feac-48d4-822a-b340a9bdabfd-kube-api-access-gqmxl\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0" Feb 19 23:06:33 crc kubenswrapper[4771]: I0219 23:06:33.935753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-config-data\") pod \"nova-metadata-0\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " pod="openstack/nova-metadata-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.020835 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.020932 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.021062 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.021106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkwkj\" (UniqueName: \"kubernetes.io/projected/5be949dd-1063-4695-8a38-9b2fb21f9cb4-kube-api-access-tkwkj\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.021274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.024952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.025578 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.025604 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.034095 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5be949dd-1063-4695-8a38-9b2fb21f9cb4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc 
kubenswrapper[4771]: I0219 23:06:34.036606 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkwkj\" (UniqueName: \"kubernetes.io/projected/5be949dd-1063-4695-8a38-9b2fb21f9cb4-kube-api-access-tkwkj\") pod \"nova-cell1-novncproxy-0\" (UID: \"5be949dd-1063-4695-8a38-9b2fb21f9cb4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.103734 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ggftl" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.112739 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.127521 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.224940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-combined-ca-bundle\") pod \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.225013 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5nvh\" (UniqueName: \"kubernetes.io/projected/93356608-bac2-4fee-bcf1-e4ec814ddfe4-kube-api-access-b5nvh\") pod \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.225175 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-scripts\") pod \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " Feb 19 23:06:34 crc 
kubenswrapper[4771]: I0219 23:06:34.225289 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-config-data\") pod \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\" (UID: \"93356608-bac2-4fee-bcf1-e4ec814ddfe4\") " Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.229141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93356608-bac2-4fee-bcf1-e4ec814ddfe4-kube-api-access-b5nvh" (OuterVolumeSpecName: "kube-api-access-b5nvh") pod "93356608-bac2-4fee-bcf1-e4ec814ddfe4" (UID: "93356608-bac2-4fee-bcf1-e4ec814ddfe4"). InnerVolumeSpecName "kube-api-access-b5nvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.231399 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-scripts" (OuterVolumeSpecName: "scripts") pod "93356608-bac2-4fee-bcf1-e4ec814ddfe4" (UID: "93356608-bac2-4fee-bcf1-e4ec814ddfe4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.257792 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93356608-bac2-4fee-bcf1-e4ec814ddfe4" (UID: "93356608-bac2-4fee-bcf1-e4ec814ddfe4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.277079 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-config-data" (OuterVolumeSpecName: "config-data") pod "93356608-bac2-4fee-bcf1-e4ec814ddfe4" (UID: "93356608-bac2-4fee-bcf1-e4ec814ddfe4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.327921 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.328118 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.328162 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93356608-bac2-4fee-bcf1-e4ec814ddfe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.328177 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5nvh\" (UniqueName: \"kubernetes.io/projected/93356608-bac2-4fee-bcf1-e4ec814ddfe4-kube-api-access-b5nvh\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.445708 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fc5036c-cb17-4add-aa63-cac9dc2f6835" path="/var/lib/kubelet/pods/4fc5036c-cb17-4add-aa63-cac9dc2f6835/volumes" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.446267 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d936ae-5756-4ef9-adad-c80bc3e650c6" path="/var/lib/kubelet/pods/90d936ae-5756-4ef9-adad-c80bc3e650c6/volumes" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.598450 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:34 crc kubenswrapper[4771]: W0219 23:06:34.608270 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4531578e_feac_48d4_822a_b340a9bdabfd.slice/crio-ef8a78322f075b6f1cfadd018430dd22fb016e108ded345298816ae05b7c1650 WatchSource:0}: Error finding container ef8a78322f075b6f1cfadd018430dd22fb016e108ded345298816ae05b7c1650: Status 404 returned error can't find the container with id ef8a78322f075b6f1cfadd018430dd22fb016e108ded345298816ae05b7c1650 Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.650391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4531578e-feac-48d4-822a-b340a9bdabfd","Type":"ContainerStarted","Data":"ef8a78322f075b6f1cfadd018430dd22fb016e108ded345298816ae05b7c1650"} Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.654408 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ggftl" event={"ID":"93356608-bac2-4fee-bcf1-e4ec814ddfe4","Type":"ContainerDied","Data":"e6809e4ce66416cd5be5df430dc118dbc5b3aa75d96fc36d56c603ddc83dc56f"} Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.654439 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6809e4ce66416cd5be5df430dc118dbc5b3aa75d96fc36d56c603ddc83dc56f" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.654471 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ggftl" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.680974 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.701822 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:06:34 crc kubenswrapper[4771]: E0219 23:06:34.702507 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93356608-bac2-4fee-bcf1-e4ec814ddfe4" containerName="nova-cell1-conductor-db-sync" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.702541 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="93356608-bac2-4fee-bcf1-e4ec814ddfe4" containerName="nova-cell1-conductor-db-sync" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.702854 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="93356608-bac2-4fee-bcf1-e4ec814ddfe4" containerName="nova-cell1-conductor-db-sync" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.703896 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.709675 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.714770 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.735719 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.736319 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr5m6\" (UniqueName: \"kubernetes.io/projected/2c591c46-e4c4-4402-a868-a4d4dde101b3-kube-api-access-tr5m6\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.736361 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.838127 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc 
kubenswrapper[4771]: I0219 23:06:34.838523 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr5m6\" (UniqueName: \"kubernetes.io/projected/2c591c46-e4c4-4402-a868-a4d4dde101b3-kube-api-access-tr5m6\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.838546 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.845971 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.855820 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.874081 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr5m6\" (UniqueName: \"kubernetes.io/projected/2c591c46-e4c4-4402-a868-a4d4dde101b3-kube-api-access-tr5m6\") pod \"nova-cell1-conductor-0\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:34 crc kubenswrapper[4771]: I0219 23:06:34.986532 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.041448 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-scripts\") pod \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.041571 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s76b\" (UniqueName: \"kubernetes.io/projected/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-kube-api-access-8s76b\") pod \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.041656 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-combined-ca-bundle\") pod \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.041681 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-config-data\") pod \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\" (UID: \"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303\") " Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.049134 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-scripts" (OuterVolumeSpecName: "scripts") pod "0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" (UID: "0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.052461 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-kube-api-access-8s76b" (OuterVolumeSpecName: "kube-api-access-8s76b") pod "0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" (UID: "0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303"). InnerVolumeSpecName "kube-api-access-8s76b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.070840 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.071954 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-config-data" (OuterVolumeSpecName: "config-data") pod "0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" (UID: "0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.111251 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" (UID: "0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.144593 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.144638 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s76b\" (UniqueName: \"kubernetes.io/projected/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-kube-api-access-8s76b\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.144649 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.144658 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.517041 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:06:35 crc kubenswrapper[4771]: W0219 23:06:35.525485 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c591c46_e4c4_4402_a868_a4d4dde101b3.slice/crio-a75e073b3a072396ba96f5f0f832cc1fb30a3bccc431da10e23e686b29d3e7fb WatchSource:0}: Error finding container a75e073b3a072396ba96f5f0f832cc1fb30a3bccc431da10e23e686b29d3e7fb: Status 404 returned error can't find the container with id a75e073b3a072396ba96f5f0f832cc1fb30a3bccc431da10e23e686b29d3e7fb Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.674914 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"5be949dd-1063-4695-8a38-9b2fb21f9cb4","Type":"ContainerStarted","Data":"d69c1d7b27e4fcda0aa197b51dce3cbd5abe65f7ea24e6100e05d041d2777239"} Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.674956 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5be949dd-1063-4695-8a38-9b2fb21f9cb4","Type":"ContainerStarted","Data":"989add70d108855a1eed08146ef00a413393a8af22eebbb58ef9786bdeb74899"} Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.682399 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4531578e-feac-48d4-822a-b340a9bdabfd","Type":"ContainerStarted","Data":"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758"} Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.682427 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4531578e-feac-48d4-822a-b340a9bdabfd","Type":"ContainerStarted","Data":"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e"} Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.703584 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvzsn" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.704169 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvzsn" event={"ID":"0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303","Type":"ContainerDied","Data":"3211492c140bcb924903767b00c89c43309332acac8ab648c96c971ca4b50638"} Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.704222 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3211492c140bcb924903767b00c89c43309332acac8ab648c96c971ca4b50638" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.712621 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c591c46-e4c4-4402-a868-a4d4dde101b3","Type":"ContainerStarted","Data":"a75e073b3a072396ba96f5f0f832cc1fb30a3bccc431da10e23e686b29d3e7fb"} Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.733370 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.733306997 podStartE2EDuration="2.733306997s" podCreationTimestamp="2026-02-19 23:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:35.713837417 +0000 UTC m=+5895.985279897" watchObservedRunningTime="2026-02-19 23:06:35.733306997 +0000 UTC m=+5896.004749507" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.754158 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.754143353 podStartE2EDuration="2.754143353s" podCreationTimestamp="2026-02-19 23:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:35.744580168 +0000 UTC m=+5896.016022648" watchObservedRunningTime="2026-02-19 
23:06:35.754143353 +0000 UTC m=+5896.025585813" Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.847362 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.847634 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerName="nova-api-log" containerID="cri-o://077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb" gracePeriod=30 Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.847715 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerName="nova-api-api" containerID="cri-o://908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37" gracePeriod=30 Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.856213 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.856425 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f83e63be-3e95-41da-9028-37602da9626f" containerName="nova-scheduler-scheduler" containerID="cri-o://42a349932485cdcd94ce9f869f64908efb19df8aaf7e2b9940e35cb36932e98d" gracePeriod=30 Feb 19 23:06:35 crc kubenswrapper[4771]: I0219 23:06:35.891113 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.421356 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.474962 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-config-data\") pod \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.475102 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7lnc\" (UniqueName: \"kubernetes.io/projected/ff0c5ddb-a9df-4215-898c-79bde7eb584a-kube-api-access-j7lnc\") pod \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.475185 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-combined-ca-bundle\") pod \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.475417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0c5ddb-a9df-4215-898c-79bde7eb584a-logs\") pod \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\" (UID: \"ff0c5ddb-a9df-4215-898c-79bde7eb584a\") " Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.477268 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0c5ddb-a9df-4215-898c-79bde7eb584a-logs" (OuterVolumeSpecName: "logs") pod "ff0c5ddb-a9df-4215-898c-79bde7eb584a" (UID: "ff0c5ddb-a9df-4215-898c-79bde7eb584a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.498322 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0c5ddb-a9df-4215-898c-79bde7eb584a-kube-api-access-j7lnc" (OuterVolumeSpecName: "kube-api-access-j7lnc") pod "ff0c5ddb-a9df-4215-898c-79bde7eb584a" (UID: "ff0c5ddb-a9df-4215-898c-79bde7eb584a"). InnerVolumeSpecName "kube-api-access-j7lnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.510267 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-config-data" (OuterVolumeSpecName: "config-data") pod "ff0c5ddb-a9df-4215-898c-79bde7eb584a" (UID: "ff0c5ddb-a9df-4215-898c-79bde7eb584a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.518689 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff0c5ddb-a9df-4215-898c-79bde7eb584a" (UID: "ff0c5ddb-a9df-4215-898c-79bde7eb584a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.577626 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.577666 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7lnc\" (UniqueName: \"kubernetes.io/projected/ff0c5ddb-a9df-4215-898c-79bde7eb584a-kube-api-access-j7lnc\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.577680 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0c5ddb-a9df-4215-898c-79bde7eb584a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.577693 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0c5ddb-a9df-4215-898c-79bde7eb584a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.726444 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c591c46-e4c4-4402-a868-a4d4dde101b3","Type":"ContainerStarted","Data":"9e024790acb94f007987f7b27d3b53e1297a5172b31221baada6e80d27f6232f"} Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.726825 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.729406 4771 generic.go:334] "Generic (PLEG): container finished" podID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerID="908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37" exitCode=0 Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.729431 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerID="077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb" exitCode=143 Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.729488 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.729534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff0c5ddb-a9df-4215-898c-79bde7eb584a","Type":"ContainerDied","Data":"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37"} Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.729568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff0c5ddb-a9df-4215-898c-79bde7eb584a","Type":"ContainerDied","Data":"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb"} Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.729583 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff0c5ddb-a9df-4215-898c-79bde7eb584a","Type":"ContainerDied","Data":"67e4383ece263401aae3fe51f7abf295e29f115f5a8836b0e0d687a2c473acea"} Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.729603 4771 scope.go:117] "RemoveContainer" containerID="908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.762555 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.762510154 podStartE2EDuration="2.762510154s" podCreationTimestamp="2026-02-19 23:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:36.747702458 +0000 UTC m=+5897.019144948" watchObservedRunningTime="2026-02-19 23:06:36.762510154 +0000 UTC m=+5897.033952634" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.768222 
4771 scope.go:117] "RemoveContainer" containerID="077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.778105 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.787306 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.787758 4771 scope.go:117] "RemoveContainer" containerID="908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37" Feb 19 23:06:36 crc kubenswrapper[4771]: E0219 23:06:36.788240 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37\": container with ID starting with 908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37 not found: ID does not exist" containerID="908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.788309 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37"} err="failed to get container status \"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37\": rpc error: code = NotFound desc = could not find container \"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37\": container with ID starting with 908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37 not found: ID does not exist" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.788352 4771 scope.go:117] "RemoveContainer" containerID="077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb" Feb 19 23:06:36 crc kubenswrapper[4771]: E0219 23:06:36.797291 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb\": container with ID starting with 077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb not found: ID does not exist" containerID="077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.797363 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb"} err="failed to get container status \"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb\": rpc error: code = NotFound desc = could not find container \"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb\": container with ID starting with 077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb not found: ID does not exist" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.797406 4771 scope.go:117] "RemoveContainer" containerID="908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.797799 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37"} err="failed to get container status \"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37\": rpc error: code = NotFound desc = could not find container \"908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37\": container with ID starting with 908482fcccde638248a77f5115f261b60e56abe808dc8af66d0ab4e818bb6f37 not found: ID does not exist" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.797831 4771 scope.go:117] "RemoveContainer" containerID="077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.798926 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb"} err="failed to get container status \"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb\": rpc error: code = NotFound desc = could not find container \"077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb\": container with ID starting with 077eda76869dc14c7c503366529cfc3d8c67c3d39163d3cdc2cc24088fc59bbb not found: ID does not exist" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.812513 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:36 crc kubenswrapper[4771]: E0219 23:06:36.813069 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerName="nova-api-log" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.813090 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerName="nova-api-log" Feb 19 23:06:36 crc kubenswrapper[4771]: E0219 23:06:36.813110 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerName="nova-api-api" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.813118 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerName="nova-api-api" Feb 19 23:06:36 crc kubenswrapper[4771]: E0219 23:06:36.813148 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" containerName="nova-manage" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.813158 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" containerName="nova-manage" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.813384 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" containerName="nova-manage" Feb 19 23:06:36 crc 
kubenswrapper[4771]: I0219 23:06:36.813411 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerName="nova-api-log" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.813432 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" containerName="nova-api-api" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.815073 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.818274 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.824814 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.886680 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.886784 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-config-data\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.886818 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07425903-5e50-4567-ae7b-d87b890ff93a-logs\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 
23:06:36.887307 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwnm\" (UniqueName: \"kubernetes.io/projected/07425903-5e50-4567-ae7b-d87b890ff93a-kube-api-access-8mwnm\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.989472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.989612 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-config-data\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.989648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07425903-5e50-4567-ae7b-d87b890ff93a-logs\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.989809 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mwnm\" (UniqueName: \"kubernetes.io/projected/07425903-5e50-4567-ae7b-d87b890ff93a-kube-api-access-8mwnm\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.990691 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07425903-5e50-4567-ae7b-d87b890ff93a-logs\") pod \"nova-api-0\" (UID: 
\"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.996412 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-config-data\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:36 crc kubenswrapper[4771]: I0219 23:06:36.996816 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:37 crc kubenswrapper[4771]: I0219 23:06:37.020558 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mwnm\" (UniqueName: \"kubernetes.io/projected/07425903-5e50-4567-ae7b-d87b890ff93a-kube-api-access-8mwnm\") pod \"nova-api-0\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " pod="openstack/nova-api-0" Feb 19 23:06:37 crc kubenswrapper[4771]: I0219 23:06:37.154285 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:06:37 crc kubenswrapper[4771]: I0219 23:06:37.678156 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:37 crc kubenswrapper[4771]: I0219 23:06:37.740410 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07425903-5e50-4567-ae7b-d87b890ff93a","Type":"ContainerStarted","Data":"dc104dc61320aa360153291ca04f17e33693fa2aa7604ee0e71eb30868b62d26"} Feb 19 23:06:37 crc kubenswrapper[4771]: I0219 23:06:37.742369 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4531578e-feac-48d4-822a-b340a9bdabfd" containerName="nova-metadata-metadata" containerID="cri-o://44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758" gracePeriod=30 Feb 19 23:06:37 crc kubenswrapper[4771]: I0219 23:06:37.742318 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4531578e-feac-48d4-822a-b340a9bdabfd" containerName="nova-metadata-log" containerID="cri-o://b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e" gracePeriod=30 Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.378696 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.416993 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-nova-metadata-tls-certs\") pod \"4531578e-feac-48d4-822a-b340a9bdabfd\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.417379 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-combined-ca-bundle\") pod \"4531578e-feac-48d4-822a-b340a9bdabfd\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.425345 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4531578e-feac-48d4-822a-b340a9bdabfd-logs\") pod \"4531578e-feac-48d4-822a-b340a9bdabfd\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.425619 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqmxl\" (UniqueName: \"kubernetes.io/projected/4531578e-feac-48d4-822a-b340a9bdabfd-kube-api-access-gqmxl\") pod \"4531578e-feac-48d4-822a-b340a9bdabfd\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.425721 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-config-data\") pod \"4531578e-feac-48d4-822a-b340a9bdabfd\" (UID: \"4531578e-feac-48d4-822a-b340a9bdabfd\") " Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.425776 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4531578e-feac-48d4-822a-b340a9bdabfd-logs" (OuterVolumeSpecName: "logs") pod "4531578e-feac-48d4-822a-b340a9bdabfd" (UID: "4531578e-feac-48d4-822a-b340a9bdabfd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.426550 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4531578e-feac-48d4-822a-b340a9bdabfd-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.429812 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4531578e-feac-48d4-822a-b340a9bdabfd-kube-api-access-gqmxl" (OuterVolumeSpecName: "kube-api-access-gqmxl") pod "4531578e-feac-48d4-822a-b340a9bdabfd" (UID: "4531578e-feac-48d4-822a-b340a9bdabfd"). InnerVolumeSpecName "kube-api-access-gqmxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.451359 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff0c5ddb-a9df-4215-898c-79bde7eb584a" path="/var/lib/kubelet/pods/ff0c5ddb-a9df-4215-898c-79bde7eb584a/volumes" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.462363 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4531578e-feac-48d4-822a-b340a9bdabfd" (UID: "4531578e-feac-48d4-822a-b340a9bdabfd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.463214 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-config-data" (OuterVolumeSpecName: "config-data") pod "4531578e-feac-48d4-822a-b340a9bdabfd" (UID: "4531578e-feac-48d4-822a-b340a9bdabfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.472206 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4531578e-feac-48d4-822a-b340a9bdabfd" (UID: "4531578e-feac-48d4-822a-b340a9bdabfd"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.511679 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.530549 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.530584 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.530597 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4531578e-feac-48d4-822a-b340a9bdabfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 
23:06:38.530608 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqmxl\" (UniqueName: \"kubernetes.io/projected/4531578e-feac-48d4-822a-b340a9bdabfd-kube-api-access-gqmxl\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.580247 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67b8c4bf-4bp95"] Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.580478 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" podUID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" containerName="dnsmasq-dns" containerID="cri-o://9c73ee3210d40c8ff9c5ab8857eab106682b16a8894abccfc314ae4a027f0f83" gracePeriod=10 Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.762539 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07425903-5e50-4567-ae7b-d87b890ff93a","Type":"ContainerStarted","Data":"93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c"} Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.762892 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07425903-5e50-4567-ae7b-d87b890ff93a","Type":"ContainerStarted","Data":"1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328"} Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.767380 4771 generic.go:334] "Generic (PLEG): container finished" podID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" containerID="9c73ee3210d40c8ff9c5ab8857eab106682b16a8894abccfc314ae4a027f0f83" exitCode=0 Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.767429 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" event={"ID":"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab","Type":"ContainerDied","Data":"9c73ee3210d40c8ff9c5ab8857eab106682b16a8894abccfc314ae4a027f0f83"} Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.768730 4771 
generic.go:334] "Generic (PLEG): container finished" podID="4531578e-feac-48d4-822a-b340a9bdabfd" containerID="44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758" exitCode=0 Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.768750 4771 generic.go:334] "Generic (PLEG): container finished" podID="4531578e-feac-48d4-822a-b340a9bdabfd" containerID="b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e" exitCode=143 Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.768764 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4531578e-feac-48d4-822a-b340a9bdabfd","Type":"ContainerDied","Data":"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758"} Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.768779 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4531578e-feac-48d4-822a-b340a9bdabfd","Type":"ContainerDied","Data":"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e"} Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.768788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4531578e-feac-48d4-822a-b340a9bdabfd","Type":"ContainerDied","Data":"ef8a78322f075b6f1cfadd018430dd22fb016e108ded345298816ae05b7c1650"} Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.768803 4771 scope.go:117] "RemoveContainer" containerID="44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.768908 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.789597 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.789575941 podStartE2EDuration="2.789575941s" podCreationTimestamp="2026-02-19 23:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:38.776089951 +0000 UTC m=+5899.047532441" watchObservedRunningTime="2026-02-19 23:06:38.789575941 +0000 UTC m=+5899.061018421" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.822221 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.830907 4771 scope.go:117] "RemoveContainer" containerID="b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.843109 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.852477 4771 scope.go:117] "RemoveContainer" containerID="44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758" Feb 19 23:06:38 crc kubenswrapper[4771]: E0219 23:06:38.852775 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758\": container with ID starting with 44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758 not found: ID does not exist" containerID="44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.852809 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758"} err="failed to get container 
status \"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758\": rpc error: code = NotFound desc = could not find container \"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758\": container with ID starting with 44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758 not found: ID does not exist" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.852830 4771 scope.go:117] "RemoveContainer" containerID="b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e" Feb 19 23:06:38 crc kubenswrapper[4771]: E0219 23:06:38.853103 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e\": container with ID starting with b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e not found: ID does not exist" containerID="b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.853242 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e"} err="failed to get container status \"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e\": rpc error: code = NotFound desc = could not find container \"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e\": container with ID starting with b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e not found: ID does not exist" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.853260 4771 scope.go:117] "RemoveContainer" containerID="44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.853440 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758"} err="failed to get 
container status \"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758\": rpc error: code = NotFound desc = could not find container \"44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758\": container with ID starting with 44aae7b7dc95d3c96a48207f5768cecfbb7e963abb2b9b155e53cfa4018d9758 not found: ID does not exist" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.853459 4771 scope.go:117] "RemoveContainer" containerID="b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.853698 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e"} err="failed to get container status \"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e\": rpc error: code = NotFound desc = could not find container \"b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e\": container with ID starting with b4f3c223c5a02f505e7a695e66c162e13d70434ef5559f47bcfd0057bfe0b71e not found: ID does not exist" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.857431 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:38 crc kubenswrapper[4771]: E0219 23:06:38.858786 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4531578e-feac-48d4-822a-b340a9bdabfd" containerName="nova-metadata-log" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.858805 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4531578e-feac-48d4-822a-b340a9bdabfd" containerName="nova-metadata-log" Feb 19 23:06:38 crc kubenswrapper[4771]: E0219 23:06:38.858820 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4531578e-feac-48d4-822a-b340a9bdabfd" containerName="nova-metadata-metadata" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.858827 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4531578e-feac-48d4-822a-b340a9bdabfd" containerName="nova-metadata-metadata" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.859005 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4531578e-feac-48d4-822a-b340a9bdabfd" containerName="nova-metadata-metadata" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.859049 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4531578e-feac-48d4-822a-b340a9bdabfd" containerName="nova-metadata-log" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.862538 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.864507 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.864566 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.865300 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 23:06:38 crc kubenswrapper[4771]: I0219 23:06:38.975259 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.057746 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-dns-svc\") pod \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.057805 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-config\") pod \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.057867 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bddnf\" (UniqueName: \"kubernetes.io/projected/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-kube-api-access-bddnf\") pod \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.057934 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-nb\") pod \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.057967 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-sb\") pod \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\" (UID: \"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab\") " Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.058247 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.058282 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ssd\" (UniqueName: \"kubernetes.io/projected/eee4b4da-7c50-4405-8d6a-0d2e13e66379-kube-api-access-28ssd\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.058338 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.058413 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-config-data\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.058456 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee4b4da-7c50-4405-8d6a-0d2e13e66379-logs\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.070592 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-kube-api-access-bddnf" (OuterVolumeSpecName: 
"kube-api-access-bddnf") pod "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" (UID: "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab"). InnerVolumeSpecName "kube-api-access-bddnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.128587 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.161847 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee4b4da-7c50-4405-8d6a-0d2e13e66379-logs\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.162131 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.162157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ssd\" (UniqueName: \"kubernetes.io/projected/eee4b4da-7c50-4405-8d6a-0d2e13e66379-kube-api-access-28ssd\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.162217 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.162299 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-config-data\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.165766 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee4b4da-7c50-4405-8d6a-0d2e13e66379-logs\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.173010 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bddnf\" (UniqueName: \"kubernetes.io/projected/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-kube-api-access-bddnf\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.174494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.179969 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-config-data\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.190605 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.201572 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-28ssd\" (UniqueName: \"kubernetes.io/projected/eee4b4da-7c50-4405-8d6a-0d2e13e66379-kube-api-access-28ssd\") pod \"nova-metadata-0\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.222855 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" (UID: "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.232989 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" (UID: "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.237361 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" (UID: "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.248839 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-config" (OuterVolumeSpecName: "config") pod "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" (UID: "8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.274917 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.274952 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.274962 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.274971 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.478403 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.780530 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" event={"ID":"8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab","Type":"ContainerDied","Data":"a6f72eb8ca94d463fb681e4c21f2b45c0fd80c488e2f1128926cb51f902f9a66"} Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.780747 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67b8c4bf-4bp95" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.780789 4771 scope.go:117] "RemoveContainer" containerID="9c73ee3210d40c8ff9c5ab8857eab106682b16a8894abccfc314ae4a027f0f83" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.803707 4771 scope.go:117] "RemoveContainer" containerID="c7585a4e8145ee332a0b456cd2819d056bbaf97fa085a2ffd4096d213081f4dd" Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.819808 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67b8c4bf-4bp95"] Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.829424 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c67b8c4bf-4bp95"] Feb 19 23:06:39 crc kubenswrapper[4771]: I0219 23:06:39.967320 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:40 crc kubenswrapper[4771]: I0219 23:06:40.143441 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 23:06:40 crc kubenswrapper[4771]: I0219 23:06:40.449092 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4531578e-feac-48d4-822a-b340a9bdabfd" path="/var/lib/kubelet/pods/4531578e-feac-48d4-822a-b340a9bdabfd/volumes" Feb 19 23:06:40 crc kubenswrapper[4771]: I0219 23:06:40.449937 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" path="/var/lib/kubelet/pods/8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab/volumes" Feb 19 23:06:40 crc kubenswrapper[4771]: I0219 23:06:40.797725 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee4b4da-7c50-4405-8d6a-0d2e13e66379","Type":"ContainerStarted","Data":"c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22"} Feb 19 23:06:40 crc kubenswrapper[4771]: I0219 23:06:40.798083 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-metadata-0" event={"ID":"eee4b4da-7c50-4405-8d6a-0d2e13e66379","Type":"ContainerStarted","Data":"8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8"} Feb 19 23:06:40 crc kubenswrapper[4771]: I0219 23:06:40.798105 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee4b4da-7c50-4405-8d6a-0d2e13e66379","Type":"ContainerStarted","Data":"9d81c66bee649e4653c955302f14d8804d4bdb7d640ffdf95014849d7ba5de46"} Feb 19 23:06:40 crc kubenswrapper[4771]: I0219 23:06:40.840363 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8403433099999997 podStartE2EDuration="2.84034331s" podCreationTimestamp="2026-02-19 23:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:40.83321924 +0000 UTC m=+5901.104661740" watchObservedRunningTime="2026-02-19 23:06:40.84034331 +0000 UTC m=+5901.111785790" Feb 19 23:06:44 crc kubenswrapper[4771]: I0219 23:06:44.128049 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:44 crc kubenswrapper[4771]: I0219 23:06:44.169578 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:44 crc kubenswrapper[4771]: I0219 23:06:44.478588 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:06:44 crc kubenswrapper[4771]: I0219 23:06:44.480072 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:06:44 crc kubenswrapper[4771]: I0219 23:06:44.877146 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.148941 4771 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-knpb5"] Feb 19 23:06:45 crc kubenswrapper[4771]: E0219 23:06:45.149525 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" containerName="dnsmasq-dns" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.149538 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" containerName="dnsmasq-dns" Feb 19 23:06:45 crc kubenswrapper[4771]: E0219 23:06:45.149558 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" containerName="init" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.149564 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" containerName="init" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.149736 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cebe6b2-8fbd-4efe-b58c-90d76b5b23ab" containerName="dnsmasq-dns" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.150494 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.152763 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.154892 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.169506 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-knpb5"] Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.211574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-scripts\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.211666 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwqd\" (UniqueName: \"kubernetes.io/projected/de417c6b-799d-41a8-8c43-795e1868759e-kube-api-access-9fwqd\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.211818 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.211876 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-config-data\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.314013 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwqd\" (UniqueName: \"kubernetes.io/projected/de417c6b-799d-41a8-8c43-795e1868759e-kube-api-access-9fwqd\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.314236 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.314321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-config-data\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.314633 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-scripts\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.322430 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-scripts\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.324804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.324918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-config-data\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.337683 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwqd\" (UniqueName: \"kubernetes.io/projected/de417c6b-799d-41a8-8c43-795e1868759e-kube-api-access-9fwqd\") pod \"nova-cell1-cell-mapping-knpb5\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.476843 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:45 crc kubenswrapper[4771]: I0219 23:06:45.975677 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-knpb5"] Feb 19 23:06:46 crc kubenswrapper[4771]: I0219 23:06:46.867553 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-knpb5" event={"ID":"de417c6b-799d-41a8-8c43-795e1868759e","Type":"ContainerStarted","Data":"a37c6fd0a20e5c0d77c5b847b38f566c682d7b51929be901c12c4f7a2f3910ae"} Feb 19 23:06:46 crc kubenswrapper[4771]: I0219 23:06:46.867654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-knpb5" event={"ID":"de417c6b-799d-41a8-8c43-795e1868759e","Type":"ContainerStarted","Data":"7884d5a0ebc9d339082e6b1748d2665b3cd51dcaa338cbb7d042952e357f2a56"} Feb 19 23:06:46 crc kubenswrapper[4771]: I0219 23:06:46.894795 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-knpb5" podStartSLOduration=1.894728947 podStartE2EDuration="1.894728947s" podCreationTimestamp="2026-02-19 23:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:46.891005887 +0000 UTC m=+5907.162448417" watchObservedRunningTime="2026-02-19 23:06:46.894728947 +0000 UTC m=+5907.166171457" Feb 19 23:06:47 crc kubenswrapper[4771]: I0219 23:06:47.154982 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:06:47 crc kubenswrapper[4771]: I0219 23:06:47.155495 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:06:48 crc kubenswrapper[4771]: I0219 23:06:48.237248 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-api" 
probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:06:48 crc kubenswrapper[4771]: I0219 23:06:48.237529 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.100:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:06:49 crc kubenswrapper[4771]: I0219 23:06:49.479650 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:06:49 crc kubenswrapper[4771]: I0219 23:06:49.479994 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:06:50 crc kubenswrapper[4771]: I0219 23:06:50.499221 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.101:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 23:06:50 crc kubenswrapper[4771]: I0219 23:06:50.499512 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.101:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 23:06:50 crc kubenswrapper[4771]: I0219 23:06:50.913621 4771 generic.go:334] "Generic (PLEG): container finished" podID="de417c6b-799d-41a8-8c43-795e1868759e" containerID="a37c6fd0a20e5c0d77c5b847b38f566c682d7b51929be901c12c4f7a2f3910ae" exitCode=0 Feb 19 23:06:50 crc kubenswrapper[4771]: I0219 23:06:50.913839 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-knpb5" event={"ID":"de417c6b-799d-41a8-8c43-795e1868759e","Type":"ContainerDied","Data":"a37c6fd0a20e5c0d77c5b847b38f566c682d7b51929be901c12c4f7a2f3910ae"} Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.354672 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.378274 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-config-data\") pod \"de417c6b-799d-41a8-8c43-795e1868759e\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.378410 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwqd\" (UniqueName: \"kubernetes.io/projected/de417c6b-799d-41a8-8c43-795e1868759e-kube-api-access-9fwqd\") pod \"de417c6b-799d-41a8-8c43-795e1868759e\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.378437 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-combined-ca-bundle\") pod \"de417c6b-799d-41a8-8c43-795e1868759e\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.378477 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-scripts\") pod \"de417c6b-799d-41a8-8c43-795e1868759e\" (UID: \"de417c6b-799d-41a8-8c43-795e1868759e\") " Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.385640 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/de417c6b-799d-41a8-8c43-795e1868759e-kube-api-access-9fwqd" (OuterVolumeSpecName: "kube-api-access-9fwqd") pod "de417c6b-799d-41a8-8c43-795e1868759e" (UID: "de417c6b-799d-41a8-8c43-795e1868759e"). InnerVolumeSpecName "kube-api-access-9fwqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.398035 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-scripts" (OuterVolumeSpecName: "scripts") pod "de417c6b-799d-41a8-8c43-795e1868759e" (UID: "de417c6b-799d-41a8-8c43-795e1868759e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.415523 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de417c6b-799d-41a8-8c43-795e1868759e" (UID: "de417c6b-799d-41a8-8c43-795e1868759e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.430961 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-config-data" (OuterVolumeSpecName: "config-data") pod "de417c6b-799d-41a8-8c43-795e1868759e" (UID: "de417c6b-799d-41a8-8c43-795e1868759e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.481759 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwqd\" (UniqueName: \"kubernetes.io/projected/de417c6b-799d-41a8-8c43-795e1868759e-kube-api-access-9fwqd\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.481796 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.481808 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.481820 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de417c6b-799d-41a8-8c43-795e1868759e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.939684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-knpb5" event={"ID":"de417c6b-799d-41a8-8c43-795e1868759e","Type":"ContainerDied","Data":"7884d5a0ebc9d339082e6b1748d2665b3cd51dcaa338cbb7d042952e357f2a56"} Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.940003 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7884d5a0ebc9d339082e6b1748d2665b3cd51dcaa338cbb7d042952e357f2a56" Feb 19 23:06:52 crc kubenswrapper[4771]: I0219 23:06:52.939740 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-knpb5" Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.201254 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.201889 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-log" containerID="cri-o://1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328" gracePeriod=30 Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.202010 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-api" containerID="cri-o://93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c" gracePeriod=30 Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.236862 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.237565 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-log" containerID="cri-o://8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8" gracePeriod=30 Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.237671 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-metadata" containerID="cri-o://c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22" gracePeriod=30 Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.954721 4771 generic.go:334] "Generic (PLEG): container finished" podID="07425903-5e50-4567-ae7b-d87b890ff93a" 
containerID="1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328" exitCode=143 Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.954853 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07425903-5e50-4567-ae7b-d87b890ff93a","Type":"ContainerDied","Data":"1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328"} Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.958504 4771 generic.go:334] "Generic (PLEG): container finished" podID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerID="8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8" exitCode=143 Feb 19 23:06:53 crc kubenswrapper[4771]: I0219 23:06:53.958579 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee4b4da-7c50-4405-8d6a-0d2e13e66379","Type":"ContainerDied","Data":"8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8"} Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.909716 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.982836 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-combined-ca-bundle\") pod \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.983199 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee4b4da-7c50-4405-8d6a-0d2e13e66379-logs\") pod \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.983692 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee4b4da-7c50-4405-8d6a-0d2e13e66379-logs" (OuterVolumeSpecName: "logs") pod "eee4b4da-7c50-4405-8d6a-0d2e13e66379" (UID: "eee4b4da-7c50-4405-8d6a-0d2e13e66379"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.983943 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28ssd\" (UniqueName: \"kubernetes.io/projected/eee4b4da-7c50-4405-8d6a-0d2e13e66379-kube-api-access-28ssd\") pod \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.984112 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-config-data\") pod \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.984482 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-nova-metadata-tls-certs\") pod \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\" (UID: \"eee4b4da-7c50-4405-8d6a-0d2e13e66379\") " Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.985106 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee4b4da-7c50-4405-8d6a-0d2e13e66379-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.989676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee4b4da-7c50-4405-8d6a-0d2e13e66379-kube-api-access-28ssd" (OuterVolumeSpecName: "kube-api-access-28ssd") pod "eee4b4da-7c50-4405-8d6a-0d2e13e66379" (UID: "eee4b4da-7c50-4405-8d6a-0d2e13e66379"). InnerVolumeSpecName "kube-api-access-28ssd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.993746 4771 generic.go:334] "Generic (PLEG): container finished" podID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerID="c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22" exitCode=0 Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.993801 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee4b4da-7c50-4405-8d6a-0d2e13e66379","Type":"ContainerDied","Data":"c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22"} Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.993845 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.993884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eee4b4da-7c50-4405-8d6a-0d2e13e66379","Type":"ContainerDied","Data":"9d81c66bee649e4653c955302f14d8804d4bdb7d640ffdf95014849d7ba5de46"} Feb 19 23:06:56 crc kubenswrapper[4771]: I0219 23:06:56.993914 4771 scope.go:117] "RemoveContainer" containerID="c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.009784 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-config-data" (OuterVolumeSpecName: "config-data") pod "eee4b4da-7c50-4405-8d6a-0d2e13e66379" (UID: "eee4b4da-7c50-4405-8d6a-0d2e13e66379"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.019530 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eee4b4da-7c50-4405-8d6a-0d2e13e66379" (UID: "eee4b4da-7c50-4405-8d6a-0d2e13e66379"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.032949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "eee4b4da-7c50-4405-8d6a-0d2e13e66379" (UID: "eee4b4da-7c50-4405-8d6a-0d2e13e66379"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.087166 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.087206 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.087242 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28ssd\" (UniqueName: \"kubernetes.io/projected/eee4b4da-7c50-4405-8d6a-0d2e13e66379-kube-api-access-28ssd\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.087253 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eee4b4da-7c50-4405-8d6a-0d2e13e66379-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.093711 4771 scope.go:117] "RemoveContainer" containerID="8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.116995 4771 scope.go:117] "RemoveContainer" containerID="c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22" Feb 19 23:06:57 crc kubenswrapper[4771]: E0219 23:06:57.117431 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22\": container with ID starting with c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22 not found: ID does not exist" containerID="c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.117460 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22"} err="failed to get container status \"c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22\": rpc error: code = NotFound desc = could not find container \"c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22\": container with ID starting with c36c23eadc41483b126e5c0e0183e4a22efdfee4954a01a28aaadbf58c902c22 not found: ID does not exist" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.117481 4771 scope.go:117] "RemoveContainer" containerID="8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8" Feb 19 23:06:57 crc kubenswrapper[4771]: E0219 23:06:57.117747 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8\": container with ID starting with 
8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8 not found: ID does not exist" containerID="8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.117792 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8"} err="failed to get container status \"8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8\": rpc error: code = NotFound desc = could not find container \"8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8\": container with ID starting with 8bc46b66eb8d6af3c56d25b0c0157921b9ff162f09498cb5178b26ac9ba523d8 not found: ID does not exist" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.342350 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.361595 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.372151 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:57 crc kubenswrapper[4771]: E0219 23:06:57.372778 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-log" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.372809 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-log" Feb 19 23:06:57 crc kubenswrapper[4771]: E0219 23:06:57.372846 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de417c6b-799d-41a8-8c43-795e1868759e" containerName="nova-manage" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.372859 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="de417c6b-799d-41a8-8c43-795e1868759e" 
containerName="nova-manage" Feb 19 23:06:57 crc kubenswrapper[4771]: E0219 23:06:57.372891 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-metadata" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.372903 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-metadata" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.373264 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-metadata" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.373297 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" containerName="nova-metadata-log" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.373319 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="de417c6b-799d-41a8-8c43-795e1868759e" containerName="nova-manage" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.374935 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.378505 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.378758 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.380737 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.518739 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-logs\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.518798 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9xz\" (UniqueName: \"kubernetes.io/projected/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-kube-api-access-dl9xz\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.518838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.518963 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.518999 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-config-data\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.620474 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.620850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-config-data\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.622202 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-logs\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.622286 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9xz\" (UniqueName: \"kubernetes.io/projected/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-kube-api-access-dl9xz\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.622995 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.622905 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-logs\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.625875 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-config-data\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.628139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.629102 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.647118 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9xz\" (UniqueName: \"kubernetes.io/projected/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-kube-api-access-dl9xz\") pod 
\"nova-metadata-0\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " pod="openstack/nova-metadata-0" Feb 19 23:06:57 crc kubenswrapper[4771]: I0219 23:06:57.726675 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:06:58 crc kubenswrapper[4771]: W0219 23:06:58.210053 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2c0051f_548d_4a65_b4c1_1c7ccaa3ae81.slice/crio-ea78ba87e6c4e734ea3dcaa663905d09b4454b40fc80062194ae696d7b848117 WatchSource:0}: Error finding container ea78ba87e6c4e734ea3dcaa663905d09b4454b40fc80062194ae696d7b848117: Status 404 returned error can't find the container with id ea78ba87e6c4e734ea3dcaa663905d09b4454b40fc80062194ae696d7b848117 Feb 19 23:06:58 crc kubenswrapper[4771]: I0219 23:06:58.218052 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:06:58 crc kubenswrapper[4771]: I0219 23:06:58.462973 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee4b4da-7c50-4405-8d6a-0d2e13e66379" path="/var/lib/kubelet/pods/eee4b4da-7c50-4405-8d6a-0d2e13e66379/volumes" Feb 19 23:06:59 crc kubenswrapper[4771]: I0219 23:06:59.015745 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81","Type":"ContainerStarted","Data":"ad2946ab4ef001da23936de729acc2e1f8d7a36e735134c565d78c56c9146957"} Feb 19 23:06:59 crc kubenswrapper[4771]: I0219 23:06:59.015792 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81","Type":"ContainerStarted","Data":"48e8dc4d58568b5589e042f4e32c0eb8d77975bd450146d13402de9361ed4871"} Feb 19 23:06:59 crc kubenswrapper[4771]: I0219 23:06:59.015807 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81","Type":"ContainerStarted","Data":"ea78ba87e6c4e734ea3dcaa663905d09b4454b40fc80062194ae696d7b848117"} Feb 19 23:06:59 crc kubenswrapper[4771]: I0219 23:06:59.038985 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.038953074 podStartE2EDuration="2.038953074s" podCreationTimestamp="2026-02-19 23:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:59.029975515 +0000 UTC m=+5919.301418005" watchObservedRunningTime="2026-02-19 23:06:59.038953074 +0000 UTC m=+5919.310395544" Feb 19 23:07:02 crc kubenswrapper[4771]: I0219 23:07:02.726916 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:07:02 crc kubenswrapper[4771]: I0219 23:07:02.727573 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.210926 4771 generic.go:334] "Generic (PLEG): container finished" podID="f83e63be-3e95-41da-9028-37602da9626f" containerID="42a349932485cdcd94ce9f869f64908efb19df8aaf7e2b9940e35cb36932e98d" exitCode=137 Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.211122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f83e63be-3e95-41da-9028-37602da9626f","Type":"ContainerDied","Data":"42a349932485cdcd94ce9f869f64908efb19df8aaf7e2b9940e35cb36932e98d"} Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.464121 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.541903 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-config-data\") pod \"f83e63be-3e95-41da-9028-37602da9626f\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.541968 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs64t\" (UniqueName: \"kubernetes.io/projected/f83e63be-3e95-41da-9028-37602da9626f-kube-api-access-gs64t\") pod \"f83e63be-3e95-41da-9028-37602da9626f\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.541991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-combined-ca-bundle\") pod \"f83e63be-3e95-41da-9028-37602da9626f\" (UID: \"f83e63be-3e95-41da-9028-37602da9626f\") " Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.561263 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83e63be-3e95-41da-9028-37602da9626f-kube-api-access-gs64t" (OuterVolumeSpecName: "kube-api-access-gs64t") pod "f83e63be-3e95-41da-9028-37602da9626f" (UID: "f83e63be-3e95-41da-9028-37602da9626f"). InnerVolumeSpecName "kube-api-access-gs64t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.568615 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-config-data" (OuterVolumeSpecName: "config-data") pod "f83e63be-3e95-41da-9028-37602da9626f" (UID: "f83e63be-3e95-41da-9028-37602da9626f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.585965 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f83e63be-3e95-41da-9028-37602da9626f" (UID: "f83e63be-3e95-41da-9028-37602da9626f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.644213 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.644249 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs64t\" (UniqueName: \"kubernetes.io/projected/f83e63be-3e95-41da-9028-37602da9626f-kube-api-access-gs64t\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.644260 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e63be-3e95-41da-9028-37602da9626f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:06 crc kubenswrapper[4771]: I0219 23:07:06.893396 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.050344 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-config-data\") pod \"07425903-5e50-4567-ae7b-d87b890ff93a\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.050418 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-combined-ca-bundle\") pod \"07425903-5e50-4567-ae7b-d87b890ff93a\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.050486 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mwnm\" (UniqueName: \"kubernetes.io/projected/07425903-5e50-4567-ae7b-d87b890ff93a-kube-api-access-8mwnm\") pod \"07425903-5e50-4567-ae7b-d87b890ff93a\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.050589 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07425903-5e50-4567-ae7b-d87b890ff93a-logs\") pod \"07425903-5e50-4567-ae7b-d87b890ff93a\" (UID: \"07425903-5e50-4567-ae7b-d87b890ff93a\") " Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.051650 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07425903-5e50-4567-ae7b-d87b890ff93a-logs" (OuterVolumeSpecName: "logs") pod "07425903-5e50-4567-ae7b-d87b890ff93a" (UID: "07425903-5e50-4567-ae7b-d87b890ff93a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.058519 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07425903-5e50-4567-ae7b-d87b890ff93a-kube-api-access-8mwnm" (OuterVolumeSpecName: "kube-api-access-8mwnm") pod "07425903-5e50-4567-ae7b-d87b890ff93a" (UID: "07425903-5e50-4567-ae7b-d87b890ff93a"). InnerVolumeSpecName "kube-api-access-8mwnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.080474 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-config-data" (OuterVolumeSpecName: "config-data") pod "07425903-5e50-4567-ae7b-d87b890ff93a" (UID: "07425903-5e50-4567-ae7b-d87b890ff93a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.083904 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07425903-5e50-4567-ae7b-d87b890ff93a" (UID: "07425903-5e50-4567-ae7b-d87b890ff93a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.152559 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07425903-5e50-4567-ae7b-d87b890ff93a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.152815 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.153189 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07425903-5e50-4567-ae7b-d87b890ff93a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.153349 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mwnm\" (UniqueName: \"kubernetes.io/projected/07425903-5e50-4567-ae7b-d87b890ff93a-kube-api-access-8mwnm\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.227362 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.227353 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f83e63be-3e95-41da-9028-37602da9626f","Type":"ContainerDied","Data":"9a6c3ab94f1e98f9bdc6e19499f3d278ced29110a9806eb559aa2889529952f3"} Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.228704 4771 scope.go:117] "RemoveContainer" containerID="42a349932485cdcd94ce9f869f64908efb19df8aaf7e2b9940e35cb36932e98d" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.229593 4771 generic.go:334] "Generic (PLEG): container finished" podID="07425903-5e50-4567-ae7b-d87b890ff93a" containerID="93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c" exitCode=0 Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.229662 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07425903-5e50-4567-ae7b-d87b890ff93a","Type":"ContainerDied","Data":"93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c"} Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.229707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07425903-5e50-4567-ae7b-d87b890ff93a","Type":"ContainerDied","Data":"dc104dc61320aa360153291ca04f17e33693fa2aa7604ee0e71eb30868b62d26"} Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.229796 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.254619 4771 scope.go:117] "RemoveContainer" containerID="93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.311370 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.327545 4771 scope.go:117] "RemoveContainer" containerID="1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.331845 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.348615 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.368663 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.380671 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:07:07 crc kubenswrapper[4771]: E0219 23:07:07.382805 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-log" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.382836 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-log" Feb 19 23:07:07 crc kubenswrapper[4771]: E0219 23:07:07.382869 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e63be-3e95-41da-9028-37602da9626f" containerName="nova-scheduler-scheduler" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.382880 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e63be-3e95-41da-9028-37602da9626f" containerName="nova-scheduler-scheduler" Feb 19 23:07:07 crc 
kubenswrapper[4771]: E0219 23:07:07.382910 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-api" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.382916 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-api" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.383158 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-log" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.383187 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" containerName="nova-api-api" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.383197 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83e63be-3e95-41da-9028-37602da9626f" containerName="nova-scheduler-scheduler" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.384132 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.387404 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.391770 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.394211 4771 scope.go:117] "RemoveContainer" containerID="93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.394560 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: E0219 23:07:07.395812 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c\": container with ID starting with 93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c not found: ID does not exist" containerID="93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.395907 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c"} err="failed to get container status \"93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c\": rpc error: code = NotFound desc = could not find container \"93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c\": container with ID starting with 93a7e296a0c6c0b633d4eeb88ee27aa3663542ca75065035e14ecf8ec0c2011c not found: ID does not exist" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.395999 4771 scope.go:117] "RemoveContainer" containerID="1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.399346 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:07:07 crc kubenswrapper[4771]: E0219 23:07:07.400188 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328\": container with ID starting with 1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328 not found: ID does not exist" containerID="1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.400235 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328"} err="failed to get container status \"1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328\": rpc error: code = NotFound desc = could not find container \"1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328\": container with ID starting with 1993fdf19ef9c3826b939cbc1c22a7328ce28c848d60f5b8e61b0329d2cee328 not found: ID does not exist" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.409571 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.425491 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.460661 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.460730 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-config-data\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.460773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-config-data\") pod \"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 
23:07:07.460814 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e375a6-999c-4272-a25c-a8b87239095d-logs\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.460863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlrsd\" (UniqueName: \"kubernetes.io/projected/069fb38a-cbd9-4bb9-8441-682d97292af1-kube-api-access-vlrsd\") pod \"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.460886 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsphl\" (UniqueName: \"kubernetes.io/projected/96e375a6-999c-4272-a25c-a8b87239095d-kube-api-access-vsphl\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.460930 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.563224 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e375a6-999c-4272-a25c-a8b87239095d-logs\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.563429 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlrsd\" (UniqueName: 
\"kubernetes.io/projected/069fb38a-cbd9-4bb9-8441-682d97292af1-kube-api-access-vlrsd\") pod \"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.564114 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsphl\" (UniqueName: \"kubernetes.io/projected/96e375a6-999c-4272-a25c-a8b87239095d-kube-api-access-vsphl\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.563941 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e375a6-999c-4272-a25c-a8b87239095d-logs\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.564367 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.564959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.565042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-config-data\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc 
kubenswrapper[4771]: I0219 23:07:07.565122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-config-data\") pod \"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.570180 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-config-data\") pod \"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.570201 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-config-data\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.572646 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.578227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.582613 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlrsd\" (UniqueName: \"kubernetes.io/projected/069fb38a-cbd9-4bb9-8441-682d97292af1-kube-api-access-vlrsd\") pod 
\"nova-scheduler-0\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") " pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.587692 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsphl\" (UniqueName: \"kubernetes.io/projected/96e375a6-999c-4272-a25c-a8b87239095d-kube-api-access-vsphl\") pod \"nova-api-0\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.705312 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.719047 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.727701 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:07:07 crc kubenswrapper[4771]: I0219 23:07:07.728569 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:07:08 crc kubenswrapper[4771]: W0219 23:07:08.199976 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96e375a6_999c_4272_a25c_a8b87239095d.slice/crio-74245bbcd69b700d1a2d284a95f8d8b3f61c42eeaf242ab5bc1c07b118c70260 WatchSource:0}: Error finding container 74245bbcd69b700d1a2d284a95f8d8b3f61c42eeaf242ab5bc1c07b118c70260: Status 404 returned error can't find the container with id 74245bbcd69b700d1a2d284a95f8d8b3f61c42eeaf242ab5bc1c07b118c70260 Feb 19 23:07:08 crc kubenswrapper[4771]: I0219 23:07:08.215981 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:08 crc kubenswrapper[4771]: I0219 23:07:08.239782 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"96e375a6-999c-4272-a25c-a8b87239095d","Type":"ContainerStarted","Data":"74245bbcd69b700d1a2d284a95f8d8b3f61c42eeaf242ab5bc1c07b118c70260"} Feb 19 23:07:08 crc kubenswrapper[4771]: W0219 23:07:08.349257 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod069fb38a_cbd9_4bb9_8441_682d97292af1.slice/crio-33eed73d7be7a0e3e3e7d8a1c0c32eacfe66ccc7860fb13bbe489a57908588ef WatchSource:0}: Error finding container 33eed73d7be7a0e3e3e7d8a1c0c32eacfe66ccc7860fb13bbe489a57908588ef: Status 404 returned error can't find the container with id 33eed73d7be7a0e3e3e7d8a1c0c32eacfe66ccc7860fb13bbe489a57908588ef Feb 19 23:07:08 crc kubenswrapper[4771]: I0219 23:07:08.353847 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:07:08 crc kubenswrapper[4771]: I0219 23:07:08.455495 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07425903-5e50-4567-ae7b-d87b890ff93a" path="/var/lib/kubelet/pods/07425903-5e50-4567-ae7b-d87b890ff93a/volumes" Feb 19 23:07:08 crc kubenswrapper[4771]: I0219 23:07:08.457365 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f83e63be-3e95-41da-9028-37602da9626f" path="/var/lib/kubelet/pods/f83e63be-3e95-41da-9028-37602da9626f/volumes" Feb 19 23:07:08 crc kubenswrapper[4771]: I0219 23:07:08.745216 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.103:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 23:07:08 crc kubenswrapper[4771]: I0219 23:07:08.745229 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"https://10.217.1.103:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 23:07:09 crc kubenswrapper[4771]: I0219 23:07:09.252857 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96e375a6-999c-4272-a25c-a8b87239095d","Type":"ContainerStarted","Data":"9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648"} Feb 19 23:07:09 crc kubenswrapper[4771]: I0219 23:07:09.252919 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96e375a6-999c-4272-a25c-a8b87239095d","Type":"ContainerStarted","Data":"a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286"} Feb 19 23:07:09 crc kubenswrapper[4771]: I0219 23:07:09.281689 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"069fb38a-cbd9-4bb9-8441-682d97292af1","Type":"ContainerStarted","Data":"ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed"} Feb 19 23:07:09 crc kubenswrapper[4771]: I0219 23:07:09.281987 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"069fb38a-cbd9-4bb9-8441-682d97292af1","Type":"ContainerStarted","Data":"33eed73d7be7a0e3e3e7d8a1c0c32eacfe66ccc7860fb13bbe489a57908588ef"} Feb 19 23:07:09 crc kubenswrapper[4771]: I0219 23:07:09.307315 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.307297068 podStartE2EDuration="2.307297068s" podCreationTimestamp="2026-02-19 23:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:07:09.297794855 +0000 UTC m=+5929.569237335" watchObservedRunningTime="2026-02-19 23:07:09.307297068 +0000 UTC m=+5929.578739528" Feb 19 23:07:09 crc kubenswrapper[4771]: I0219 23:07:09.348385 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.348361824 podStartE2EDuration="2.348361824s" podCreationTimestamp="2026-02-19 23:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:07:09.340431573 +0000 UTC m=+5929.611874073" watchObservedRunningTime="2026-02-19 23:07:09.348361824 +0000 UTC m=+5929.619804294" Feb 19 23:07:12 crc kubenswrapper[4771]: I0219 23:07:12.705969 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 23:07:17 crc kubenswrapper[4771]: I0219 23:07:17.705645 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 23:07:17 crc kubenswrapper[4771]: I0219 23:07:17.720500 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:07:17 crc kubenswrapper[4771]: I0219 23:07:17.721145 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:07:17 crc kubenswrapper[4771]: I0219 23:07:17.733724 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:07:17 crc kubenswrapper[4771]: I0219 23:07:17.740452 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:07:17 crc kubenswrapper[4771]: I0219 23:07:17.740592 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:07:17 crc kubenswrapper[4771]: I0219 23:07:17.750269 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 23:07:18 crc kubenswrapper[4771]: I0219 23:07:18.429957 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:07:18 crc kubenswrapper[4771]: I0219 23:07:18.474738 
4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 23:07:18 crc kubenswrapper[4771]: I0219 23:07:18.762265 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.105:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:07:18 crc kubenswrapper[4771]: I0219 23:07:18.803242 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.105:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:07:27 crc kubenswrapper[4771]: I0219 23:07:27.725473 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:07:27 crc kubenswrapper[4771]: I0219 23:07:27.727694 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:07:27 crc kubenswrapper[4771]: I0219 23:07:27.728643 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:07:27 crc kubenswrapper[4771]: I0219 23:07:27.738781 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.524325 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.527317 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.728612 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59f59d5469-nmcdd"] Feb 19 23:07:28 crc 
kubenswrapper[4771]: I0219 23:07:28.730386 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.749523 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59f59d5469-nmcdd"] Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.868746 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns278\" (UniqueName: \"kubernetes.io/projected/4deb3c04-8000-4b51-b811-5fed89da11f9-kube-api-access-ns278\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.868808 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-nb\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.869094 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-sb\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.869738 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-dns-svc\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc 
kubenswrapper[4771]: I0219 23:07:28.869856 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-config\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.971437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns278\" (UniqueName: \"kubernetes.io/projected/4deb3c04-8000-4b51-b811-5fed89da11f9-kube-api-access-ns278\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.971498 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-nb\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.971555 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-sb\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.971594 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-dns-svc\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.971621 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-config\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.972703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-dns-svc\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.972701 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-nb\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.972785 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-sb\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.973238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-config\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:28 crc kubenswrapper[4771]: I0219 23:07:28.994764 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns278\" (UniqueName: 
\"kubernetes.io/projected/4deb3c04-8000-4b51-b811-5fed89da11f9-kube-api-access-ns278\") pod \"dnsmasq-dns-59f59d5469-nmcdd\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") " pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:29 crc kubenswrapper[4771]: I0219 23:07:29.060808 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:29 crc kubenswrapper[4771]: I0219 23:07:29.528574 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59f59d5469-nmcdd"] Feb 19 23:07:30 crc kubenswrapper[4771]: I0219 23:07:30.547641 4771 generic.go:334] "Generic (PLEG): container finished" podID="4deb3c04-8000-4b51-b811-5fed89da11f9" containerID="ee259689c0dd1c5f2195500575a146ea48e5b1ee33a49c370c0f3a578c9777c4" exitCode=0 Feb 19 23:07:30 crc kubenswrapper[4771]: I0219 23:07:30.547731 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" event={"ID":"4deb3c04-8000-4b51-b811-5fed89da11f9","Type":"ContainerDied","Data":"ee259689c0dd1c5f2195500575a146ea48e5b1ee33a49c370c0f3a578c9777c4"} Feb 19 23:07:30 crc kubenswrapper[4771]: I0219 23:07:30.548118 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" event={"ID":"4deb3c04-8000-4b51-b811-5fed89da11f9","Type":"ContainerStarted","Data":"489d8b9e7a5d3b5027f595993b83f403fdd83ad356dae895e14c6c9271139364"} Feb 19 23:07:31 crc kubenswrapper[4771]: I0219 23:07:31.560780 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" event={"ID":"4deb3c04-8000-4b51-b811-5fed89da11f9","Type":"ContainerStarted","Data":"a125b5df89feb4839f77a4bfae4c008ef9e001055736db446ca43ca7673360ce"} Feb 19 23:07:31 crc kubenswrapper[4771]: I0219 23:07:31.561072 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:31 crc kubenswrapper[4771]: I0219 
23:07:31.580759 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" podStartSLOduration=3.580731578 podStartE2EDuration="3.580731578s" podCreationTimestamp="2026-02-19 23:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:07:31.580525843 +0000 UTC m=+5951.851968323" watchObservedRunningTime="2026-02-19 23:07:31.580731578 +0000 UTC m=+5951.852174048" Feb 19 23:07:31 crc kubenswrapper[4771]: I0219 23:07:31.608795 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:31 crc kubenswrapper[4771]: I0219 23:07:31.609097 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-log" containerID="cri-o://a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286" gracePeriod=30 Feb 19 23:07:31 crc kubenswrapper[4771]: I0219 23:07:31.609164 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-api" containerID="cri-o://9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648" gracePeriod=30 Feb 19 23:07:32 crc kubenswrapper[4771]: I0219 23:07:32.569718 4771 generic.go:334] "Generic (PLEG): container finished" podID="96e375a6-999c-4272-a25c-a8b87239095d" containerID="a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286" exitCode=143 Feb 19 23:07:32 crc kubenswrapper[4771]: I0219 23:07:32.569830 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96e375a6-999c-4272-a25c-a8b87239095d","Type":"ContainerDied","Data":"a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286"} Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.248182 4771 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.294705 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e375a6-999c-4272-a25c-a8b87239095d-logs\") pod \"96e375a6-999c-4272-a25c-a8b87239095d\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.294918 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-combined-ca-bundle\") pod \"96e375a6-999c-4272-a25c-a8b87239095d\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.295068 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsphl\" (UniqueName: \"kubernetes.io/projected/96e375a6-999c-4272-a25c-a8b87239095d-kube-api-access-vsphl\") pod \"96e375a6-999c-4272-a25c-a8b87239095d\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.295180 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-config-data\") pod \"96e375a6-999c-4272-a25c-a8b87239095d\" (UID: \"96e375a6-999c-4272-a25c-a8b87239095d\") " Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.306159 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e375a6-999c-4272-a25c-a8b87239095d-logs" (OuterVolumeSpecName: "logs") pod "96e375a6-999c-4272-a25c-a8b87239095d" (UID: "96e375a6-999c-4272-a25c-a8b87239095d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.321615 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e375a6-999c-4272-a25c-a8b87239095d-kube-api-access-vsphl" (OuterVolumeSpecName: "kube-api-access-vsphl") pod "96e375a6-999c-4272-a25c-a8b87239095d" (UID: "96e375a6-999c-4272-a25c-a8b87239095d"). InnerVolumeSpecName "kube-api-access-vsphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.349812 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96e375a6-999c-4272-a25c-a8b87239095d" (UID: "96e375a6-999c-4272-a25c-a8b87239095d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.361398 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-config-data" (OuterVolumeSpecName: "config-data") pod "96e375a6-999c-4272-a25c-a8b87239095d" (UID: "96e375a6-999c-4272-a25c-a8b87239095d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.402160 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e375a6-999c-4272-a25c-a8b87239095d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.402337 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.402395 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsphl\" (UniqueName: \"kubernetes.io/projected/96e375a6-999c-4272-a25c-a8b87239095d-kube-api-access-vsphl\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.402447 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e375a6-999c-4272-a25c-a8b87239095d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.597074 4771 generic.go:334] "Generic (PLEG): container finished" podID="96e375a6-999c-4272-a25c-a8b87239095d" containerID="9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648" exitCode=0 Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.597428 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.597425 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96e375a6-999c-4272-a25c-a8b87239095d","Type":"ContainerDied","Data":"9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648"} Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.597848 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96e375a6-999c-4272-a25c-a8b87239095d","Type":"ContainerDied","Data":"74245bbcd69b700d1a2d284a95f8d8b3f61c42eeaf242ab5bc1c07b118c70260"} Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.597875 4771 scope.go:117] "RemoveContainer" containerID="9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.629209 4771 scope.go:117] "RemoveContainer" containerID="a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.637675 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.654682 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.657530 4771 scope.go:117] "RemoveContainer" containerID="9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648" Feb 19 23:07:35 crc kubenswrapper[4771]: E0219 23:07:35.658473 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648\": container with ID starting with 9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648 not found: ID does not exist" containerID="9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.658693 
4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648"} err="failed to get container status \"9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648\": rpc error: code = NotFound desc = could not find container \"9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648\": container with ID starting with 9f315fa9f599af836b7b907a8a709f51c56795cb354b30d95122411f60edb648 not found: ID does not exist" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.658715 4771 scope.go:117] "RemoveContainer" containerID="a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286" Feb 19 23:07:35 crc kubenswrapper[4771]: E0219 23:07:35.659191 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286\": container with ID starting with a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286 not found: ID does not exist" containerID="a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.659212 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286"} err="failed to get container status \"a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286\": rpc error: code = NotFound desc = could not find container \"a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286\": container with ID starting with a6db1d8c627f1bc9a124338687e05e0a68ccc3cb2751ab7b3266ed61e61a9286 not found: ID does not exist" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.669897 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:35 crc kubenswrapper[4771]: E0219 23:07:35.670559 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-log" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.670633 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-log" Feb 19 23:07:35 crc kubenswrapper[4771]: E0219 23:07:35.670701 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-api" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.670758 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-api" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.670968 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-log" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.671057 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e375a6-999c-4272-a25c-a8b87239095d" containerName="nova-api-api" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.672298 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.674683 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.674926 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.675053 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.680514 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.710662 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.710740 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-config-data\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.710764 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.710829 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b665f9c9-6923-456e-8e7e-01e55da9d3af-logs\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.710850 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-public-tls-certs\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.710867 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd67m\" (UniqueName: \"kubernetes.io/projected/b665f9c9-6923-456e-8e7e-01e55da9d3af-kube-api-access-kd67m\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.812174 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-config-data\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.812227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.812317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b665f9c9-6923-456e-8e7e-01e55da9d3af-logs\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc 
kubenswrapper[4771]: I0219 23:07:35.812347 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-public-tls-certs\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.812372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd67m\" (UniqueName: \"kubernetes.io/projected/b665f9c9-6923-456e-8e7e-01e55da9d3af-kube-api-access-kd67m\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.813225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.814375 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b665f9c9-6923-456e-8e7e-01e55da9d3af-logs\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.816049 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.816182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.816604 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-config-data\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.817052 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:35 crc kubenswrapper[4771]: I0219 23:07:35.828344 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd67m\" (UniqueName: \"kubernetes.io/projected/b665f9c9-6923-456e-8e7e-01e55da9d3af-kube-api-access-kd67m\") pod \"nova-api-0\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " pod="openstack/nova-api-0" Feb 19 23:07:36 crc kubenswrapper[4771]: I0219 23:07:36.026459 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:07:36 crc kubenswrapper[4771]: I0219 23:07:36.453692 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e375a6-999c-4272-a25c-a8b87239095d" path="/var/lib/kubelet/pods/96e375a6-999c-4272-a25c-a8b87239095d/volumes" Feb 19 23:07:36 crc kubenswrapper[4771]: I0219 23:07:36.502883 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:07:36 crc kubenswrapper[4771]: I0219 23:07:36.613534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b665f9c9-6923-456e-8e7e-01e55da9d3af","Type":"ContainerStarted","Data":"69dd320a71f64bd79198e5ffb2eb3f857f73532cc4923eb2cdc9819fc501d46a"} Feb 19 23:07:37 crc kubenswrapper[4771]: I0219 23:07:37.626964 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b665f9c9-6923-456e-8e7e-01e55da9d3af","Type":"ContainerStarted","Data":"dad4a04c947f88686d8a1958e783ebbcd808b0638945c37446ccb842f6bfda10"} Feb 19 23:07:37 crc kubenswrapper[4771]: I0219 23:07:37.627354 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b665f9c9-6923-456e-8e7e-01e55da9d3af","Type":"ContainerStarted","Data":"6a5c974672114f685f86c13e4161538433eb963951f5922fc576ce2a839ec846"} Feb 19 23:07:37 crc kubenswrapper[4771]: I0219 23:07:37.667499 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.667473518 podStartE2EDuration="2.667473518s" podCreationTimestamp="2026-02-19 23:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:07:37.666161362 +0000 UTC m=+5957.937603882" watchObservedRunningTime="2026-02-19 23:07:37.667473518 +0000 UTC m=+5957.938915998" Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.063324 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.154910 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b64ddc7-k89m4"] Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.155637 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" podUID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" containerName="dnsmasq-dns" containerID="cri-o://e60abef7e799ddede668637c96e1d038d8ca96499e37e7b44ef9bd2e9c7c3ee0" gracePeriod=10 Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.650467 4771 generic.go:334] "Generic (PLEG): container finished" podID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" containerID="e60abef7e799ddede668637c96e1d038d8ca96499e37e7b44ef9bd2e9c7c3ee0" exitCode=0 Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.650511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" event={"ID":"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac","Type":"ContainerDied","Data":"e60abef7e799ddede668637c96e1d038d8ca96499e37e7b44ef9bd2e9c7c3ee0"} Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.771174 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.924727 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt4wd\" (UniqueName: \"kubernetes.io/projected/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-kube-api-access-pt4wd\") pod \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.924870 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-dns-svc\") pod \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.924896 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-sb\") pod \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.924949 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-config\") pod \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.924980 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-nb\") pod \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\" (UID: \"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac\") " Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.931296 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-kube-api-access-pt4wd" (OuterVolumeSpecName: "kube-api-access-pt4wd") pod "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" (UID: "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac"). InnerVolumeSpecName "kube-api-access-pt4wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.988986 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" (UID: "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.989632 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-config" (OuterVolumeSpecName: "config") pod "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" (UID: "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:07:39 crc kubenswrapper[4771]: I0219 23:07:39.990421 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" (UID: "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.006607 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" (UID: "c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.026964 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.027030 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.027050 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.027063 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.027077 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt4wd\" (UniqueName: \"kubernetes.io/projected/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac-kube-api-access-pt4wd\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.677606 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" event={"ID":"c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac","Type":"ContainerDied","Data":"93f6bb88960227fede59891ed1432dd567b41b363a1dd5d7912a50772e0590f8"} Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.677670 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b64ddc7-k89m4" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.677681 4771 scope.go:117] "RemoveContainer" containerID="e60abef7e799ddede668637c96e1d038d8ca96499e37e7b44ef9bd2e9c7c3ee0" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.726964 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b64ddc7-k89m4"] Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.727053 4771 scope.go:117] "RemoveContainer" containerID="7f75ef3ce5e26e818f49f787ada5dc2ae57d2a35f5bf3c17ee280ecb3b4cd859" Feb 19 23:07:40 crc kubenswrapper[4771]: I0219 23:07:40.737450 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57b64ddc7-k89m4"] Feb 19 23:07:42 crc kubenswrapper[4771]: I0219 23:07:42.456672 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" path="/var/lib/kubelet/pods/c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac/volumes" Feb 19 23:07:42 crc kubenswrapper[4771]: I0219 23:07:42.957447 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:07:42 crc kubenswrapper[4771]: I0219 23:07:42.957523 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:07:46 crc kubenswrapper[4771]: I0219 23:07:46.027104 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:07:46 crc kubenswrapper[4771]: I0219 23:07:46.029055 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:07:47 crc kubenswrapper[4771]: I0219 23:07:47.048323 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.107:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 23:07:47 crc kubenswrapper[4771]: I0219 23:07:47.048317 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.107:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 23:07:56 crc kubenswrapper[4771]: I0219 23:07:56.039217 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:07:56 crc kubenswrapper[4771]: I0219 23:07:56.040156 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:07:56 crc kubenswrapper[4771]: I0219 23:07:56.041230 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:07:56 crc kubenswrapper[4771]: I0219 23:07:56.042004 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:07:56 crc kubenswrapper[4771]: I0219 23:07:56.057307 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:07:56 crc kubenswrapper[4771]: I0219 23:07:56.061830 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:08:12 crc kubenswrapper[4771]: I0219 23:08:12.957077 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:08:12 crc kubenswrapper[4771]: I0219 23:08:12.957723 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.544300 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-npgwb"] Feb 19 23:08:16 crc kubenswrapper[4771]: E0219 23:08:16.545289 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" containerName="init" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.545306 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" containerName="init" Feb 19 23:08:16 crc kubenswrapper[4771]: E0219 23:08:16.545314 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" containerName="dnsmasq-dns" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.545320 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" containerName="dnsmasq-dns" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.545528 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c47dc4f8-d8a0-4b02-9f9f-2ab710fdb3ac" containerName="dnsmasq-dns" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.546264 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.548703 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.549183 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zcckm" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.550489 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.560283 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-npgwb"] Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.571428 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-xtdt7"] Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.573779 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.598101 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xtdt7"] Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.616771 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/763b0b3a-ac9e-4528-8cfc-039575701bb2-scripts\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.616827 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v797h\" (UniqueName: \"kubernetes.io/projected/763b0b3a-ac9e-4528-8cfc-039575701bb2-kube-api-access-v797h\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.616846 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-lib\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.616887 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-log\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.616920 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-run\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.616940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-etc-ovs\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.616958 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-run\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.617009 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-run-ovn\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.617056 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/763b0b3a-ac9e-4528-8cfc-039575701bb2-ovn-controller-tls-certs\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.617086 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/763b0b3a-ac9e-4528-8cfc-039575701bb2-combined-ca-bundle\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.617103 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a899f18-4f99-4fd0-95c8-5198aa786938-scripts\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.617122 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-log-ovn\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.617149 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg9lh\" (UniqueName: \"kubernetes.io/projected/7a899f18-4f99-4fd0-95c8-5198aa786938-kube-api-access-gg9lh\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.718665 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-run-ovn\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.719119 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/763b0b3a-ac9e-4528-8cfc-039575701bb2-ovn-controller-tls-certs\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.719330 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763b0b3a-ac9e-4528-8cfc-039575701bb2-combined-ca-bundle\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.719564 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a899f18-4f99-4fd0-95c8-5198aa786938-scripts\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.719801 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-log-ovn\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.719971 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-log-ovn\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.719146 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-run-ovn\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " 
pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.720394 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg9lh\" (UniqueName: \"kubernetes.io/projected/7a899f18-4f99-4fd0-95c8-5198aa786938-kube-api-access-gg9lh\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.721122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/763b0b3a-ac9e-4528-8cfc-039575701bb2-scripts\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.721331 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v797h\" (UniqueName: \"kubernetes.io/projected/763b0b3a-ac9e-4528-8cfc-039575701bb2-kube-api-access-v797h\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.721476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-lib\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.721667 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-log\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.721841 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-run\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.722007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-etc-ovs\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.722225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-run\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.722585 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/763b0b3a-ac9e-4528-8cfc-039575701bb2-var-run\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.723167 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a899f18-4f99-4fd0-95c8-5198aa786938-scripts\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.723307 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-log\") pod 
\"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.723757 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-lib\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.723847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-var-run\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.723944 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7a899f18-4f99-4fd0-95c8-5198aa786938-etc-ovs\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.725384 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/763b0b3a-ac9e-4528-8cfc-039575701bb2-ovn-controller-tls-certs\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.725789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/763b0b3a-ac9e-4528-8cfc-039575701bb2-scripts\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.726772 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763b0b3a-ac9e-4528-8cfc-039575701bb2-combined-ca-bundle\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.739139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg9lh\" (UniqueName: \"kubernetes.io/projected/7a899f18-4f99-4fd0-95c8-5198aa786938-kube-api-access-gg9lh\") pod \"ovn-controller-ovs-xtdt7\" (UID: \"7a899f18-4f99-4fd0-95c8-5198aa786938\") " pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.746521 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v797h\" (UniqueName: \"kubernetes.io/projected/763b0b3a-ac9e-4528-8cfc-039575701bb2-kube-api-access-v797h\") pod \"ovn-controller-npgwb\" (UID: \"763b0b3a-ac9e-4528-8cfc-039575701bb2\") " pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.864800 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-npgwb" Feb 19 23:08:16 crc kubenswrapper[4771]: I0219 23:08:16.898181 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:17 crc kubenswrapper[4771]: I0219 23:08:17.391631 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-npgwb"] Feb 19 23:08:17 crc kubenswrapper[4771]: I0219 23:08:17.792795 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xtdt7"] Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.067125 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xtdt7" event={"ID":"7a899f18-4f99-4fd0-95c8-5198aa786938","Type":"ContainerStarted","Data":"4355b7d9190cc5c24cdce525258d9f82de8a4cd5e56f215da01850d6f5bf98c2"} Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.067641 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xtdt7" event={"ID":"7a899f18-4f99-4fd0-95c8-5198aa786938","Type":"ContainerStarted","Data":"d0a69a57d9c2fd67b877599e62ca16a74d499ac9fa305522ea9114ccc9b3dafd"} Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.070463 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-npgwb" event={"ID":"763b0b3a-ac9e-4528-8cfc-039575701bb2","Type":"ContainerStarted","Data":"734c6793e033411f2f36368d5cc3cd0af2cca373f69a4a15a42cc8f8d4d004e6"} Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.070525 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-npgwb" event={"ID":"763b0b3a-ac9e-4528-8cfc-039575701bb2","Type":"ContainerStarted","Data":"3e0f823c35a976dda0164c80316aa13533fe2055e1bcf588f54b3ff13b724de8"} Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.070670 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-npgwb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.091810 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-q4kgb"] Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 
23:08:18.093105 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.095817 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.104873 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q4kgb"] Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.138040 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-npgwb" podStartSLOduration=2.13800489 podStartE2EDuration="2.13800489s" podCreationTimestamp="2026-02-19 23:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:08:18.130684154 +0000 UTC m=+5998.402126634" watchObservedRunningTime="2026-02-19 23:08:18.13800489 +0000 UTC m=+5998.409447360" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.163285 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb568\" (UniqueName: \"kubernetes.io/projected/d1485977-7e7a-41e4-aedc-c208c6763dd2-kube-api-access-qb568\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.163346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1485977-7e7a-41e4-aedc-c208c6763dd2-combined-ca-bundle\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.163411 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d1485977-7e7a-41e4-aedc-c208c6763dd2-ovs-rundir\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.163489 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1485977-7e7a-41e4-aedc-c208c6763dd2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.163537 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d1485977-7e7a-41e4-aedc-c208c6763dd2-ovn-rundir\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.163554 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1485977-7e7a-41e4-aedc-c208c6763dd2-config\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.266175 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1485977-7e7a-41e4-aedc-c208c6763dd2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.266267 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d1485977-7e7a-41e4-aedc-c208c6763dd2-ovn-rundir\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.266293 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1485977-7e7a-41e4-aedc-c208c6763dd2-config\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.266412 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb568\" (UniqueName: \"kubernetes.io/projected/d1485977-7e7a-41e4-aedc-c208c6763dd2-kube-api-access-qb568\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.266442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1485977-7e7a-41e4-aedc-c208c6763dd2-combined-ca-bundle\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.266476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d1485977-7e7a-41e4-aedc-c208c6763dd2-ovs-rundir\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.266656 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d1485977-7e7a-41e4-aedc-c208c6763dd2-ovn-rundir\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.266718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d1485977-7e7a-41e4-aedc-c208c6763dd2-ovs-rundir\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.267422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1485977-7e7a-41e4-aedc-c208c6763dd2-config\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.283718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1485977-7e7a-41e4-aedc-c208c6763dd2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.284010 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1485977-7e7a-41e4-aedc-c208c6763dd2-combined-ca-bundle\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.287491 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb568\" (UniqueName: 
\"kubernetes.io/projected/d1485977-7e7a-41e4-aedc-c208c6763dd2-kube-api-access-qb568\") pod \"ovn-controller-metrics-q4kgb\" (UID: \"d1485977-7e7a-41e4-aedc-c208c6763dd2\") " pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.409215 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-q4kgb" Feb 19 23:08:18 crc kubenswrapper[4771]: I0219 23:08:18.853935 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q4kgb"] Feb 19 23:08:19 crc kubenswrapper[4771]: I0219 23:08:19.086740 4771 generic.go:334] "Generic (PLEG): container finished" podID="7a899f18-4f99-4fd0-95c8-5198aa786938" containerID="4355b7d9190cc5c24cdce525258d9f82de8a4cd5e56f215da01850d6f5bf98c2" exitCode=0 Feb 19 23:08:19 crc kubenswrapper[4771]: I0219 23:08:19.089518 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xtdt7" event={"ID":"7a899f18-4f99-4fd0-95c8-5198aa786938","Type":"ContainerDied","Data":"4355b7d9190cc5c24cdce525258d9f82de8a4cd5e56f215da01850d6f5bf98c2"} Feb 19 23:08:19 crc kubenswrapper[4771]: I0219 23:08:19.091503 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q4kgb" event={"ID":"d1485977-7e7a-41e4-aedc-c208c6763dd2","Type":"ContainerStarted","Data":"cbd5b00a36daff5a7aa06ca46d0d065c61902f2e8d61eb043af39b5d15be3820"} Feb 19 23:08:20 crc kubenswrapper[4771]: I0219 23:08:20.120293 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xtdt7" event={"ID":"7a899f18-4f99-4fd0-95c8-5198aa786938","Type":"ContainerStarted","Data":"cbf7eb33bf56e1ef8708d625d92a47ed557340e6900a2cfd2afdf132c066693c"} Feb 19 23:08:20 crc kubenswrapper[4771]: I0219 23:08:20.121130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xtdt7" 
event={"ID":"7a899f18-4f99-4fd0-95c8-5198aa786938","Type":"ContainerStarted","Data":"320675c0038edc20601b036fc344c81906891f7d59e3074887e20fb0d335d9e2"} Feb 19 23:08:20 crc kubenswrapper[4771]: I0219 23:08:20.121210 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:20 crc kubenswrapper[4771]: I0219 23:08:20.121236 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:20 crc kubenswrapper[4771]: I0219 23:08:20.122974 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q4kgb" event={"ID":"d1485977-7e7a-41e4-aedc-c208c6763dd2","Type":"ContainerStarted","Data":"fd5f22e6ed0e49ddc4ab01d2172cd0ec7594d793ff9251f012ef5fc08ab07f63"} Feb 19 23:08:20 crc kubenswrapper[4771]: I0219 23:08:20.147610 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-xtdt7" podStartSLOduration=4.1475954 podStartE2EDuration="4.1475954s" podCreationTimestamp="2026-02-19 23:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:08:20.141519068 +0000 UTC m=+6000.412961548" watchObservedRunningTime="2026-02-19 23:08:20.1475954 +0000 UTC m=+6000.419037860" Feb 19 23:08:20 crc kubenswrapper[4771]: I0219 23:08:20.167574 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-q4kgb" podStartSLOduration=2.167551993 podStartE2EDuration="2.167551993s" podCreationTimestamp="2026-02-19 23:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:08:20.160509015 +0000 UTC m=+6000.431951495" watchObservedRunningTime="2026-02-19 23:08:20.167551993 +0000 UTC m=+6000.438994473" Feb 19 23:08:29 crc kubenswrapper[4771]: I0219 
23:08:29.060382 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dfx8t"] Feb 19 23:08:29 crc kubenswrapper[4771]: I0219 23:08:29.077063 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5f8b-account-create-update-d6cch"] Feb 19 23:08:29 crc kubenswrapper[4771]: I0219 23:08:29.088234 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dfx8t"] Feb 19 23:08:29 crc kubenswrapper[4771]: I0219 23:08:29.097129 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5f8b-account-create-update-d6cch"] Feb 19 23:08:30 crc kubenswrapper[4771]: I0219 23:08:30.449445 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c7d2c11-273d-4266-8f85-f4dec666b9b6" path="/var/lib/kubelet/pods/6c7d2c11-273d-4266-8f85-f4dec666b9b6/volumes" Feb 19 23:08:30 crc kubenswrapper[4771]: I0219 23:08:30.450554 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7eca99-3572-4f98-a1ed-d06666e1b77a" path="/var/lib/kubelet/pods/db7eca99-3572-4f98-a1ed-d06666e1b77a/volumes" Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.263640 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-nj457"] Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.265098 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-nj457" Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.279989 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-nj457"] Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.344011 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7xkk\" (UniqueName: \"kubernetes.io/projected/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-kube-api-access-p7xkk\") pod \"octavia-db-create-nj457\" (UID: \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\") " pod="openstack/octavia-db-create-nj457" Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.344113 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-operator-scripts\") pod \"octavia-db-create-nj457\" (UID: \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\") " pod="openstack/octavia-db-create-nj457" Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.445918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7xkk\" (UniqueName: \"kubernetes.io/projected/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-kube-api-access-p7xkk\") pod \"octavia-db-create-nj457\" (UID: \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\") " pod="openstack/octavia-db-create-nj457" Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.446064 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-operator-scripts\") pod \"octavia-db-create-nj457\" (UID: \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\") " pod="openstack/octavia-db-create-nj457" Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.446957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-operator-scripts\") pod \"octavia-db-create-nj457\" (UID: \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\") " pod="openstack/octavia-db-create-nj457" Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.464837 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7xkk\" (UniqueName: \"kubernetes.io/projected/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-kube-api-access-p7xkk\") pod \"octavia-db-create-nj457\" (UID: \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\") " pod="openstack/octavia-db-create-nj457" Feb 19 23:08:31 crc kubenswrapper[4771]: I0219 23:08:31.626395 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-nj457" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.139256 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-nj457"] Feb 19 23:08:32 crc kubenswrapper[4771]: W0219 23:08:32.144205 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod575155d0_ff3a_4e31_9f9d_e7005ec5b5cb.slice/crio-39af7855a9e2321e5e18352fc5bbd66196bc4cfe56f5aef8d8fcaf234d665a81 WatchSource:0}: Error finding container 39af7855a9e2321e5e18352fc5bbd66196bc4cfe56f5aef8d8fcaf234d665a81: Status 404 returned error can't find the container with id 39af7855a9e2321e5e18352fc5bbd66196bc4cfe56f5aef8d8fcaf234d665a81 Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.274089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nj457" event={"ID":"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb","Type":"ContainerStarted","Data":"39af7855a9e2321e5e18352fc5bbd66196bc4cfe56f5aef8d8fcaf234d665a81"} Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.291194 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-98eb-account-create-update-jvwgt"] Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 
23:08:32.292783 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.295091 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.301308 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-98eb-account-create-update-jvwgt"] Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.367867 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5147a669-e152-45ec-80cb-a7318eac23a1-operator-scripts\") pod \"octavia-98eb-account-create-update-jvwgt\" (UID: \"5147a669-e152-45ec-80cb-a7318eac23a1\") " pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.368223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ww4g\" (UniqueName: \"kubernetes.io/projected/5147a669-e152-45ec-80cb-a7318eac23a1-kube-api-access-2ww4g\") pod \"octavia-98eb-account-create-update-jvwgt\" (UID: \"5147a669-e152-45ec-80cb-a7318eac23a1\") " pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.470752 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ww4g\" (UniqueName: \"kubernetes.io/projected/5147a669-e152-45ec-80cb-a7318eac23a1-kube-api-access-2ww4g\") pod \"octavia-98eb-account-create-update-jvwgt\" (UID: \"5147a669-e152-45ec-80cb-a7318eac23a1\") " pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.470974 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5147a669-e152-45ec-80cb-a7318eac23a1-operator-scripts\") pod \"octavia-98eb-account-create-update-jvwgt\" (UID: \"5147a669-e152-45ec-80cb-a7318eac23a1\") " pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.471751 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5147a669-e152-45ec-80cb-a7318eac23a1-operator-scripts\") pod \"octavia-98eb-account-create-update-jvwgt\" (UID: \"5147a669-e152-45ec-80cb-a7318eac23a1\") " pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.498703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ww4g\" (UniqueName: \"kubernetes.io/projected/5147a669-e152-45ec-80cb-a7318eac23a1-kube-api-access-2ww4g\") pod \"octavia-98eb-account-create-update-jvwgt\" (UID: \"5147a669-e152-45ec-80cb-a7318eac23a1\") " pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:32 crc kubenswrapper[4771]: I0219 23:08:32.626062 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:33 crc kubenswrapper[4771]: I0219 23:08:33.198610 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-98eb-account-create-update-jvwgt"] Feb 19 23:08:33 crc kubenswrapper[4771]: I0219 23:08:33.284852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-98eb-account-create-update-jvwgt" event={"ID":"5147a669-e152-45ec-80cb-a7318eac23a1","Type":"ContainerStarted","Data":"7814edc711a0f6d53d73f29da2c53d5c9c3a948fa71d99980a537d1b5274ee9a"} Feb 19 23:08:33 crc kubenswrapper[4771]: I0219 23:08:33.288407 4771 generic.go:334] "Generic (PLEG): container finished" podID="575155d0-ff3a-4e31-9f9d-e7005ec5b5cb" containerID="5762d750b90d48a30a4ebf135b72e2a9a608059df1041b0bcc28566cdf3efd5a" exitCode=0 Feb 19 23:08:33 crc kubenswrapper[4771]: I0219 23:08:33.288461 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nj457" event={"ID":"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb","Type":"ContainerDied","Data":"5762d750b90d48a30a4ebf135b72e2a9a608059df1041b0bcc28566cdf3efd5a"} Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.329624 4771 generic.go:334] "Generic (PLEG): container finished" podID="5147a669-e152-45ec-80cb-a7318eac23a1" containerID="7a110e52caf3d8ab491cb6f9085ce7bdf86c6ca494e48c82b2a9e741ca866d36" exitCode=0 Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.330208 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-98eb-account-create-update-jvwgt" event={"ID":"5147a669-e152-45ec-80cb-a7318eac23a1","Type":"ContainerDied","Data":"7a110e52caf3d8ab491cb6f9085ce7bdf86c6ca494e48c82b2a9e741ca866d36"} Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.716337 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-nj457" Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.826120 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7xkk\" (UniqueName: \"kubernetes.io/projected/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-kube-api-access-p7xkk\") pod \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\" (UID: \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\") " Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.826210 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-operator-scripts\") pod \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\" (UID: \"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb\") " Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.827955 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "575155d0-ff3a-4e31-9f9d-e7005ec5b5cb" (UID: "575155d0-ff3a-4e31-9f9d-e7005ec5b5cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.836179 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-kube-api-access-p7xkk" (OuterVolumeSpecName: "kube-api-access-p7xkk") pod "575155d0-ff3a-4e31-9f9d-e7005ec5b5cb" (UID: "575155d0-ff3a-4e31-9f9d-e7005ec5b5cb"). InnerVolumeSpecName "kube-api-access-p7xkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.928949 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7xkk\" (UniqueName: \"kubernetes.io/projected/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-kube-api-access-p7xkk\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:34 crc kubenswrapper[4771]: I0219 23:08:34.929301 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.346591 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nj457" event={"ID":"575155d0-ff3a-4e31-9f9d-e7005ec5b5cb","Type":"ContainerDied","Data":"39af7855a9e2321e5e18352fc5bbd66196bc4cfe56f5aef8d8fcaf234d665a81"} Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.346641 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-nj457" Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.346666 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39af7855a9e2321e5e18352fc5bbd66196bc4cfe56f5aef8d8fcaf234d665a81" Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.766140 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.847509 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5147a669-e152-45ec-80cb-a7318eac23a1-operator-scripts\") pod \"5147a669-e152-45ec-80cb-a7318eac23a1\" (UID: \"5147a669-e152-45ec-80cb-a7318eac23a1\") " Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.847836 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ww4g\" (UniqueName: \"kubernetes.io/projected/5147a669-e152-45ec-80cb-a7318eac23a1-kube-api-access-2ww4g\") pod \"5147a669-e152-45ec-80cb-a7318eac23a1\" (UID: \"5147a669-e152-45ec-80cb-a7318eac23a1\") " Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.848282 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5147a669-e152-45ec-80cb-a7318eac23a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5147a669-e152-45ec-80cb-a7318eac23a1" (UID: "5147a669-e152-45ec-80cb-a7318eac23a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.848621 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5147a669-e152-45ec-80cb-a7318eac23a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.854733 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5147a669-e152-45ec-80cb-a7318eac23a1-kube-api-access-2ww4g" (OuterVolumeSpecName: "kube-api-access-2ww4g") pod "5147a669-e152-45ec-80cb-a7318eac23a1" (UID: "5147a669-e152-45ec-80cb-a7318eac23a1"). InnerVolumeSpecName "kube-api-access-2ww4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:35 crc kubenswrapper[4771]: I0219 23:08:35.949575 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ww4g\" (UniqueName: \"kubernetes.io/projected/5147a669-e152-45ec-80cb-a7318eac23a1-kube-api-access-2ww4g\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:36 crc kubenswrapper[4771]: I0219 23:08:36.061278 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-k7869"] Feb 19 23:08:36 crc kubenswrapper[4771]: I0219 23:08:36.072122 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-k7869"] Feb 19 23:08:36 crc kubenswrapper[4771]: I0219 23:08:36.212696 4771 scope.go:117] "RemoveContainer" containerID="ad53b8b8a68c591c493fcd858b6b131fab7cad18e7607df8c53f89e08f686ba9" Feb 19 23:08:36 crc kubenswrapper[4771]: I0219 23:08:36.242549 4771 scope.go:117] "RemoveContainer" containerID="fdba08747089b717ccfe547f67abe469c1570a8c25cd24e10608444785edee00" Feb 19 23:08:36 crc kubenswrapper[4771]: I0219 23:08:36.363025 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-98eb-account-create-update-jvwgt" event={"ID":"5147a669-e152-45ec-80cb-a7318eac23a1","Type":"ContainerDied","Data":"7814edc711a0f6d53d73f29da2c53d5c9c3a948fa71d99980a537d1b5274ee9a"} Feb 19 23:08:36 crc kubenswrapper[4771]: I0219 23:08:36.363063 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7814edc711a0f6d53d73f29da2c53d5c9c3a948fa71d99980a537d1b5274ee9a" Feb 19 23:08:36 crc kubenswrapper[4771]: I0219 23:08:36.363213 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-98eb-account-create-update-jvwgt" Feb 19 23:08:36 crc kubenswrapper[4771]: I0219 23:08:36.451399 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="504b9144-58a8-4842-b5bf-12d4d4498a38" path="/var/lib/kubelet/pods/504b9144-58a8-4842-b5bf-12d4d4498a38/volumes" Feb 19 23:08:37 crc kubenswrapper[4771]: I0219 23:08:37.878862 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-trz9l"] Feb 19 23:08:37 crc kubenswrapper[4771]: E0219 23:08:37.879812 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575155d0-ff3a-4e31-9f9d-e7005ec5b5cb" containerName="mariadb-database-create" Feb 19 23:08:37 crc kubenswrapper[4771]: I0219 23:08:37.879832 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="575155d0-ff3a-4e31-9f9d-e7005ec5b5cb" containerName="mariadb-database-create" Feb 19 23:08:37 crc kubenswrapper[4771]: E0219 23:08:37.879858 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5147a669-e152-45ec-80cb-a7318eac23a1" containerName="mariadb-account-create-update" Feb 19 23:08:37 crc kubenswrapper[4771]: I0219 23:08:37.879867 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5147a669-e152-45ec-80cb-a7318eac23a1" containerName="mariadb-account-create-update" Feb 19 23:08:37 crc kubenswrapper[4771]: I0219 23:08:37.880120 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5147a669-e152-45ec-80cb-a7318eac23a1" containerName="mariadb-account-create-update" Feb 19 23:08:37 crc kubenswrapper[4771]: I0219 23:08:37.880148 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="575155d0-ff3a-4e31-9f9d-e7005ec5b5cb" containerName="mariadb-database-create" Feb 19 23:08:37 crc kubenswrapper[4771]: I0219 23:08:37.880951 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:37 crc kubenswrapper[4771]: I0219 23:08:37.906137 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-trz9l"] Feb 19 23:08:38 crc kubenswrapper[4771]: I0219 23:08:38.016607 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8k9z\" (UniqueName: \"kubernetes.io/projected/9a63473d-4647-4d2d-bdd3-cab040945fa0-kube-api-access-q8k9z\") pod \"octavia-persistence-db-create-trz9l\" (UID: \"9a63473d-4647-4d2d-bdd3-cab040945fa0\") " pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:38 crc kubenswrapper[4771]: I0219 23:08:38.016927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a63473d-4647-4d2d-bdd3-cab040945fa0-operator-scripts\") pod \"octavia-persistence-db-create-trz9l\" (UID: \"9a63473d-4647-4d2d-bdd3-cab040945fa0\") " pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:38 crc kubenswrapper[4771]: I0219 23:08:38.119375 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8k9z\" (UniqueName: \"kubernetes.io/projected/9a63473d-4647-4d2d-bdd3-cab040945fa0-kube-api-access-q8k9z\") pod \"octavia-persistence-db-create-trz9l\" (UID: \"9a63473d-4647-4d2d-bdd3-cab040945fa0\") " pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:38 crc kubenswrapper[4771]: I0219 23:08:38.119561 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a63473d-4647-4d2d-bdd3-cab040945fa0-operator-scripts\") pod \"octavia-persistence-db-create-trz9l\" (UID: \"9a63473d-4647-4d2d-bdd3-cab040945fa0\") " pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:38 crc kubenswrapper[4771]: I0219 23:08:38.121066 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a63473d-4647-4d2d-bdd3-cab040945fa0-operator-scripts\") pod \"octavia-persistence-db-create-trz9l\" (UID: \"9a63473d-4647-4d2d-bdd3-cab040945fa0\") " pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:38 crc kubenswrapper[4771]: I0219 23:08:38.141859 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8k9z\" (UniqueName: \"kubernetes.io/projected/9a63473d-4647-4d2d-bdd3-cab040945fa0-kube-api-access-q8k9z\") pod \"octavia-persistence-db-create-trz9l\" (UID: \"9a63473d-4647-4d2d-bdd3-cab040945fa0\") " pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:38 crc kubenswrapper[4771]: I0219 23:08:38.210376 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:38 crc kubenswrapper[4771]: I0219 23:08:38.737361 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-trz9l"] Feb 19 23:08:38 crc kubenswrapper[4771]: W0219 23:08:38.746455 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a63473d_4647_4d2d_bdd3_cab040945fa0.slice/crio-aa7ec0fee56f28c8fb1df851546192b4fffe6450b735a237a987fb868721412b WatchSource:0}: Error finding container aa7ec0fee56f28c8fb1df851546192b4fffe6450b735a237a987fb868721412b: Status 404 returned error can't find the container with id aa7ec0fee56f28c8fb1df851546192b4fffe6450b735a237a987fb868721412b Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.275894 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-69c1-account-create-update-zxdsx"] Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.277229 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.281293 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.297925 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-69c1-account-create-update-zxdsx"] Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.346124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qd9\" (UniqueName: \"kubernetes.io/projected/0fd8c186-0f03-411e-ae54-85ad1e70b87b-kube-api-access-x9qd9\") pod \"octavia-69c1-account-create-update-zxdsx\" (UID: \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\") " pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.346453 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd8c186-0f03-411e-ae54-85ad1e70b87b-operator-scripts\") pod \"octavia-69c1-account-create-update-zxdsx\" (UID: \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\") " pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.398531 4771 generic.go:334] "Generic (PLEG): container finished" podID="9a63473d-4647-4d2d-bdd3-cab040945fa0" containerID="78d2c2dffe1289bb8cf51c4b139304cc0ad91d4703a84dd185bc79e61f7451b7" exitCode=0 Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.398625 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-trz9l" event={"ID":"9a63473d-4647-4d2d-bdd3-cab040945fa0","Type":"ContainerDied","Data":"78d2c2dffe1289bb8cf51c4b139304cc0ad91d4703a84dd185bc79e61f7451b7"} Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.398743 4771 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/octavia-persistence-db-create-trz9l" event={"ID":"9a63473d-4647-4d2d-bdd3-cab040945fa0","Type":"ContainerStarted","Data":"aa7ec0fee56f28c8fb1df851546192b4fffe6450b735a237a987fb868721412b"} Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.448276 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd8c186-0f03-411e-ae54-85ad1e70b87b-operator-scripts\") pod \"octavia-69c1-account-create-update-zxdsx\" (UID: \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\") " pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.449555 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qd9\" (UniqueName: \"kubernetes.io/projected/0fd8c186-0f03-411e-ae54-85ad1e70b87b-kube-api-access-x9qd9\") pod \"octavia-69c1-account-create-update-zxdsx\" (UID: \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\") " pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.452760 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd8c186-0f03-411e-ae54-85ad1e70b87b-operator-scripts\") pod \"octavia-69c1-account-create-update-zxdsx\" (UID: \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\") " pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.475124 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qd9\" (UniqueName: \"kubernetes.io/projected/0fd8c186-0f03-411e-ae54-85ad1e70b87b-kube-api-access-x9qd9\") pod \"octavia-69c1-account-create-update-zxdsx\" (UID: \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\") " pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.595803 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:39 crc kubenswrapper[4771]: W0219 23:08:39.878460 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fd8c186_0f03_411e_ae54_85ad1e70b87b.slice/crio-411fa589d3b2dc2d9772b20476f7c53650bdff79352b1adc78f7e9a822d696cf WatchSource:0}: Error finding container 411fa589d3b2dc2d9772b20476f7c53650bdff79352b1adc78f7e9a822d696cf: Status 404 returned error can't find the container with id 411fa589d3b2dc2d9772b20476f7c53650bdff79352b1adc78f7e9a822d696cf Feb 19 23:08:39 crc kubenswrapper[4771]: I0219 23:08:39.885132 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-69c1-account-create-update-zxdsx"] Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.414867 4771 generic.go:334] "Generic (PLEG): container finished" podID="0fd8c186-0f03-411e-ae54-85ad1e70b87b" containerID="ede17bb660b830051f01cb0f083b460b8e0931d3bc0c12d80b09579d351f11dc" exitCode=0 Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.415121 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-69c1-account-create-update-zxdsx" event={"ID":"0fd8c186-0f03-411e-ae54-85ad1e70b87b","Type":"ContainerDied","Data":"ede17bb660b830051f01cb0f083b460b8e0931d3bc0c12d80b09579d351f11dc"} Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.417131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-69c1-account-create-update-zxdsx" event={"ID":"0fd8c186-0f03-411e-ae54-85ad1e70b87b","Type":"ContainerStarted","Data":"411fa589d3b2dc2d9772b20476f7c53650bdff79352b1adc78f7e9a822d696cf"} Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.834331 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.885000 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a63473d-4647-4d2d-bdd3-cab040945fa0-operator-scripts\") pod \"9a63473d-4647-4d2d-bdd3-cab040945fa0\" (UID: \"9a63473d-4647-4d2d-bdd3-cab040945fa0\") " Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.885502 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8k9z\" (UniqueName: \"kubernetes.io/projected/9a63473d-4647-4d2d-bdd3-cab040945fa0-kube-api-access-q8k9z\") pod \"9a63473d-4647-4d2d-bdd3-cab040945fa0\" (UID: \"9a63473d-4647-4d2d-bdd3-cab040945fa0\") " Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.893601 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a63473d-4647-4d2d-bdd3-cab040945fa0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a63473d-4647-4d2d-bdd3-cab040945fa0" (UID: "9a63473d-4647-4d2d-bdd3-cab040945fa0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.904396 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a63473d-4647-4d2d-bdd3-cab040945fa0-kube-api-access-q8k9z" (OuterVolumeSpecName: "kube-api-access-q8k9z") pod "9a63473d-4647-4d2d-bdd3-cab040945fa0" (UID: "9a63473d-4647-4d2d-bdd3-cab040945fa0"). InnerVolumeSpecName "kube-api-access-q8k9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.988930 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a63473d-4647-4d2d-bdd3-cab040945fa0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:40 crc kubenswrapper[4771]: I0219 23:08:40.989204 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8k9z\" (UniqueName: \"kubernetes.io/projected/9a63473d-4647-4d2d-bdd3-cab040945fa0-kube-api-access-q8k9z\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:41 crc kubenswrapper[4771]: I0219 23:08:41.428011 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-trz9l" Feb 19 23:08:41 crc kubenswrapper[4771]: I0219 23:08:41.428065 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-trz9l" event={"ID":"9a63473d-4647-4d2d-bdd3-cab040945fa0","Type":"ContainerDied","Data":"aa7ec0fee56f28c8fb1df851546192b4fffe6450b735a237a987fb868721412b"} Feb 19 23:08:41 crc kubenswrapper[4771]: I0219 23:08:41.428155 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa7ec0fee56f28c8fb1df851546192b4fffe6450b735a237a987fb868721412b" Feb 19 23:08:41 crc kubenswrapper[4771]: I0219 23:08:41.853677 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:41 crc kubenswrapper[4771]: I0219 23:08:41.909914 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9qd9\" (UniqueName: \"kubernetes.io/projected/0fd8c186-0f03-411e-ae54-85ad1e70b87b-kube-api-access-x9qd9\") pod \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\" (UID: \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\") " Feb 19 23:08:41 crc kubenswrapper[4771]: I0219 23:08:41.910058 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd8c186-0f03-411e-ae54-85ad1e70b87b-operator-scripts\") pod \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\" (UID: \"0fd8c186-0f03-411e-ae54-85ad1e70b87b\") " Feb 19 23:08:41 crc kubenswrapper[4771]: I0219 23:08:41.911113 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd8c186-0f03-411e-ae54-85ad1e70b87b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fd8c186-0f03-411e-ae54-85ad1e70b87b" (UID: "0fd8c186-0f03-411e-ae54-85ad1e70b87b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:41 crc kubenswrapper[4771]: I0219 23:08:41.919362 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd8c186-0f03-411e-ae54-85ad1e70b87b-kube-api-access-x9qd9" (OuterVolumeSpecName: "kube-api-access-x9qd9") pod "0fd8c186-0f03-411e-ae54-85ad1e70b87b" (UID: "0fd8c186-0f03-411e-ae54-85ad1e70b87b"). InnerVolumeSpecName "kube-api-access-x9qd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.013397 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9qd9\" (UniqueName: \"kubernetes.io/projected/0fd8c186-0f03-411e-ae54-85ad1e70b87b-kube-api-access-x9qd9\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.013458 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd8c186-0f03-411e-ae54-85ad1e70b87b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.442645 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-69c1-account-create-update-zxdsx" Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.458221 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-69c1-account-create-update-zxdsx" event={"ID":"0fd8c186-0f03-411e-ae54-85ad1e70b87b","Type":"ContainerDied","Data":"411fa589d3b2dc2d9772b20476f7c53650bdff79352b1adc78f7e9a822d696cf"} Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.458279 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="411fa589d3b2dc2d9772b20476f7c53650bdff79352b1adc78f7e9a822d696cf" Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.960334 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.960437 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.960531 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.962189 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:08:42 crc kubenswrapper[4771]: I0219 23:08:42.962296 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" gracePeriod=600 Feb 19 23:08:43 crc kubenswrapper[4771]: E0219 23:08:43.093155 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:08:43 crc kubenswrapper[4771]: I0219 23:08:43.460474 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" exitCode=0 Feb 19 23:08:43 crc kubenswrapper[4771]: I0219 23:08:43.460550 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277"} Feb 19 23:08:43 crc kubenswrapper[4771]: I0219 23:08:43.460617 4771 scope.go:117] "RemoveContainer" containerID="c2b2a4906ade6d26077f231564fac49a46292a6f1358314d170979c9e439e6b3" Feb 19 23:08:43 crc kubenswrapper[4771]: I0219 23:08:43.461498 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:08:43 crc kubenswrapper[4771]: E0219 23:08:43.462084 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.093221 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-758bbfd74d-tdr78"] Feb 19 23:08:45 crc kubenswrapper[4771]: E0219 23:08:45.093795 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd8c186-0f03-411e-ae54-85ad1e70b87b" containerName="mariadb-account-create-update" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.093808 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd8c186-0f03-411e-ae54-85ad1e70b87b" containerName="mariadb-account-create-update" Feb 19 23:08:45 crc kubenswrapper[4771]: E0219 23:08:45.093821 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a63473d-4647-4d2d-bdd3-cab040945fa0" containerName="mariadb-database-create" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.093827 4771 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9a63473d-4647-4d2d-bdd3-cab040945fa0" containerName="mariadb-database-create" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.093987 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a63473d-4647-4d2d-bdd3-cab040945fa0" containerName="mariadb-database-create" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.094004 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd8c186-0f03-411e-ae54-85ad1e70b87b" containerName="mariadb-account-create-update" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.095221 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.102505 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.102676 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.103263 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-2sz8m" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.103381 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.118929 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-758bbfd74d-tdr78"] Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.191468 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-config-data\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 
23:08:45.191537 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-octavia-run\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.191571 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-config-data-merged\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.191610 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-scripts\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.191678 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-ovndb-tls-certs\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.191794 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-combined-ca-bundle\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 
crc kubenswrapper[4771]: I0219 23:08:45.293363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-ovndb-tls-certs\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.293409 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-combined-ca-bundle\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.293437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-config-data\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.293478 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-octavia-run\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.293508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-config-data-merged\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.293546 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-scripts\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.294162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-octavia-run\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.294206 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-config-data-merged\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.299697 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-combined-ca-bundle\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.308691 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-ovndb-tls-certs\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.308788 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-config-data\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.308976 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-scripts\") pod \"octavia-api-758bbfd74d-tdr78\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:45 crc kubenswrapper[4771]: I0219 23:08:45.421000 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:46 crc kubenswrapper[4771]: I0219 23:08:46.049192 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-758bbfd74d-tdr78"] Feb 19 23:08:46 crc kubenswrapper[4771]: I0219 23:08:46.520182 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-758bbfd74d-tdr78" event={"ID":"7f27156a-ca86-4640-8182-8b6b17841a92","Type":"ContainerStarted","Data":"2db33486e97d2e519db9d8bb8aa718509fb14a2c82ddaf1e97bdf656f175f08b"} Feb 19 23:08:50 crc kubenswrapper[4771]: I0219 23:08:50.055409 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-d2jgk"] Feb 19 23:08:50 crc kubenswrapper[4771]: I0219 23:08:50.064855 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-d2jgk"] Feb 19 23:08:50 crc kubenswrapper[4771]: I0219 23:08:50.450966 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c81a18f-e0b7-4557-8789-62e97a5950af" path="/var/lib/kubelet/pods/2c81a18f-e0b7-4557-8789-62e97a5950af/volumes" Feb 19 23:08:51 crc kubenswrapper[4771]: I0219 23:08:51.927801 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-npgwb" Feb 19 23:08:51 crc 
kubenswrapper[4771]: I0219 23:08:51.958189 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:51 crc kubenswrapper[4771]: I0219 23:08:51.962007 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xtdt7" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.084675 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-npgwb-config-8lbjl"] Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.086336 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.090745 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.115880 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-npgwb-config-8lbjl"] Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.258549 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-additional-scripts\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.258598 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-scripts\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.258634 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/e9692d95-aa9f-4015-b630-520a4c628f80-kube-api-access-xr55h\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.258692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run-ovn\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.258758 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.258773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-log-ovn\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.360090 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 
23:08:52.360127 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-log-ovn\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.360188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-additional-scripts\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.360207 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-scripts\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.360239 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/e9692d95-aa9f-4015-b630-520a4c628f80-kube-api-access-xr55h\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.360298 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run-ovn\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 
23:08:52.360598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run-ovn\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.360649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.360681 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-log-ovn\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.361663 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-additional-scripts\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.363142 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-scripts\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.394314 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/e9692d95-aa9f-4015-b630-520a4c628f80-kube-api-access-xr55h\") pod \"ovn-controller-npgwb-config-8lbjl\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:52 crc kubenswrapper[4771]: I0219 23:08:52.406810 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:55 crc kubenswrapper[4771]: I0219 23:08:55.437405 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:08:55 crc kubenswrapper[4771]: E0219 23:08:55.438197 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:08:56 crc kubenswrapper[4771]: I0219 23:08:56.329842 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-npgwb-config-8lbjl"] Feb 19 23:08:56 crc kubenswrapper[4771]: W0219 23:08:56.338426 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9692d95_aa9f_4015_b630_520a4c628f80.slice/crio-17c345aacdd5bb27f81092ebb64fb3c4f6b6693cc330c8c144001085ff74730e WatchSource:0}: Error finding container 17c345aacdd5bb27f81092ebb64fb3c4f6b6693cc330c8c144001085ff74730e: Status 404 returned error can't find the container with id 17c345aacdd5bb27f81092ebb64fb3c4f6b6693cc330c8c144001085ff74730e Feb 19 23:08:56 crc kubenswrapper[4771]: I0219 23:08:56.618713 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-npgwb-config-8lbjl" event={"ID":"e9692d95-aa9f-4015-b630-520a4c628f80","Type":"ContainerStarted","Data":"fe40feaf58a49738ad31ae71ed2fcd99a805500e09586e56da67a6be38f8ce1b"} Feb 19 23:08:56 crc kubenswrapper[4771]: I0219 23:08:56.619009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-npgwb-config-8lbjl" event={"ID":"e9692d95-aa9f-4015-b630-520a4c628f80","Type":"ContainerStarted","Data":"17c345aacdd5bb27f81092ebb64fb3c4f6b6693cc330c8c144001085ff74730e"} Feb 19 23:08:56 crc kubenswrapper[4771]: I0219 23:08:56.621947 4771 generic.go:334] "Generic (PLEG): container finished" podID="7f27156a-ca86-4640-8182-8b6b17841a92" containerID="536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320" exitCode=0 Feb 19 23:08:56 crc kubenswrapper[4771]: I0219 23:08:56.622038 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-758bbfd74d-tdr78" event={"ID":"7f27156a-ca86-4640-8182-8b6b17841a92","Type":"ContainerDied","Data":"536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320"} Feb 19 23:08:56 crc kubenswrapper[4771]: I0219 23:08:56.648903 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-npgwb-config-8lbjl" podStartSLOduration=4.64798467 podStartE2EDuration="4.64798467s" podCreationTimestamp="2026-02-19 23:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:08:56.634295345 +0000 UTC m=+6036.905737825" watchObservedRunningTime="2026-02-19 23:08:56.64798467 +0000 UTC m=+6036.919427140" Feb 19 23:08:57 crc kubenswrapper[4771]: I0219 23:08:57.638269 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-758bbfd74d-tdr78" event={"ID":"7f27156a-ca86-4640-8182-8b6b17841a92","Type":"ContainerStarted","Data":"65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488"} Feb 19 23:08:57 crc 
kubenswrapper[4771]: I0219 23:08:57.638669 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-758bbfd74d-tdr78" event={"ID":"7f27156a-ca86-4640-8182-8b6b17841a92","Type":"ContainerStarted","Data":"deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f"} Feb 19 23:08:57 crc kubenswrapper[4771]: I0219 23:08:57.639345 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:57 crc kubenswrapper[4771]: I0219 23:08:57.639412 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:08:57 crc kubenswrapper[4771]: I0219 23:08:57.641549 4771 generic.go:334] "Generic (PLEG): container finished" podID="e9692d95-aa9f-4015-b630-520a4c628f80" containerID="fe40feaf58a49738ad31ae71ed2fcd99a805500e09586e56da67a6be38f8ce1b" exitCode=0 Feb 19 23:08:57 crc kubenswrapper[4771]: I0219 23:08:57.641626 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-npgwb-config-8lbjl" event={"ID":"e9692d95-aa9f-4015-b630-520a4c628f80","Type":"ContainerDied","Data":"fe40feaf58a49738ad31ae71ed2fcd99a805500e09586e56da67a6be38f8ce1b"} Feb 19 23:08:57 crc kubenswrapper[4771]: I0219 23:08:57.668564 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-758bbfd74d-tdr78" podStartSLOduration=2.904959562 podStartE2EDuration="12.668536536s" podCreationTimestamp="2026-02-19 23:08:45 +0000 UTC" firstStartedPulling="2026-02-19 23:08:46.070841714 +0000 UTC m=+6026.342284184" lastFinishedPulling="2026-02-19 23:08:55.834418648 +0000 UTC m=+6036.105861158" observedRunningTime="2026-02-19 23:08:57.660683696 +0000 UTC m=+6037.932126186" watchObservedRunningTime="2026-02-19 23:08:57.668536536 +0000 UTC m=+6037.939979036" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.108156 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.300005 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-log-ovn\") pod \"e9692d95-aa9f-4015-b630-520a4c628f80\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.300186 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run-ovn\") pod \"e9692d95-aa9f-4015-b630-520a4c628f80\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.300234 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/e9692d95-aa9f-4015-b630-520a4c628f80-kube-api-access-xr55h\") pod \"e9692d95-aa9f-4015-b630-520a4c628f80\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.300283 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-scripts\") pod \"e9692d95-aa9f-4015-b630-520a4c628f80\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.300335 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run\") pod \"e9692d95-aa9f-4015-b630-520a4c628f80\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.300579 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-additional-scripts\") pod \"e9692d95-aa9f-4015-b630-520a4c628f80\" (UID: \"e9692d95-aa9f-4015-b630-520a4c628f80\") " Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.300563 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e9692d95-aa9f-4015-b630-520a4c628f80" (UID: "e9692d95-aa9f-4015-b630-520a4c628f80"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.300722 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run" (OuterVolumeSpecName: "var-run") pod "e9692d95-aa9f-4015-b630-520a4c628f80" (UID: "e9692d95-aa9f-4015-b630-520a4c628f80"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.301533 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e9692d95-aa9f-4015-b630-520a4c628f80" (UID: "e9692d95-aa9f-4015-b630-520a4c628f80"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.302185 4771 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.302210 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.302222 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.302260 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e9692d95-aa9f-4015-b630-520a4c628f80" (UID: "e9692d95-aa9f-4015-b630-520a4c628f80"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.302403 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-scripts" (OuterVolumeSpecName: "scripts") pod "e9692d95-aa9f-4015-b630-520a4c628f80" (UID: "e9692d95-aa9f-4015-b630-520a4c628f80"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.322301 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9692d95-aa9f-4015-b630-520a4c628f80-kube-api-access-xr55h" (OuterVolumeSpecName: "kube-api-access-xr55h") pod "e9692d95-aa9f-4015-b630-520a4c628f80" (UID: "e9692d95-aa9f-4015-b630-520a4c628f80"). InnerVolumeSpecName "kube-api-access-xr55h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.404166 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9692d95-aa9f-4015-b630-520a4c628f80-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.404206 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr55h\" (UniqueName: \"kubernetes.io/projected/e9692d95-aa9f-4015-b630-520a4c628f80-kube-api-access-xr55h\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.404221 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9692d95-aa9f-4015-b630-520a4c628f80-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.433231 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-npgwb-config-8lbjl"] Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.443329 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-npgwb-config-8lbjl"] Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.661804 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17c345aacdd5bb27f81092ebb64fb3c4f6b6693cc330c8c144001085ff74730e" Feb 19 23:08:59 crc kubenswrapper[4771]: I0219 23:08:59.661881 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-npgwb-config-8lbjl" Feb 19 23:09:00 crc kubenswrapper[4771]: I0219 23:09:00.457010 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9692d95-aa9f-4015-b630-520a4c628f80" path="/var/lib/kubelet/pods/e9692d95-aa9f-4015-b630-520a4c628f80/volumes" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.437995 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:09:06 crc kubenswrapper[4771]: E0219 23:09:06.438761 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.613332 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-pfjcs"] Feb 19 23:09:06 crc kubenswrapper[4771]: E0219 23:09:06.613874 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9692d95-aa9f-4015-b630-520a4c628f80" containerName="ovn-config" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.613906 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9692d95-aa9f-4015-b630-520a4c628f80" containerName="ovn-config" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.614241 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9692d95-aa9f-4015-b630-520a4c628f80" containerName="ovn-config" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.615767 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.621352 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.621644 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.621845 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.641762 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-pfjcs"] Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.797135 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0d7deae2-cb15-495d-9146-604c2305fd28-hm-ports\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.797463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d7deae2-cb15-495d-9146-604c2305fd28-config-data-merged\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.797536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7deae2-cb15-495d-9146-604c2305fd28-scripts\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.797600 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7deae2-cb15-495d-9146-604c2305fd28-config-data\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.899345 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0d7deae2-cb15-495d-9146-604c2305fd28-hm-ports\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.899435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d7deae2-cb15-495d-9146-604c2305fd28-config-data-merged\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.899534 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7deae2-cb15-495d-9146-604c2305fd28-scripts\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.899576 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7deae2-cb15-495d-9146-604c2305fd28-config-data\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.900262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/0d7deae2-cb15-495d-9146-604c2305fd28-config-data-merged\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.900580 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0d7deae2-cb15-495d-9146-604c2305fd28-hm-ports\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.918325 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7deae2-cb15-495d-9146-604c2305fd28-scripts\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.920871 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7deae2-cb15-495d-9146-604c2305fd28-config-data\") pod \"octavia-rsyslog-pfjcs\" (UID: \"0d7deae2-cb15-495d-9146-604c2305fd28\") " pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:06 crc kubenswrapper[4771]: I0219 23:09:06.935527 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.316321 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-qmzq4"] Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.318517 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.324477 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.341965 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-qmzq4"] Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.499037 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-pfjcs"] Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.510248 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-amphora-image\") pod \"octavia-image-upload-8d4564f8f-qmzq4\" (UID: \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\") " pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.510278 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.510371 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-httpd-config\") pod \"octavia-image-upload-8d4564f8f-qmzq4\" (UID: \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\") " pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.612853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-httpd-config\") pod \"octavia-image-upload-8d4564f8f-qmzq4\" (UID: \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\") " pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:07 crc 
kubenswrapper[4771]: I0219 23:09:07.613054 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-amphora-image\") pod \"octavia-image-upload-8d4564f8f-qmzq4\" (UID: \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\") " pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.614520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-amphora-image\") pod \"octavia-image-upload-8d4564f8f-qmzq4\" (UID: \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\") " pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.634093 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-httpd-config\") pod \"octavia-image-upload-8d4564f8f-qmzq4\" (UID: \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\") " pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.649631 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.714921 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-pfjcs"] Feb 19 23:09:07 crc kubenswrapper[4771]: I0219 23:09:07.752654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-pfjcs" event={"ID":"0d7deae2-cb15-495d-9146-604c2305fd28","Type":"ContainerStarted","Data":"f9f544e25bced7a7b4303861dde664d10547ac1eedb5be2ae318aa2f44db28ed"} Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.217642 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-qmzq4"] Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.394543 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-c98b99c64-kr7ql"] Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.396718 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.402373 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.402450 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.409781 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-c98b99c64-kr7ql"] Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.530790 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/30f2a934-f61e-4c3e-93eb-ed824e644eff-octavia-run\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: 
I0219 23:09:08.530913 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-config-data\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.530940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-public-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.530996 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-internal-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.531676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-ovndb-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.531860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-scripts\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc 
kubenswrapper[4771]: I0219 23:09:08.531970 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/30f2a934-f61e-4c3e-93eb-ed824e644eff-config-data-merged\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.532159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-combined-ca-bundle\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.637053 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-internal-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.637150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-ovndb-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.637221 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-scripts\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 
23:09:08.637265 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/30f2a934-f61e-4c3e-93eb-ed824e644eff-config-data-merged\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.637342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-combined-ca-bundle\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.637441 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/30f2a934-f61e-4c3e-93eb-ed824e644eff-octavia-run\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.637545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-config-data\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.637619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-public-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.639707 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/30f2a934-f61e-4c3e-93eb-ed824e644eff-config-data-merged\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.640065 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/30f2a934-f61e-4c3e-93eb-ed824e644eff-octavia-run\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.644773 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-public-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.646208 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-combined-ca-bundle\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.647120 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-internal-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.652499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-scripts\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.653077 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-ovndb-tls-certs\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.656486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f2a934-f61e-4c3e-93eb-ed824e644eff-config-data\") pod \"octavia-api-c98b99c64-kr7ql\" (UID: \"30f2a934-f61e-4c3e-93eb-ed824e644eff\") " pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.714591 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:08 crc kubenswrapper[4771]: I0219 23:09:08.763912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" event={"ID":"d49dc2d6-7339-4846-91ff-7b70cc01dbcc","Type":"ContainerStarted","Data":"61ccbf1e0ff21648ab321c06417a261444dc2f048a310f32210bfe72a6ae3f90"} Feb 19 23:09:09 crc kubenswrapper[4771]: I0219 23:09:09.643364 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-c98b99c64-kr7ql"] Feb 19 23:09:09 crc kubenswrapper[4771]: I0219 23:09:09.777339 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-pfjcs" event={"ID":"0d7deae2-cb15-495d-9146-604c2305fd28","Type":"ContainerStarted","Data":"c0b806176e85eab6361fdfa9fc9d1414fe38fd501e266438179cf94518557ff6"} Feb 19 23:09:10 crc kubenswrapper[4771]: I0219 23:09:10.790747 4771 generic.go:334] "Generic (PLEG): container finished" podID="30f2a934-f61e-4c3e-93eb-ed824e644eff" containerID="6195228d3e39ddd9da88849db86b3c73426a38a2a05ad75428deb1c0e528d8da" exitCode=0 Feb 19 23:09:10 crc kubenswrapper[4771]: I0219 23:09:10.790900 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-c98b99c64-kr7ql" event={"ID":"30f2a934-f61e-4c3e-93eb-ed824e644eff","Type":"ContainerDied","Data":"6195228d3e39ddd9da88849db86b3c73426a38a2a05ad75428deb1c0e528d8da"} Feb 19 23:09:10 crc kubenswrapper[4771]: I0219 23:09:10.791364 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-c98b99c64-kr7ql" event={"ID":"30f2a934-f61e-4c3e-93eb-ed824e644eff","Type":"ContainerStarted","Data":"664a0c25cddae741ae02501f945bd3cabed58b5b8379bb252b347bc728623c24"} Feb 19 23:09:11 crc kubenswrapper[4771]: I0219 23:09:11.805116 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-c98b99c64-kr7ql" 
event={"ID":"30f2a934-f61e-4c3e-93eb-ed824e644eff","Type":"ContainerStarted","Data":"393b1df79f83373ffd8ba01ca53fc4424f230435a26bca9638a92563004ec3ed"} Feb 19 23:09:11 crc kubenswrapper[4771]: I0219 23:09:11.805674 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:11 crc kubenswrapper[4771]: I0219 23:09:11.805720 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:11 crc kubenswrapper[4771]: I0219 23:09:11.805732 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-c98b99c64-kr7ql" event={"ID":"30f2a934-f61e-4c3e-93eb-ed824e644eff","Type":"ContainerStarted","Data":"f28f60eeb9c13b6c965a336f61d4734b7fb5f13e717211b3ff29567c3b1f1bd9"} Feb 19 23:09:11 crc kubenswrapper[4771]: I0219 23:09:11.810289 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d7deae2-cb15-495d-9146-604c2305fd28" containerID="c0b806176e85eab6361fdfa9fc9d1414fe38fd501e266438179cf94518557ff6" exitCode=0 Feb 19 23:09:11 crc kubenswrapper[4771]: I0219 23:09:11.810332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-pfjcs" event={"ID":"0d7deae2-cb15-495d-9146-604c2305fd28","Type":"ContainerDied","Data":"c0b806176e85eab6361fdfa9fc9d1414fe38fd501e266438179cf94518557ff6"} Feb 19 23:09:11 crc kubenswrapper[4771]: I0219 23:09:11.824443 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-c98b99c64-kr7ql" podStartSLOduration=3.8244087589999998 podStartE2EDuration="3.824408759s" podCreationTimestamp="2026-02-19 23:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:09:11.824360888 +0000 UTC m=+6052.095803378" watchObservedRunningTime="2026-02-19 23:09:11.824408759 +0000 UTC m=+6052.095851219" Feb 19 23:09:13 crc kubenswrapper[4771]: 
I0219 23:09:13.834259 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-pfjcs" event={"ID":"0d7deae2-cb15-495d-9146-604c2305fd28","Type":"ContainerStarted","Data":"7e51a09e287d56f2c8924938e23d4806c7c47ab0f7a5fbb6ed1ce763057bdba1"} Feb 19 23:09:13 crc kubenswrapper[4771]: I0219 23:09:13.835844 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:13 crc kubenswrapper[4771]: I0219 23:09:13.868259 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-pfjcs" podStartSLOduration=2.538804296 podStartE2EDuration="7.868053049s" podCreationTimestamp="2026-02-19 23:09:06 +0000 UTC" firstStartedPulling="2026-02-19 23:09:07.510055971 +0000 UTC m=+6047.781498441" lastFinishedPulling="2026-02-19 23:09:12.839304714 +0000 UTC m=+6053.110747194" observedRunningTime="2026-02-19 23:09:13.852631127 +0000 UTC m=+6054.124073667" watchObservedRunningTime="2026-02-19 23:09:13.868053049 +0000 UTC m=+6054.139495519" Feb 19 23:09:19 crc kubenswrapper[4771]: I0219 23:09:19.062898 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:09:19 crc kubenswrapper[4771]: I0219 23:09:19.074765 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:09:19 crc kubenswrapper[4771]: I0219 23:09:19.438444 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:09:19 crc kubenswrapper[4771]: E0219 23:09:19.438897 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:09:20 crc kubenswrapper[4771]: I0219 23:09:20.919689 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" event={"ID":"d49dc2d6-7339-4846-91ff-7b70cc01dbcc","Type":"ContainerStarted","Data":"b439ec6371972f81fc9d0de09f096221921c07ebd2222a259943d61562d63794"} Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.199582 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-z862b"] Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.201274 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.204759 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.213088 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-z862b"] Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.320439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-combined-ca-bundle\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.320806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-scripts\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.320933 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data-merged\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.321014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.422781 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-scripts\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.423181 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data-merged\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.423268 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.423377 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-combined-ca-bundle\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.424542 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data-merged\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.428907 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-combined-ca-bundle\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.429092 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.429806 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-scripts\") pod \"octavia-db-sync-z862b\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.538516 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:21 crc kubenswrapper[4771]: I0219 23:09:21.982904 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-pfjcs" Feb 19 23:09:22 crc kubenswrapper[4771]: I0219 23:09:22.055934 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-z862b"] Feb 19 23:09:22 crc kubenswrapper[4771]: W0219 23:09:22.062183 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb667377_e02a_4f75_824e_8ad5f3a5b5a6.slice/crio-6187b5872bc2940b06aee6fbb4e791a48fae9213badc55e0c178b720cc3f9c94 WatchSource:0}: Error finding container 6187b5872bc2940b06aee6fbb4e791a48fae9213badc55e0c178b720cc3f9c94: Status 404 returned error can't find the container with id 6187b5872bc2940b06aee6fbb4e791a48fae9213badc55e0c178b720cc3f9c94 Feb 19 23:09:22 crc kubenswrapper[4771]: I0219 23:09:22.941848 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z862b" event={"ID":"cb667377-e02a-4f75-824e-8ad5f3a5b5a6","Type":"ContainerStarted","Data":"6187b5872bc2940b06aee6fbb4e791a48fae9213badc55e0c178b720cc3f9c94"} Feb 19 23:09:23 crc kubenswrapper[4771]: I0219 23:09:23.953694 4771 generic.go:334] "Generic (PLEG): container finished" podID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" containerID="b439ec6371972f81fc9d0de09f096221921c07ebd2222a259943d61562d63794" exitCode=0 Feb 19 23:09:23 crc kubenswrapper[4771]: I0219 23:09:23.953774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" event={"ID":"d49dc2d6-7339-4846-91ff-7b70cc01dbcc","Type":"ContainerDied","Data":"b439ec6371972f81fc9d0de09f096221921c07ebd2222a259943d61562d63794"} Feb 19 23:09:23 crc kubenswrapper[4771]: I0219 23:09:23.958374 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z862b" 
event={"ID":"cb667377-e02a-4f75-824e-8ad5f3a5b5a6","Type":"ContainerStarted","Data":"3043db322a09d2d9a58d83ed5be81dfe810ae73cd9a2726f5d9f5b3c38c41962"} Feb 19 23:09:24 crc kubenswrapper[4771]: I0219 23:09:24.969565 4771 generic.go:334] "Generic (PLEG): container finished" podID="cb667377-e02a-4f75-824e-8ad5f3a5b5a6" containerID="3043db322a09d2d9a58d83ed5be81dfe810ae73cd9a2726f5d9f5b3c38c41962" exitCode=0 Feb 19 23:09:24 crc kubenswrapper[4771]: I0219 23:09:24.969714 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z862b" event={"ID":"cb667377-e02a-4f75-824e-8ad5f3a5b5a6","Type":"ContainerDied","Data":"3043db322a09d2d9a58d83ed5be81dfe810ae73cd9a2726f5d9f5b3c38c41962"} Feb 19 23:09:24 crc kubenswrapper[4771]: I0219 23:09:24.975270 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" event={"ID":"d49dc2d6-7339-4846-91ff-7b70cc01dbcc","Type":"ContainerStarted","Data":"3a5e4590057bef3f082279fb098ba644db7df42f202c2f7bf0fd4dc0a864d10c"} Feb 19 23:09:25 crc kubenswrapper[4771]: I0219 23:09:25.033358 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" podStartSLOduration=5.690141713 podStartE2EDuration="18.033339451s" podCreationTimestamp="2026-02-19 23:09:07 +0000 UTC" firstStartedPulling="2026-02-19 23:09:08.245039095 +0000 UTC m=+6048.516481565" lastFinishedPulling="2026-02-19 23:09:20.588236823 +0000 UTC m=+6060.859679303" observedRunningTime="2026-02-19 23:09:25.020408616 +0000 UTC m=+6065.291851086" watchObservedRunningTime="2026-02-19 23:09:25.033339451 +0000 UTC m=+6065.304781921" Feb 19 23:09:25 crc kubenswrapper[4771]: I0219 23:09:25.986926 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z862b" event={"ID":"cb667377-e02a-4f75-824e-8ad5f3a5b5a6","Type":"ContainerStarted","Data":"2029aebb18e7e0e6b0cc6bc15121892b709ed89f6c824173dd9d5d5ca6d83e48"} Feb 19 23:09:26 crc 
kubenswrapper[4771]: I0219 23:09:26.020824 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-z862b" podStartSLOduration=5.020796434 podStartE2EDuration="5.020796434s" podCreationTimestamp="2026-02-19 23:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:09:26.009323157 +0000 UTC m=+6066.280765647" watchObservedRunningTime="2026-02-19 23:09:26.020796434 +0000 UTC m=+6066.292238934" Feb 19 23:09:27 crc kubenswrapper[4771]: I0219 23:09:27.547572 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:27 crc kubenswrapper[4771]: I0219 23:09:27.800317 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-c98b99c64-kr7ql" Feb 19 23:09:27 crc kubenswrapper[4771]: I0219 23:09:27.883238 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-758bbfd74d-tdr78"] Feb 19 23:09:27 crc kubenswrapper[4771]: I0219 23:09:27.883629 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-758bbfd74d-tdr78" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api" containerID="cri-o://deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f" gracePeriod=30 Feb 19 23:09:27 crc kubenswrapper[4771]: I0219 23:09:27.884315 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-758bbfd74d-tdr78" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api-provider-agent" containerID="cri-o://65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488" gracePeriod=30 Feb 19 23:09:28 crc kubenswrapper[4771]: I0219 23:09:28.013117 4771 generic.go:334] "Generic (PLEG): container finished" podID="cb667377-e02a-4f75-824e-8ad5f3a5b5a6" 
containerID="2029aebb18e7e0e6b0cc6bc15121892b709ed89f6c824173dd9d5d5ca6d83e48" exitCode=0 Feb 19 23:09:28 crc kubenswrapper[4771]: I0219 23:09:28.013198 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z862b" event={"ID":"cb667377-e02a-4f75-824e-8ad5f3a5b5a6","Type":"ContainerDied","Data":"2029aebb18e7e0e6b0cc6bc15121892b709ed89f6c824173dd9d5d5ca6d83e48"} Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.028895 4771 generic.go:334] "Generic (PLEG): container finished" podID="7f27156a-ca86-4640-8182-8b6b17841a92" containerID="65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488" exitCode=0 Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.029039 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-758bbfd74d-tdr78" event={"ID":"7f27156a-ca86-4640-8182-8b6b17841a92","Type":"ContainerDied","Data":"65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488"} Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.516082 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.644749 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data\") pod \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.644845 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-combined-ca-bundle\") pod \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.644890 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-scripts\") pod \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.644985 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data-merged\") pod \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\" (UID: \"cb667377-e02a-4f75-824e-8ad5f3a5b5a6\") " Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.649686 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data" (OuterVolumeSpecName: "config-data") pod "cb667377-e02a-4f75-824e-8ad5f3a5b5a6" (UID: "cb667377-e02a-4f75-824e-8ad5f3a5b5a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.651046 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-scripts" (OuterVolumeSpecName: "scripts") pod "cb667377-e02a-4f75-824e-8ad5f3a5b5a6" (UID: "cb667377-e02a-4f75-824e-8ad5f3a5b5a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.679876 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "cb667377-e02a-4f75-824e-8ad5f3a5b5a6" (UID: "cb667377-e02a-4f75-824e-8ad5f3a5b5a6"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.706800 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb667377-e02a-4f75-824e-8ad5f3a5b5a6" (UID: "cb667377-e02a-4f75-824e-8ad5f3a5b5a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.747385 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data-merged\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.747420 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.747430 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:29 crc kubenswrapper[4771]: I0219 23:09:29.747440 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb667377-e02a-4f75-824e-8ad5f3a5b5a6-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:30 crc kubenswrapper[4771]: I0219 23:09:30.044568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z862b" event={"ID":"cb667377-e02a-4f75-824e-8ad5f3a5b5a6","Type":"ContainerDied","Data":"6187b5872bc2940b06aee6fbb4e791a48fae9213badc55e0c178b720cc3f9c94"} Feb 19 23:09:30 crc kubenswrapper[4771]: I0219 23:09:30.044613 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6187b5872bc2940b06aee6fbb4e791a48fae9213badc55e0c178b720cc3f9c94" Feb 19 23:09:30 crc kubenswrapper[4771]: I0219 23:09:30.044746 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z862b" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.043370 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/octavia-api-758bbfd74d-tdr78" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api" probeResult="failure" output="Get \"http://10.217.1.115:9876/healthcheck\": read tcp 10.217.0.2:44626->10.217.1.115:9876: read: connection reset by peer" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.043997 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/octavia-api-758bbfd74d-tdr78" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api-provider-agent" probeResult="failure" output="Get \"http://10.217.1.115:9876/healthcheck\": read tcp 10.217.0.2:44622->10.217.1.115:9876: read: connection reset by peer" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.707588 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.793129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-octavia-run\") pod \"7f27156a-ca86-4640-8182-8b6b17841a92\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.793175 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-scripts\") pod \"7f27156a-ca86-4640-8182-8b6b17841a92\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.793211 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-config-data-merged\") pod \"7f27156a-ca86-4640-8182-8b6b17841a92\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.793270 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-config-data\") pod \"7f27156a-ca86-4640-8182-8b6b17841a92\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.793287 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-combined-ca-bundle\") pod \"7f27156a-ca86-4640-8182-8b6b17841a92\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.793356 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-ovndb-tls-certs\") pod \"7f27156a-ca86-4640-8182-8b6b17841a92\" (UID: \"7f27156a-ca86-4640-8182-8b6b17841a92\") " Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.793713 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "7f27156a-ca86-4640-8182-8b6b17841a92" (UID: "7f27156a-ca86-4640-8182-8b6b17841a92"). InnerVolumeSpecName "octavia-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.794284 4771 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-octavia-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.801806 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-scripts" (OuterVolumeSpecName: "scripts") pod "7f27156a-ca86-4640-8182-8b6b17841a92" (UID: "7f27156a-ca86-4640-8182-8b6b17841a92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.801897 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-config-data" (OuterVolumeSpecName: "config-data") pod "7f27156a-ca86-4640-8182-8b6b17841a92" (UID: "7f27156a-ca86-4640-8182-8b6b17841a92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.884208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f27156a-ca86-4640-8182-8b6b17841a92" (UID: "7f27156a-ca86-4640-8182-8b6b17841a92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.895386 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.895416 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.895429 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.909952 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "7f27156a-ca86-4640-8182-8b6b17841a92" (UID: "7f27156a-ca86-4640-8182-8b6b17841a92"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.981521 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7f27156a-ca86-4640-8182-8b6b17841a92" (UID: "7f27156a-ca86-4640-8182-8b6b17841a92"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.996457 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7f27156a-ca86-4640-8182-8b6b17841a92-config-data-merged\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:31 crc kubenswrapper[4771]: I0219 23:09:31.996661 4771 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f27156a-ca86-4640-8182-8b6b17841a92-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.067733 4771 generic.go:334] "Generic (PLEG): container finished" podID="7f27156a-ca86-4640-8182-8b6b17841a92" containerID="deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f" exitCode=0 Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.067791 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-758bbfd74d-tdr78" event={"ID":"7f27156a-ca86-4640-8182-8b6b17841a92","Type":"ContainerDied","Data":"deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f"} Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.067822 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-758bbfd74d-tdr78" event={"ID":"7f27156a-ca86-4640-8182-8b6b17841a92","Type":"ContainerDied","Data":"2db33486e97d2e519db9d8bb8aa718509fb14a2c82ddaf1e97bdf656f175f08b"} Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.067827 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-758bbfd74d-tdr78" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.067840 4771 scope.go:117] "RemoveContainer" containerID="65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.114669 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-758bbfd74d-tdr78"] Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.116948 4771 scope.go:117] "RemoveContainer" containerID="deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.123809 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-758bbfd74d-tdr78"] Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.212420 4771 scope.go:117] "RemoveContainer" containerID="536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.239831 4771 scope.go:117] "RemoveContainer" containerID="65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488" Feb 19 23:09:32 crc kubenswrapper[4771]: E0219 23:09:32.240270 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488\": container with ID starting with 65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488 not found: ID does not exist" containerID="65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.240302 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488"} err="failed to get container status \"65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488\": rpc error: code = NotFound desc = could not find container 
\"65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488\": container with ID starting with 65064f3e782f7c18966d93b8a42d8abf14e724d86bc737051119c582d0ce8488 not found: ID does not exist" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.240325 4771 scope.go:117] "RemoveContainer" containerID="deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f" Feb 19 23:09:32 crc kubenswrapper[4771]: E0219 23:09:32.240897 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f\": container with ID starting with deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f not found: ID does not exist" containerID="deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.240953 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f"} err="failed to get container status \"deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f\": rpc error: code = NotFound desc = could not find container \"deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f\": container with ID starting with deb866e776f76c27eea7e147d1c819af88f2af36e3d8ebbbd62c5dadf1a4853f not found: ID does not exist" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.240987 4771 scope.go:117] "RemoveContainer" containerID="536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320" Feb 19 23:09:32 crc kubenswrapper[4771]: E0219 23:09:32.241727 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320\": container with ID starting with 536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320 not found: ID does not exist" 
containerID="536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.241750 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320"} err="failed to get container status \"536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320\": rpc error: code = NotFound desc = could not find container \"536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320\": container with ID starting with 536b7d5dddf09c4914a6aabbaee5f72c519706fb58480d0bb49ab7617f0b5320 not found: ID does not exist" Feb 19 23:09:32 crc kubenswrapper[4771]: I0219 23:09:32.462394 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" path="/var/lib/kubelet/pods/7f27156a-ca86-4640-8182-8b6b17841a92/volumes" Feb 19 23:09:33 crc kubenswrapper[4771]: I0219 23:09:33.437482 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:09:33 crc kubenswrapper[4771]: E0219 23:09:33.438295 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:09:36 crc kubenswrapper[4771]: I0219 23:09:36.374510 4771 scope.go:117] "RemoveContainer" containerID="31e8257e5f63678562b42bdd4a7e3ab780e6923204c674c418a909706defdc99" Feb 19 23:09:36 crc kubenswrapper[4771]: I0219 23:09:36.438409 4771 scope.go:117] "RemoveContainer" containerID="125b77871cc050b4732c34ff193c83b80609e19aa429e65486e2c108c80311e4" Feb 19 23:09:45 crc kubenswrapper[4771]: I0219 
23:09:45.438924 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:09:45 crc kubenswrapper[4771]: E0219 23:09:45.440606 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.001475 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-qmzq4"] Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.002415 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" podUID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" containerName="octavia-amphora-httpd" containerID="cri-o://3a5e4590057bef3f082279fb098ba644db7df42f202c2f7bf0fd4dc0a864d10c" gracePeriod=30 Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.367926 4771 generic.go:334] "Generic (PLEG): container finished" podID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" containerID="3a5e4590057bef3f082279fb098ba644db7df42f202c2f7bf0fd4dc0a864d10c" exitCode=0 Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.368212 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" event={"ID":"d49dc2d6-7339-4846-91ff-7b70cc01dbcc","Type":"ContainerDied","Data":"3a5e4590057bef3f082279fb098ba644db7df42f202c2f7bf0fd4dc0a864d10c"} Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.642883 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.723902 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-httpd-config\") pod \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\" (UID: \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\") " Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.724205 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-amphora-image\") pod \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\" (UID: \"d49dc2d6-7339-4846-91ff-7b70cc01dbcc\") " Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.759939 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d49dc2d6-7339-4846-91ff-7b70cc01dbcc" (UID: "d49dc2d6-7339-4846-91ff-7b70cc01dbcc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.824593 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "d49dc2d6-7339-4846-91ff-7b70cc01dbcc" (UID: "d49dc2d6-7339-4846-91ff-7b70cc01dbcc"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.827086 4771 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-amphora-image\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:58 crc kubenswrapper[4771]: I0219 23:09:58.827139 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d49dc2d6-7339-4846-91ff-7b70cc01dbcc-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:59 crc kubenswrapper[4771]: I0219 23:09:59.381923 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" event={"ID":"d49dc2d6-7339-4846-91ff-7b70cc01dbcc","Type":"ContainerDied","Data":"61ccbf1e0ff21648ab321c06417a261444dc2f048a310f32210bfe72a6ae3f90"} Feb 19 23:09:59 crc kubenswrapper[4771]: I0219 23:09:59.381998 4771 scope.go:117] "RemoveContainer" containerID="3a5e4590057bef3f082279fb098ba644db7df42f202c2f7bf0fd4dc0a864d10c" Feb 19 23:09:59 crc kubenswrapper[4771]: I0219 23:09:59.382066 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-qmzq4" Feb 19 23:09:59 crc kubenswrapper[4771]: I0219 23:09:59.431663 4771 scope.go:117] "RemoveContainer" containerID="b439ec6371972f81fc9d0de09f096221921c07ebd2222a259943d61562d63794" Feb 19 23:09:59 crc kubenswrapper[4771]: I0219 23:09:59.441002 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:09:59 crc kubenswrapper[4771]: E0219 23:09:59.441568 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:09:59 crc kubenswrapper[4771]: I0219 23:09:59.452389 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-qmzq4"] Feb 19 23:09:59 crc kubenswrapper[4771]: I0219 23:09:59.463471 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-qmzq4"] Feb 19 23:10:00 crc kubenswrapper[4771]: I0219 23:10:00.458923 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" path="/var/lib/kubelet/pods/d49dc2d6-7339-4846-91ff-7b70cc01dbcc/volumes" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.756418 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-9w77c"] Feb 19 23:10:05 crc kubenswrapper[4771]: E0219 23:10:05.757629 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb667377-e02a-4f75-824e-8ad5f3a5b5a6" containerName="init" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.757647 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cb667377-e02a-4f75-824e-8ad5f3a5b5a6" containerName="init" Feb 19 23:10:05 crc kubenswrapper[4771]: E0219 23:10:05.757671 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" containerName="init" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.757680 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" containerName="init" Feb 19 23:10:05 crc kubenswrapper[4771]: E0219 23:10:05.757697 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="init" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.757705 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="init" Feb 19 23:10:05 crc kubenswrapper[4771]: E0219 23:10:05.757723 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb667377-e02a-4f75-824e-8ad5f3a5b5a6" containerName="octavia-db-sync" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.757731 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb667377-e02a-4f75-824e-8ad5f3a5b5a6" containerName="octavia-db-sync" Feb 19 23:10:05 crc kubenswrapper[4771]: E0219 23:10:05.757754 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.757764 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api" Feb 19 23:10:05 crc kubenswrapper[4771]: E0219 23:10:05.757792 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" containerName="octavia-amphora-httpd" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.757800 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" 
containerName="octavia-amphora-httpd" Feb 19 23:10:05 crc kubenswrapper[4771]: E0219 23:10:05.757814 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api-provider-agent" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.757822 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api-provider-agent" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.758073 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.758086 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f27156a-ca86-4640-8182-8b6b17841a92" containerName="octavia-api-provider-agent" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.758108 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb667377-e02a-4f75-824e-8ad5f3a5b5a6" containerName="octavia-db-sync" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.758129 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d49dc2d6-7339-4846-91ff-7b70cc01dbcc" containerName="octavia-amphora-httpd" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.759333 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-9w77c" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.770166 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.774106 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-9w77c"] Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.884368 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e7c1d8f3-838f-4194-8fa9-ca248b474ca7-amphora-image\") pod \"octavia-image-upload-8d4564f8f-9w77c\" (UID: \"e7c1d8f3-838f-4194-8fa9-ca248b474ca7\") " pod="openstack/octavia-image-upload-8d4564f8f-9w77c" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.884443 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7c1d8f3-838f-4194-8fa9-ca248b474ca7-httpd-config\") pod \"octavia-image-upload-8d4564f8f-9w77c\" (UID: \"e7c1d8f3-838f-4194-8fa9-ca248b474ca7\") " pod="openstack/octavia-image-upload-8d4564f8f-9w77c" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.985885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e7c1d8f3-838f-4194-8fa9-ca248b474ca7-amphora-image\") pod \"octavia-image-upload-8d4564f8f-9w77c\" (UID: \"e7c1d8f3-838f-4194-8fa9-ca248b474ca7\") " pod="openstack/octavia-image-upload-8d4564f8f-9w77c" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.985951 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7c1d8f3-838f-4194-8fa9-ca248b474ca7-httpd-config\") pod \"octavia-image-upload-8d4564f8f-9w77c\" (UID: \"e7c1d8f3-838f-4194-8fa9-ca248b474ca7\") 
" pod="openstack/octavia-image-upload-8d4564f8f-9w77c" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.986376 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/e7c1d8f3-838f-4194-8fa9-ca248b474ca7-amphora-image\") pod \"octavia-image-upload-8d4564f8f-9w77c\" (UID: \"e7c1d8f3-838f-4194-8fa9-ca248b474ca7\") " pod="openstack/octavia-image-upload-8d4564f8f-9w77c" Feb 19 23:10:05 crc kubenswrapper[4771]: I0219 23:10:05.992648 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7c1d8f3-838f-4194-8fa9-ca248b474ca7-httpd-config\") pod \"octavia-image-upload-8d4564f8f-9w77c\" (UID: \"e7c1d8f3-838f-4194-8fa9-ca248b474ca7\") " pod="openstack/octavia-image-upload-8d4564f8f-9w77c" Feb 19 23:10:06 crc kubenswrapper[4771]: I0219 23:10:06.095317 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-9w77c" Feb 19 23:10:06 crc kubenswrapper[4771]: I0219 23:10:06.595950 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-9w77c"] Feb 19 23:10:07 crc kubenswrapper[4771]: I0219 23:10:07.464028 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-9w77c" event={"ID":"e7c1d8f3-838f-4194-8fa9-ca248b474ca7","Type":"ContainerStarted","Data":"33106c6766b8f3821d4a82d48284b1c7a82c1188875fe48d76bab6573c0f4c75"} Feb 19 23:10:07 crc kubenswrapper[4771]: I0219 23:10:07.464493 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-9w77c" event={"ID":"e7c1d8f3-838f-4194-8fa9-ca248b474ca7","Type":"ContainerStarted","Data":"21ba45b53467c7d12cd5f5e8093b83b467ec067ec542bf20d583e96aaae4f49d"} Feb 19 23:10:08 crc kubenswrapper[4771]: I0219 23:10:08.475433 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="e7c1d8f3-838f-4194-8fa9-ca248b474ca7" containerID="33106c6766b8f3821d4a82d48284b1c7a82c1188875fe48d76bab6573c0f4c75" exitCode=0 Feb 19 23:10:08 crc kubenswrapper[4771]: I0219 23:10:08.475585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-9w77c" event={"ID":"e7c1d8f3-838f-4194-8fa9-ca248b474ca7","Type":"ContainerDied","Data":"33106c6766b8f3821d4a82d48284b1c7a82c1188875fe48d76bab6573c0f4c75"} Feb 19 23:10:09 crc kubenswrapper[4771]: I0219 23:10:09.487245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-9w77c" event={"ID":"e7c1d8f3-838f-4194-8fa9-ca248b474ca7","Type":"ContainerStarted","Data":"b5d5b9221d8fb15d42ca4c0e2fe19d89bce4e5c1004f2c6e639d5545770b20cc"} Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.437389 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:10:11 crc kubenswrapper[4771]: E0219 23:10:11.438081 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.602427 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-8d4564f8f-9w77c" podStartSLOduration=6.17458162 podStartE2EDuration="6.602408775s" podCreationTimestamp="2026-02-19 23:10:05 +0000 UTC" firstStartedPulling="2026-02-19 23:10:06.60013469 +0000 UTC m=+6106.871577160" lastFinishedPulling="2026-02-19 23:10:07.027961835 +0000 UTC m=+6107.299404315" observedRunningTime="2026-02-19 23:10:09.505601607 +0000 UTC m=+6109.777044077" 
watchObservedRunningTime="2026-02-19 23:10:11.602408775 +0000 UTC m=+6111.873851255" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.607268 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-jrzlw"] Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.609078 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.610910 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.612919 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.612985 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.625193 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jrzlw"] Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.718108 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-combined-ca-bundle\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.718157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8cb35f59-020c-40c3-b6a8-b03124ee0b08-hm-ports\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.718199 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-scripts\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.718457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-amphora-certs\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.718556 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-config-data\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.718700 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8cb35f59-020c-40c3-b6a8-b03124ee0b08-config-data-merged\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.820332 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-scripts\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.820469 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-amphora-certs\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.820537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-config-data\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.820596 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8cb35f59-020c-40c3-b6a8-b03124ee0b08-config-data-merged\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.820661 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-combined-ca-bundle\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.820747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8cb35f59-020c-40c3-b6a8-b03124ee0b08-hm-ports\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.821365 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/8cb35f59-020c-40c3-b6a8-b03124ee0b08-config-data-merged\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.823090 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8cb35f59-020c-40c3-b6a8-b03124ee0b08-hm-ports\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.829558 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-combined-ca-bundle\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.830407 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-amphora-certs\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.837564 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-config-data\") pod \"octavia-healthmanager-jrzlw\" (UID: \"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.839191 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb35f59-020c-40c3-b6a8-b03124ee0b08-scripts\") pod \"octavia-healthmanager-jrzlw\" (UID: 
\"8cb35f59-020c-40c3-b6a8-b03124ee0b08\") " pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:11 crc kubenswrapper[4771]: I0219 23:10:11.939124 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:12 crc kubenswrapper[4771]: W0219 23:10:12.536116 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cb35f59_020c_40c3_b6a8_b03124ee0b08.slice/crio-cab3e67dd289cc533cb5e440d382731b35f09461cbf72e0262737b6015f847a2 WatchSource:0}: Error finding container cab3e67dd289cc533cb5e440d382731b35f09461cbf72e0262737b6015f847a2: Status 404 returned error can't find the container with id cab3e67dd289cc533cb5e440d382731b35f09461cbf72e0262737b6015f847a2 Feb 19 23:10:12 crc kubenswrapper[4771]: I0219 23:10:12.538895 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jrzlw"] Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.396539 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-gxv88"] Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.398969 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.400773 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.403357 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.407076 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-gxv88"] Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.530382 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jrzlw" event={"ID":"8cb35f59-020c-40c3-b6a8-b03124ee0b08","Type":"ContainerStarted","Data":"78675f39222e7c1e8261e3d10ccdb8f6f5cc331d0531e760a01c487e751675f8"} Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.530627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jrzlw" event={"ID":"8cb35f59-020c-40c3-b6a8-b03124ee0b08","Type":"ContainerStarted","Data":"cab3e67dd289cc533cb5e440d382731b35f09461cbf72e0262737b6015f847a2"} Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.553435 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-combined-ca-bundle\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.553614 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-scripts\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " 
pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.553656 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-amphora-certs\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.553675 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/589851d7-1ce4-40a4-ad6a-96fb46673693-config-data-merged\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.553709 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/589851d7-1ce4-40a4-ad6a-96fb46673693-hm-ports\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.554165 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-config-data\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.656965 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-scripts\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " 
pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.657092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-amphora-certs\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.657136 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/589851d7-1ce4-40a4-ad6a-96fb46673693-config-data-merged\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.657177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/589851d7-1ce4-40a4-ad6a-96fb46673693-hm-ports\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.657289 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-config-data\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.657419 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-combined-ca-bundle\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: 
I0219 23:10:13.658281 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/589851d7-1ce4-40a4-ad6a-96fb46673693-config-data-merged\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.658982 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/589851d7-1ce4-40a4-ad6a-96fb46673693-hm-ports\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.668913 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-config-data\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.669380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-combined-ca-bundle\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.669561 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-amphora-certs\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.670205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/589851d7-1ce4-40a4-ad6a-96fb46673693-scripts\") pod \"octavia-housekeeping-gxv88\" (UID: \"589851d7-1ce4-40a4-ad6a-96fb46673693\") " pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:13 crc kubenswrapper[4771]: I0219 23:10:13.728935 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:14 crc kubenswrapper[4771]: I0219 23:10:14.308853 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-gxv88"] Feb 19 23:10:14 crc kubenswrapper[4771]: I0219 23:10:14.578435 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-gxv88" event={"ID":"589851d7-1ce4-40a4-ad6a-96fb46673693","Type":"ContainerStarted","Data":"93c3502e2a413f2a7c3f198b6705ad6d085e48032cc9051ca1012c7f8a9760e1"} Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.597893 4771 generic.go:334] "Generic (PLEG): container finished" podID="8cb35f59-020c-40c3-b6a8-b03124ee0b08" containerID="78675f39222e7c1e8261e3d10ccdb8f6f5cc331d0531e760a01c487e751675f8" exitCode=0 Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.598791 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jrzlw" event={"ID":"8cb35f59-020c-40c3-b6a8-b03124ee0b08","Type":"ContainerDied","Data":"78675f39222e7c1e8261e3d10ccdb8f6f5cc331d0531e760a01c487e751675f8"} Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.786573 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-zvjfp"] Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.793293 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.796891 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.797571 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.809519 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-zvjfp"] Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.938692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-config-data\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.938797 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d1f51bd5-34dd-4d08-8427-71f91be4b299-config-data-merged\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.938878 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-amphora-certs\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.939110 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-combined-ca-bundle\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.939250 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d1f51bd5-34dd-4d08-8427-71f91be4b299-hm-ports\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:15 crc kubenswrapper[4771]: I0219 23:10:15.939424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-scripts\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.041131 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-config-data\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.041622 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d1f51bd5-34dd-4d08-8427-71f91be4b299-config-data-merged\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.041686 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-amphora-certs\") pod \"octavia-worker-zvjfp\" (UID: 
\"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.041741 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-combined-ca-bundle\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.041784 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d1f51bd5-34dd-4d08-8427-71f91be4b299-hm-ports\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.041830 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-scripts\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.042393 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d1f51bd5-34dd-4d08-8427-71f91be4b299-config-data-merged\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.042878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d1f51bd5-34dd-4d08-8427-71f91be4b299-hm-ports\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.048239 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-amphora-certs\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.049050 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-scripts\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.049577 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-combined-ca-bundle\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.051137 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1f51bd5-34dd-4d08-8427-71f91be4b299-config-data\") pod \"octavia-worker-zvjfp\" (UID: \"d1f51bd5-34dd-4d08-8427-71f91be4b299\") " pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.127962 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.627476 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jrzlw" event={"ID":"8cb35f59-020c-40c3-b6a8-b03124ee0b08","Type":"ContainerStarted","Data":"6a0fb1b416fe05af288cf9ca2cc064f61156d23ae457cac9533ace5989369bac"} Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.629265 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.660945 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-jrzlw" podStartSLOduration=5.660926449 podStartE2EDuration="5.660926449s" podCreationTimestamp="2026-02-19 23:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:10:16.648875207 +0000 UTC m=+6116.920317677" watchObservedRunningTime="2026-02-19 23:10:16.660926449 +0000 UTC m=+6116.932368919" Feb 19 23:10:16 crc kubenswrapper[4771]: I0219 23:10:16.909706 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-zvjfp"] Feb 19 23:10:16 crc kubenswrapper[4771]: W0219 23:10:16.926752 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1f51bd5_34dd_4d08_8427_71f91be4b299.slice/crio-0898a23894d96e2d5d40fea83dad43bd9e51dc1591659a7ea29f8940ec044407 WatchSource:0}: Error finding container 0898a23894d96e2d5d40fea83dad43bd9e51dc1591659a7ea29f8940ec044407: Status 404 returned error can't find the container with id 0898a23894d96e2d5d40fea83dad43bd9e51dc1591659a7ea29f8940ec044407 Feb 19 23:10:17 crc kubenswrapper[4771]: I0219 23:10:17.303662 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jrzlw"] Feb 19 23:10:17 
crc kubenswrapper[4771]: I0219 23:10:17.640309 4771 generic.go:334] "Generic (PLEG): container finished" podID="589851d7-1ce4-40a4-ad6a-96fb46673693" containerID="60229e0b90fc241405066f32a0c1cedaef0d240e644bfcd0327fe43004f64f54" exitCode=0 Feb 19 23:10:17 crc kubenswrapper[4771]: I0219 23:10:17.640371 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-gxv88" event={"ID":"589851d7-1ce4-40a4-ad6a-96fb46673693","Type":"ContainerDied","Data":"60229e0b90fc241405066f32a0c1cedaef0d240e644bfcd0327fe43004f64f54"} Feb 19 23:10:17 crc kubenswrapper[4771]: I0219 23:10:17.642901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-zvjfp" event={"ID":"d1f51bd5-34dd-4d08-8427-71f91be4b299","Type":"ContainerStarted","Data":"0898a23894d96e2d5d40fea83dad43bd9e51dc1591659a7ea29f8940ec044407"} Feb 19 23:10:18 crc kubenswrapper[4771]: I0219 23:10:18.657512 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-gxv88" event={"ID":"589851d7-1ce4-40a4-ad6a-96fb46673693","Type":"ContainerStarted","Data":"2122484f35dd52e7fe1924a7b8d265255b5eb552ec702c4258420cd50caaf184"} Feb 19 23:10:18 crc kubenswrapper[4771]: I0219 23:10:18.657933 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:18 crc kubenswrapper[4771]: I0219 23:10:18.685456 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-gxv88" podStartSLOduration=3.656035218 podStartE2EDuration="5.68543751s" podCreationTimestamp="2026-02-19 23:10:13 +0000 UTC" firstStartedPulling="2026-02-19 23:10:14.313448478 +0000 UTC m=+6114.584890938" lastFinishedPulling="2026-02-19 23:10:16.34285076 +0000 UTC m=+6116.614293230" observedRunningTime="2026-02-19 23:10:18.680561981 +0000 UTC m=+6118.952004461" watchObservedRunningTime="2026-02-19 23:10:18.68543751 +0000 UTC m=+6118.956879980" Feb 19 23:10:19 crc 
kubenswrapper[4771]: I0219 23:10:19.676492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-zvjfp" event={"ID":"d1f51bd5-34dd-4d08-8427-71f91be4b299","Type":"ContainerStarted","Data":"2176f5e15f3ad357775e6983609e1e780345838891cb6b82ea4ec58ccecc3f64"} Feb 19 23:10:21 crc kubenswrapper[4771]: I0219 23:10:21.694345 4771 generic.go:334] "Generic (PLEG): container finished" podID="d1f51bd5-34dd-4d08-8427-71f91be4b299" containerID="2176f5e15f3ad357775e6983609e1e780345838891cb6b82ea4ec58ccecc3f64" exitCode=0 Feb 19 23:10:21 crc kubenswrapper[4771]: I0219 23:10:21.694615 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-zvjfp" event={"ID":"d1f51bd5-34dd-4d08-8427-71f91be4b299","Type":"ContainerDied","Data":"2176f5e15f3ad357775e6983609e1e780345838891cb6b82ea4ec58ccecc3f64"} Feb 19 23:10:22 crc kubenswrapper[4771]: I0219 23:10:22.753096 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-zvjfp" event={"ID":"d1f51bd5-34dd-4d08-8427-71f91be4b299","Type":"ContainerStarted","Data":"3602a27e0967502c6262a6da9bfe7d364ca391b7a6df44f6bc695dd172adc04b"} Feb 19 23:10:22 crc kubenswrapper[4771]: I0219 23:10:22.755243 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:22 crc kubenswrapper[4771]: I0219 23:10:22.795193 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-zvjfp" podStartSLOduration=6.233884619 podStartE2EDuration="7.795172041s" podCreationTimestamp="2026-02-19 23:10:15 +0000 UTC" firstStartedPulling="2026-02-19 23:10:16.932599882 +0000 UTC m=+6117.204042352" lastFinishedPulling="2026-02-19 23:10:18.493887304 +0000 UTC m=+6118.765329774" observedRunningTime="2026-02-19 23:10:22.784364713 +0000 UTC m=+6123.055807183" watchObservedRunningTime="2026-02-19 23:10:22.795172041 +0000 UTC m=+6123.066614521" Feb 19 23:10:25 crc kubenswrapper[4771]: I0219 
23:10:25.437906 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:10:25 crc kubenswrapper[4771]: E0219 23:10:25.438833 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:10:26 crc kubenswrapper[4771]: I0219 23:10:26.977124 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-jrzlw" Feb 19 23:10:28 crc kubenswrapper[4771]: I0219 23:10:28.763815 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-gxv88" Feb 19 23:10:31 crc kubenswrapper[4771]: I0219 23:10:31.188229 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-zvjfp" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.797672 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6687bb7c6f-hz6w4"] Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.801788 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.807588 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.807927 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.809247 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-6976z" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.809565 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.821716 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6687bb7c6f-hz6w4"] Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.867290 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.867542 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" containerName="glance-log" containerID="cri-o://6f7d022613a30d88077579c03072b8654bba98d034987ddb1f3088dd192c6207" gracePeriod=30 Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.867713 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" containerName="glance-httpd" containerID="cri-o://593d8695bb5aa300d9674870adefd18c581bed197f907d9490c5f404c479c207" gracePeriod=30 Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.936578 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59dc7576bf-f4hbt"] Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.936864 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-scripts\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.936917 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/598eaa33-32e4-411a-b4bc-f0df6cde32d3-horizon-secret-key\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.937185 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-config-data\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.937318 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p7bc\" (UniqueName: \"kubernetes.io/projected/598eaa33-32e4-411a-b4bc-f0df6cde32d3-kube-api-access-8p7bc\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.937348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598eaa33-32e4-411a-b4bc-f0df6cde32d3-logs\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.938475 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:38 crc kubenswrapper[4771]: I0219 23:10:38.965597 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59dc7576bf-f4hbt"] Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:38.995207 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:38.995680 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerName="glance-log" containerID="cri-o://7e020a5d4eebb9762b908dff98fecc39ab9b009269a2b6f1e6276a4bbe8e7d9a" gracePeriod=30 Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:38.996200 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerName="glance-httpd" containerID="cri-o://1858a869a1a9a41d2b43873ff0e0f5f9883fdf7c5781320a478d703445d60790" gracePeriod=30 Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.042317 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggkfv\" (UniqueName: \"kubernetes.io/projected/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-kube-api-access-ggkfv\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.042519 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-horizon-secret-key\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc 
kubenswrapper[4771]: I0219 23:10:39.042634 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-logs\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.042722 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-config-data\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.042894 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-scripts\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.042984 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/598eaa33-32e4-411a-b4bc-f0df6cde32d3-horizon-secret-key\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.043153 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-scripts\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.043296 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-config-data\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.043794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-scripts\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.044108 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p7bc\" (UniqueName: \"kubernetes.io/projected/598eaa33-32e4-411a-b4bc-f0df6cde32d3-kube-api-access-8p7bc\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.044352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598eaa33-32e4-411a-b4bc-f0df6cde32d3-logs\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.044956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598eaa33-32e4-411a-b4bc-f0df6cde32d3-logs\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.045750 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-config-data\") pod 
\"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.051733 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/598eaa33-32e4-411a-b4bc-f0df6cde32d3-horizon-secret-key\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.060290 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p7bc\" (UniqueName: \"kubernetes.io/projected/598eaa33-32e4-411a-b4bc-f0df6cde32d3-kube-api-access-8p7bc\") pod \"horizon-6687bb7c6f-hz6w4\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.135528 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.146312 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggkfv\" (UniqueName: \"kubernetes.io/projected/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-kube-api-access-ggkfv\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.146480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-horizon-secret-key\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.146579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-logs\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.146652 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-config-data\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.146736 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-scripts\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.147517 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-scripts\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.147654 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-logs\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.148572 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-config-data\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " 
pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.149822 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-horizon-secret-key\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.163972 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggkfv\" (UniqueName: \"kubernetes.io/projected/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-kube-api-access-ggkfv\") pod \"horizon-59dc7576bf-f4hbt\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.201432 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.613180 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59dc7576bf-f4hbt"] Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.637082 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6687bb7c6f-hz6w4"] Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.955729 4771 generic.go:334] "Generic (PLEG): container finished" podID="8ed1dabf-d278-4999-bff5-2782672d6781" containerID="6f7d022613a30d88077579c03072b8654bba98d034987ddb1f3088dd192c6207" exitCode=143 Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.955907 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ed1dabf-d278-4999-bff5-2782672d6781","Type":"ContainerDied","Data":"6f7d022613a30d88077579c03072b8654bba98d034987ddb1f3088dd192c6207"} Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.958577 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerID="7e020a5d4eebb9762b908dff98fecc39ab9b009269a2b6f1e6276a4bbe8e7d9a" exitCode=143 Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.958638 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c","Type":"ContainerDied","Data":"7e020a5d4eebb9762b908dff98fecc39ab9b009269a2b6f1e6276a4bbe8e7d9a"} Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.959952 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59dc7576bf-f4hbt" event={"ID":"478e48a8-a429-4d81-aff7-6dbc6f6b9f54","Type":"ContainerStarted","Data":"9d0f1022c1c7ad914e2b88b6177c8eedec41b57dd26ba8615b1e4dc3741cfb4c"} Feb 19 23:10:39 crc kubenswrapper[4771]: I0219 23:10:39.961958 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687bb7c6f-hz6w4" event={"ID":"598eaa33-32e4-411a-b4bc-f0df6cde32d3","Type":"ContainerStarted","Data":"928e022933fcefe0f9435c2c328e1a510746ec2eb4989a4ccf0e50c4a4f9e833"} Feb 19 23:10:40 crc kubenswrapper[4771]: I0219 23:10:40.440804 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:10:40 crc kubenswrapper[4771]: E0219 23:10:40.441154 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:10:40 crc kubenswrapper[4771]: I0219 23:10:40.966313 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59dc7576bf-f4hbt"] Feb 19 23:10:40 crc kubenswrapper[4771]: I0219 23:10:40.977961 4771 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/horizon-67b65bd49b-2w988"] Feb 19 23:10:40 crc kubenswrapper[4771]: I0219 23:10:40.983835 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:40 crc kubenswrapper[4771]: I0219 23:10:40.987247 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 19 23:10:40 crc kubenswrapper[4771]: I0219 23:10:40.995500 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67b65bd49b-2w988"] Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.019687 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-tls-certs\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.019783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-config-data\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.019805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-combined-ca-bundle\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.019965 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4ktm\" (UniqueName: 
\"kubernetes.io/projected/26d9927e-39c9-4bd7-b426-b639c188da29-kube-api-access-p4ktm\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.020039 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26d9927e-39c9-4bd7-b426-b639c188da29-logs\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.020127 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-scripts\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.020260 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-secret-key\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.074665 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6687bb7c6f-hz6w4"] Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.109957 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f449ffc58-sz7vv"] Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.111647 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122705 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-scripts\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122772 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-config-data\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-combined-ca-bundle\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-combined-ca-bundle\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122856 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-secret-key\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " 
pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122875 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-config-data\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122922 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ktm\" (UniqueName: \"kubernetes.io/projected/26d9927e-39c9-4bd7-b426-b639c188da29-kube-api-access-p4ktm\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122948 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26d9927e-39c9-4bd7-b426-b639c188da29-logs\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.122974 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-scripts\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.123010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-tls-certs\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: 
I0219 23:10:41.123044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d25b3d-46b9-4605-8c5d-4fe736c63322-logs\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.123073 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-secret-key\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.123112 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nztsc\" (UniqueName: \"kubernetes.io/projected/46d25b3d-46b9-4605-8c5d-4fe736c63322-kube-api-access-nztsc\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.123181 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-tls-certs\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.126522 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-config-data\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.126773 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26d9927e-39c9-4bd7-b426-b639c188da29-logs\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.127185 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-scripts\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.127290 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f449ffc58-sz7vv"] Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.132641 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-secret-key\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.132713 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-combined-ca-bundle\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.139439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-tls-certs\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.149717 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4ktm\" (UniqueName: \"kubernetes.io/projected/26d9927e-39c9-4bd7-b426-b639c188da29-kube-api-access-p4ktm\") pod \"horizon-67b65bd49b-2w988\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.224659 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nztsc\" (UniqueName: \"kubernetes.io/projected/46d25b3d-46b9-4605-8c5d-4fe736c63322-kube-api-access-nztsc\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.224758 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-scripts\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.225441 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-combined-ca-bundle\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.225481 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-secret-key\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.225489 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-scripts\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.225499 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-config-data\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.225573 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-tls-certs\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.225592 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d25b3d-46b9-4605-8c5d-4fe736c63322-logs\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.226007 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d25b3d-46b9-4605-8c5d-4fe736c63322-logs\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.226969 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-config-data\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " 
pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.229406 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-combined-ca-bundle\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.229805 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-tls-certs\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.240510 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-secret-key\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.245903 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nztsc\" (UniqueName: \"kubernetes.io/projected/46d25b3d-46b9-4605-8c5d-4fe736c63322-kube-api-access-nztsc\") pod \"horizon-f449ffc58-sz7vv\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.320575 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.434348 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.808694 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67b65bd49b-2w988"] Feb 19 23:10:41 crc kubenswrapper[4771]: W0219 23:10:41.808884 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d9927e_39c9_4bd7_b426_b639c188da29.slice/crio-e436860c5ddb013bdc45a7ea098dec1ddc1ab2b2e0f789507aa7d14cbaf02b2b WatchSource:0}: Error finding container e436860c5ddb013bdc45a7ea098dec1ddc1ab2b2e0f789507aa7d14cbaf02b2b: Status 404 returned error can't find the container with id e436860c5ddb013bdc45a7ea098dec1ddc1ab2b2e0f789507aa7d14cbaf02b2b Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.945780 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f449ffc58-sz7vv"] Feb 19 23:10:41 crc kubenswrapper[4771]: W0219 23:10:41.955855 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46d25b3d_46b9_4605_8c5d_4fe736c63322.slice/crio-c75999ad8ae249c69e762d59901f594d4c477ece290c3d29099f58ee6789d0fe WatchSource:0}: Error finding container c75999ad8ae249c69e762d59901f594d4c477ece290c3d29099f58ee6789d0fe: Status 404 returned error can't find the container with id c75999ad8ae249c69e762d59901f594d4c477ece290c3d29099f58ee6789d0fe Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.984414 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f449ffc58-sz7vv" event={"ID":"46d25b3d-46b9-4605-8c5d-4fe736c63322","Type":"ContainerStarted","Data":"c75999ad8ae249c69e762d59901f594d4c477ece290c3d29099f58ee6789d0fe"} Feb 19 23:10:41 crc kubenswrapper[4771]: I0219 23:10:41.985585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b65bd49b-2w988" 
event={"ID":"26d9927e-39c9-4bd7-b426-b639c188da29","Type":"ContainerStarted","Data":"e436860c5ddb013bdc45a7ea098dec1ddc1ab2b2e0f789507aa7d14cbaf02b2b"} Feb 19 23:10:42 crc kubenswrapper[4771]: I0219 23:10:42.997562 4771 generic.go:334] "Generic (PLEG): container finished" podID="8ed1dabf-d278-4999-bff5-2782672d6781" containerID="593d8695bb5aa300d9674870adefd18c581bed197f907d9490c5f404c479c207" exitCode=0 Feb 19 23:10:42 crc kubenswrapper[4771]: I0219 23:10:42.997635 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ed1dabf-d278-4999-bff5-2782672d6781","Type":"ContainerDied","Data":"593d8695bb5aa300d9674870adefd18c581bed197f907d9490c5f404c479c207"} Feb 19 23:10:43 crc kubenswrapper[4771]: I0219 23:10:43.001437 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerID="1858a869a1a9a41d2b43873ff0e0f5f9883fdf7c5781320a478d703445d60790" exitCode=0 Feb 19 23:10:43 crc kubenswrapper[4771]: I0219 23:10:43.001470 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c","Type":"ContainerDied","Data":"1858a869a1a9a41d2b43873ff0e0f5f9883fdf7c5781320a478d703445d60790"} Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.617560 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.623110 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.689857 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-httpd-run\") pod \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690352 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-combined-ca-bundle\") pod \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690392 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-config-data\") pod \"8ed1dabf-d278-4999-bff5-2782672d6781\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690437 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-logs\") pod \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690485 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-scripts\") pod \"8ed1dabf-d278-4999-bff5-2782672d6781\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690665 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-combined-ca-bundle\") pod \"8ed1dabf-d278-4999-bff5-2782672d6781\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690762 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-httpd-run\") pod \"8ed1dabf-d278-4999-bff5-2782672d6781\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690841 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-public-tls-certs\") pod \"8ed1dabf-d278-4999-bff5-2782672d6781\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690903 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-config-data\") pod \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.690950 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-logs\") pod \"8ed1dabf-d278-4999-bff5-2782672d6781\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.691083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwfct\" (UniqueName: \"kubernetes.io/projected/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-kube-api-access-bwfct\") pod \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.691181 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-scripts\") pod \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.691268 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-internal-tls-certs\") pod \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\" (UID: \"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.691310 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rbsw\" (UniqueName: \"kubernetes.io/projected/8ed1dabf-d278-4999-bff5-2782672d6781-kube-api-access-6rbsw\") pod \"8ed1dabf-d278-4999-bff5-2782672d6781\" (UID: \"8ed1dabf-d278-4999-bff5-2782672d6781\") " Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.699503 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ed1dabf-d278-4999-bff5-2782672d6781" (UID: "8ed1dabf-d278-4999-bff5-2782672d6781"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.699809 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" (UID: "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.701865 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-logs" (OuterVolumeSpecName: "logs") pod "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" (UID: "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.703102 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-logs" (OuterVolumeSpecName: "logs") pod "8ed1dabf-d278-4999-bff5-2782672d6781" (UID: "8ed1dabf-d278-4999-bff5-2782672d6781"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.709270 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-scripts" (OuterVolumeSpecName: "scripts") pod "8ed1dabf-d278-4999-bff5-2782672d6781" (UID: "8ed1dabf-d278-4999-bff5-2782672d6781"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.721830 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-kube-api-access-bwfct" (OuterVolumeSpecName: "kube-api-access-bwfct") pod "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" (UID: "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c"). InnerVolumeSpecName "kube-api-access-bwfct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.721848 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-scripts" (OuterVolumeSpecName: "scripts") pod "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" (UID: "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.746131 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed1dabf-d278-4999-bff5-2782672d6781-kube-api-access-6rbsw" (OuterVolumeSpecName: "kube-api-access-6rbsw") pod "8ed1dabf-d278-4999-bff5-2782672d6781" (UID: "8ed1dabf-d278-4999-bff5-2782672d6781"). InnerVolumeSpecName "kube-api-access-6rbsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.779386 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ed1dabf-d278-4999-bff5-2782672d6781" (UID: "8ed1dabf-d278-4999-bff5-2782672d6781"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.781152 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" (UID: "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794007 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwfct\" (UniqueName: \"kubernetes.io/projected/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-kube-api-access-bwfct\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794528 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794641 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rbsw\" (UniqueName: \"kubernetes.io/projected/8ed1dabf-d278-4999-bff5-2782672d6781-kube-api-access-6rbsw\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794752 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794769 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794779 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-logs\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794790 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794800 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794808 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.794817 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed1dabf-d278-4999-bff5-2782672d6781-logs\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.823159 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8ed1dabf-d278-4999-bff5-2782672d6781" (UID: "8ed1dabf-d278-4999-bff5-2782672d6781"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.828444 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-config-data" (OuterVolumeSpecName: "config-data") pod "8ed1dabf-d278-4999-bff5-2782672d6781" (UID: "8ed1dabf-d278-4999-bff5-2782672d6781"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.849298 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-config-data" (OuterVolumeSpecName: "config-data") pod "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" (UID: "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.864319 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" (UID: "b3ff24aa-eeb9-483c-9e20-0fe5949bb10c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.896968 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.897000 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.897010 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:47 crc kubenswrapper[4771]: I0219 23:10:47.897033 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed1dabf-d278-4999-bff5-2782672d6781-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.050353 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f449ffc58-sz7vv" event={"ID":"46d25b3d-46b9-4605-8c5d-4fe736c63322","Type":"ContainerStarted","Data":"477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9"}
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.054199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b65bd49b-2w988" event={"ID":"26d9927e-39c9-4bd7-b426-b639c188da29","Type":"ContainerStarted","Data":"dd48c29a601dba4deff2490fc19867ca5e31ed1cac63daf877fdf70dae8de0e3"}
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.056411 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ed1dabf-d278-4999-bff5-2782672d6781","Type":"ContainerDied","Data":"bc7dd7f9b9a3ef0f6f46491d4251e3c15ed3acdbac4f7d111befaa87696de1fa"}
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.056439 4771 scope.go:117] "RemoveContainer" containerID="593d8695bb5aa300d9674870adefd18c581bed197f907d9490c5f404c479c207"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.056469 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.061433 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.061438 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b3ff24aa-eeb9-483c-9e20-0fe5949bb10c","Type":"ContainerDied","Data":"ab3854ff8bef1e06d55bbe6ed44a2685abc146b12135e4e8e49bd47b8fac6aa3"}
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.063688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59dc7576bf-f4hbt" event={"ID":"478e48a8-a429-4d81-aff7-6dbc6f6b9f54","Type":"ContainerStarted","Data":"35f7f21d96ad8d5f77987c00207320f0852215d11240c000c471bdf3c697c1a2"}
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.065286 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687bb7c6f-hz6w4" event={"ID":"598eaa33-32e4-411a-b4bc-f0df6cde32d3","Type":"ContainerStarted","Data":"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8"}
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.096119 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.103922 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.131782 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.133736 4771 scope.go:117] "RemoveContainer" containerID="6f7d022613a30d88077579c03072b8654bba98d034987ddb1f3088dd192c6207"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.143323 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 23:10:48 crc kubenswrapper[4771]: E0219 23:10:48.143828 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" containerName="glance-httpd"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.143848 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" containerName="glance-httpd"
Feb 19 23:10:48 crc kubenswrapper[4771]: E0219 23:10:48.143868 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" containerName="glance-log"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.143879 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" containerName="glance-log"
Feb 19 23:10:48 crc kubenswrapper[4771]: E0219 23:10:48.143912 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerName="glance-log"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.143920 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerName="glance-log"
Feb 19 23:10:48 crc kubenswrapper[4771]: E0219 23:10:48.143949 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerName="glance-httpd"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.143958 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerName="glance-httpd"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.144229 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerName="glance-log"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.144258 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" containerName="glance-httpd"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.144273 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" containerName="glance-log"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.144285 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" containerName="glance-httpd"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.145604 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.150220 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.154125 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.157413 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.157728 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.158053 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2qwhw"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.166370 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.175420 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.177244 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.184090 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.184136 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.188489 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204193 4771 scope.go:117] "RemoveContainer" containerID="1858a869a1a9a41d2b43873ff0e0f5f9883fdf7c5781320a478d703445d60790"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204371 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7523e0-9170-4d3a-951b-2aae16fb47d3-logs\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204559 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204609 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef8da599-c761-483b-8e1f-0c39e63b7476-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204669 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pk2b\" (UniqueName: \"kubernetes.io/projected/4f7523e0-9170-4d3a-951b-2aae16fb47d3-kube-api-access-5pk2b\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204702 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204735 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdndl\" (UniqueName: \"kubernetes.io/projected/ef8da599-c761-483b-8e1f-0c39e63b7476-kube-api-access-zdndl\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204780 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204807 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.204838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f7523e0-9170-4d3a-951b-2aae16fb47d3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.205145 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef8da599-c761-483b-8e1f-0c39e63b7476-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.205191 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.205272 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.262360 4771 scope.go:117] "RemoveContainer" containerID="7e020a5d4eebb9762b908dff98fecc39ab9b009269a2b6f1e6276a4bbe8e7d9a"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.306762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.306806 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.306827 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7523e0-9170-4d3a-951b-2aae16fb47d3-logs\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.306875 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.306891 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.306927 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef8da599-c761-483b-8e1f-0c39e63b7476-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.306962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pk2b\" (UniqueName: \"kubernetes.io/projected/4f7523e0-9170-4d3a-951b-2aae16fb47d3-kube-api-access-5pk2b\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.306985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.307009 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdndl\" (UniqueName: \"kubernetes.io/projected/ef8da599-c761-483b-8e1f-0c39e63b7476-kube-api-access-zdndl\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.307050 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.307066 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.307466 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f7523e0-9170-4d3a-951b-2aae16fb47d3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.307508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f7523e0-9170-4d3a-951b-2aae16fb47d3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.307548 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef8da599-c761-483b-8e1f-0c39e63b7476-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.307569 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.308511 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef8da599-c761-483b-8e1f-0c39e63b7476-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.308816 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7523e0-9170-4d3a-951b-2aae16fb47d3-logs\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.308878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef8da599-c761-483b-8e1f-0c39e63b7476-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.312743 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.313034 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.313472 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.313774 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.314793 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.323008 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef8da599-c761-483b-8e1f-0c39e63b7476-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.324251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.325258 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pk2b\" (UniqueName: \"kubernetes.io/projected/4f7523e0-9170-4d3a-951b-2aae16fb47d3-kube-api-access-5pk2b\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.329100 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7523e0-9170-4d3a-951b-2aae16fb47d3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f7523e0-9170-4d3a-951b-2aae16fb47d3\") " pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.329860 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdndl\" (UniqueName: \"kubernetes.io/projected/ef8da599-c761-483b-8e1f-0c39e63b7476-kube-api-access-zdndl\") pod \"glance-default-internal-api-0\" (UID: \"ef8da599-c761-483b-8e1f-0c39e63b7476\") " pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.447186 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed1dabf-d278-4999-bff5-2782672d6781" path="/var/lib/kubelet/pods/8ed1dabf-d278-4999-bff5-2782672d6781/volumes"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.447862 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ff24aa-eeb9-483c-9e20-0fe5949bb10c" path="/var/lib/kubelet/pods/b3ff24aa-eeb9-483c-9e20-0fe5949bb10c/volumes"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.489996 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 23:10:48 crc kubenswrapper[4771]: I0219 23:10:48.498508 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.074688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687bb7c6f-hz6w4" event={"ID":"598eaa33-32e4-411a-b4bc-f0df6cde32d3","Type":"ContainerStarted","Data":"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f"}
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.074761 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6687bb7c6f-hz6w4" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerName="horizon" containerID="cri-o://569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f" gracePeriod=30
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.074757 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6687bb7c6f-hz6w4" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerName="horizon-log" containerID="cri-o://25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8" gracePeriod=30
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.079626 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.080356 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f449ffc58-sz7vv" event={"ID":"46d25b3d-46b9-4605-8c5d-4fe736c63322","Type":"ContainerStarted","Data":"3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea"}
Feb 19 23:10:49 crc kubenswrapper[4771]: W0219 23:10:49.081341 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f7523e0_9170_4d3a_951b_2aae16fb47d3.slice/crio-70915eff15977a7d92672e516154f94dcf7bd3724dc4d4ce3dde0b004c2136b0 WatchSource:0}: Error finding container 70915eff15977a7d92672e516154f94dcf7bd3724dc4d4ce3dde0b004c2136b0: Status 404 returned error can't find the container with id 70915eff15977a7d92672e516154f94dcf7bd3724dc4d4ce3dde0b004c2136b0
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.082392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b65bd49b-2w988" event={"ID":"26d9927e-39c9-4bd7-b426-b639c188da29","Type":"ContainerStarted","Data":"bbc5b958f999b0cf68b3753ba45fcd8a7baaa22f7b0c53b91938808f3b7330be"}
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.112204 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6687bb7c6f-hz6w4" podStartSLOduration=2.951560589 podStartE2EDuration="11.112188253s" podCreationTimestamp="2026-02-19 23:10:38 +0000 UTC" firstStartedPulling="2026-02-19 23:10:39.633368428 +0000 UTC m=+6139.904810898" lastFinishedPulling="2026-02-19 23:10:47.793996102 +0000 UTC m=+6148.065438562" observedRunningTime="2026-02-19 23:10:49.104863658 +0000 UTC m=+6149.376306138" watchObservedRunningTime="2026-02-19 23:10:49.112188253 +0000 UTC m=+6149.383630723"
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.126869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59dc7576bf-f4hbt" event={"ID":"478e48a8-a429-4d81-aff7-6dbc6f6b9f54","Type":"ContainerStarted","Data":"1fa23553b38960f29447f0c894fcbfdf37dc5d0ec4059fa3c14bc40a7775dba9"}
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.127055 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-59dc7576bf-f4hbt" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerName="horizon-log" containerID="cri-o://35f7f21d96ad8d5f77987c00207320f0852215d11240c000c471bdf3c697c1a2" gracePeriod=30
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.127360 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-59dc7576bf-f4hbt" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerName="horizon" containerID="cri-o://1fa23553b38960f29447f0c894fcbfdf37dc5d0ec4059fa3c14bc40a7775dba9" gracePeriod=30
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.138243 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6687bb7c6f-hz6w4"
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.140338 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f449ffc58-sz7vv" podStartSLOduration=2.487879835 podStartE2EDuration="8.140327553s" podCreationTimestamp="2026-02-19 23:10:41 +0000 UTC" firstStartedPulling="2026-02-19 23:10:41.960046465 +0000 UTC m=+6142.231488925" lastFinishedPulling="2026-02-19 23:10:47.612494133 +0000 UTC m=+6147.883936643" observedRunningTime="2026-02-19 23:10:49.139853371 +0000 UTC m=+6149.411295841" watchObservedRunningTime="2026-02-19 23:10:49.140327553 +0000 UTC m=+6149.411770023"
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.168441 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67b65bd49b-2w988" podStartSLOduration=3.138291735 podStartE2EDuration="9.168421302s" podCreationTimestamp="2026-02-19 23:10:40 +0000 UTC" firstStartedPulling="2026-02-19 23:10:41.813313203 +0000 UTC m=+6142.084755673" lastFinishedPulling="2026-02-19 23:10:47.84344277 +0000 UTC m=+6148.114885240" observedRunningTime="2026-02-19 23:10:49.159376801 +0000 UTC m=+6149.430819291" watchObservedRunningTime="2026-02-19 23:10:49.168421302 +0000 UTC m=+6149.439863772"
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.204213 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-59dc7576bf-f4hbt"
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.210527 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-59dc7576bf-f4hbt" podStartSLOduration=3.240287297 podStartE2EDuration="11.210506964s" podCreationTimestamp="2026-02-19 23:10:38 +0000 UTC" firstStartedPulling="2026-02-19 23:10:39.619988802 +0000 UTC m=+6139.891431272" lastFinishedPulling="2026-02-19 23:10:47.590208429 +0000 UTC m=+6147.861650939" observedRunningTime="2026-02-19 23:10:49.190545062 +0000 UTC m=+6149.461987532" watchObservedRunningTime="2026-02-19 23:10:49.210506964 +0000 UTC m=+6149.481949424"
Feb 19 23:10:49 crc kubenswrapper[4771]: W0219 23:10:49.220124 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef8da599_c761_483b_8e1f_0c39e63b7476.slice/crio-66f0d1f2516bba6a0c30774d6b4540741ea0a75846ef40a202b7ec734b6d6fda WatchSource:0}: Error finding container 66f0d1f2516bba6a0c30774d6b4540741ea0a75846ef40a202b7ec734b6d6fda: Status 404 returned error can't find the container with id 66f0d1f2516bba6a0c30774d6b4540741ea0a75846ef40a202b7ec734b6d6fda
Feb 19 23:10:49 crc kubenswrapper[4771]: I0219 23:10:49.230306 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 23:10:50 crc kubenswrapper[4771]: I0219 23:10:50.142123 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef8da599-c761-483b-8e1f-0c39e63b7476","Type":"ContainerStarted","Data":"deedd09693c0850f912ff06da6a131076e664b03523c4b2bbb9da96a8b68ed68"}
Feb 19 23:10:50 crc kubenswrapper[4771]: I0219 23:10:50.142432 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef8da599-c761-483b-8e1f-0c39e63b7476","Type":"ContainerStarted","Data":"66f0d1f2516bba6a0c30774d6b4540741ea0a75846ef40a202b7ec734b6d6fda"}
Feb 19 23:10:50 crc kubenswrapper[4771]: I0219 23:10:50.144345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f7523e0-9170-4d3a-951b-2aae16fb47d3","Type":"ContainerStarted","Data":"fefac1b59f058b20d818678367ea2c2535be149dfe2ff700512c2a6c22a28c0e"}
Feb 19 23:10:50 crc kubenswrapper[4771]: I0219 23:10:50.144372 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f7523e0-9170-4d3a-951b-2aae16fb47d3","Type":"ContainerStarted","Data":"70915eff15977a7d92672e516154f94dcf7bd3724dc4d4ce3dde0b004c2136b0"}
Feb 19 23:10:51 crc kubenswrapper[4771]: I0219 23:10:51.156858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef8da599-c761-483b-8e1f-0c39e63b7476","Type":"ContainerStarted","Data":"f2260ce24b4ac94dd592621c1c9f6e87687a2ea413e2e599fc6f932bb6c63bff"}
Feb 19 23:10:51 crc kubenswrapper[4771]: I0219 23:10:51.159611 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f7523e0-9170-4d3a-951b-2aae16fb47d3","Type":"ContainerStarted","Data":"595ca7634dd13a26f3079ca009235331697e6fe5e47d71a3b484dbb76723e7d5"}
Feb 19 23:10:51 crc kubenswrapper[4771]: I0219 23:10:51.199964 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.19994757 podStartE2EDuration="3.19994757s" podCreationTimestamp="2026-02-19 23:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:10:51.190447047 +0000 UTC m=+6151.461889587" watchObservedRunningTime="2026-02-19 23:10:51.19994757 +0000 UTC m=+6151.471390040"
Feb 19 23:10:51 crc kubenswrapper[4771]: I0219 23:10:51.225645 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.225616075
podStartE2EDuration="3.225616075s" podCreationTimestamp="2026-02-19 23:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:10:51.211652503 +0000 UTC m=+6151.483094973" watchObservedRunningTime="2026-02-19 23:10:51.225616075 +0000 UTC m=+6151.497058575" Feb 19 23:10:51 crc kubenswrapper[4771]: I0219 23:10:51.321947 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:51 crc kubenswrapper[4771]: I0219 23:10:51.321992 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:10:51 crc kubenswrapper[4771]: I0219 23:10:51.435584 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:51 crc kubenswrapper[4771]: I0219 23:10:51.435649 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:10:52 crc kubenswrapper[4771]: I0219 23:10:52.437579 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:10:52 crc kubenswrapper[4771]: E0219 23:10:52.438085 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:10:58 crc kubenswrapper[4771]: I0219 23:10:58.490401 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 23:10:58 crc kubenswrapper[4771]: I0219 23:10:58.491208 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 23:10:58 crc kubenswrapper[4771]: I0219 23:10:58.499444 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 23:10:58 crc kubenswrapper[4771]: I0219 23:10:58.499510 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 23:10:58 crc kubenswrapper[4771]: I0219 23:10:58.540219 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 23:10:58 crc kubenswrapper[4771]: I0219 23:10:58.544110 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 23:10:58 crc kubenswrapper[4771]: I0219 23:10:58.550300 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 23:10:58 crc kubenswrapper[4771]: I0219 23:10:58.554102 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 23:10:59 crc kubenswrapper[4771]: I0219 23:10:59.244065 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 23:10:59 crc kubenswrapper[4771]: I0219 23:10:59.244331 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 23:10:59 crc kubenswrapper[4771]: I0219 23:10:59.244344 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 23:10:59 crc kubenswrapper[4771]: I0219 23:10:59.244355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 23:11:01 crc kubenswrapper[4771]: I0219 23:11:01.040199 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 23:11:01 crc kubenswrapper[4771]: I0219 23:11:01.042482 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 23:11:01 crc kubenswrapper[4771]: I0219 23:11:01.053669 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 23:11:01 crc kubenswrapper[4771]: I0219 23:11:01.054946 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 23:11:01 crc kubenswrapper[4771]: I0219 23:11:01.330264 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67b65bd49b-2w988" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.127:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.127:8443: connect: connection refused" Feb 19 23:11:01 crc kubenswrapper[4771]: I0219 23:11:01.437053 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f449ffc58-sz7vv" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.128:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.128:8443: connect: connection refused" Feb 19 23:11:03 crc kubenswrapper[4771]: I0219 23:11:03.438101 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:11:03 crc kubenswrapper[4771]: E0219 23:11:03.438751 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:11:05 crc kubenswrapper[4771]: I0219 23:11:05.929887 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fnwjk"] Feb 19 23:11:05 crc kubenswrapper[4771]: I0219 23:11:05.933624 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:05 crc kubenswrapper[4771]: I0219 23:11:05.954489 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fnwjk"] Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.093591 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-catalog-content\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.093709 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-utilities\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.094065 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4s9w\" (UniqueName: \"kubernetes.io/projected/06a04187-b3fc-4acc-87fb-dc37f2645071-kube-api-access-p4s9w\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " 
pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.195919 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4s9w\" (UniqueName: \"kubernetes.io/projected/06a04187-b3fc-4acc-87fb-dc37f2645071-kube-api-access-p4s9w\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.196067 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-catalog-content\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.196112 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-utilities\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.196918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-utilities\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.197168 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-catalog-content\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " 
pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.229813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4s9w\" (UniqueName: \"kubernetes.io/projected/06a04187-b3fc-4acc-87fb-dc37f2645071-kube-api-access-p4s9w\") pod \"community-operators-fnwjk\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.304299 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:06 crc kubenswrapper[4771]: I0219 23:11:06.843833 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fnwjk"] Feb 19 23:11:06 crc kubenswrapper[4771]: W0219 23:11:06.893726 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a04187_b3fc_4acc_87fb_dc37f2645071.slice/crio-118557c4d36d5e78643e96944214db3780415325a30e1a37cde55c6634520a5c WatchSource:0}: Error finding container 118557c4d36d5e78643e96944214db3780415325a30e1a37cde55c6634520a5c: Status 404 returned error can't find the container with id 118557c4d36d5e78643e96944214db3780415325a30e1a37cde55c6634520a5c Feb 19 23:11:07 crc kubenswrapper[4771]: I0219 23:11:07.358127 4771 generic.go:334] "Generic (PLEG): container finished" podID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerID="efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c" exitCode=0 Feb 19 23:11:07 crc kubenswrapper[4771]: I0219 23:11:07.358435 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwjk" event={"ID":"06a04187-b3fc-4acc-87fb-dc37f2645071","Type":"ContainerDied","Data":"efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c"} Feb 19 23:11:07 crc kubenswrapper[4771]: I0219 23:11:07.358475 
4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwjk" event={"ID":"06a04187-b3fc-4acc-87fb-dc37f2645071","Type":"ContainerStarted","Data":"118557c4d36d5e78643e96944214db3780415325a30e1a37cde55c6634520a5c"} Feb 19 23:11:08 crc kubenswrapper[4771]: I0219 23:11:08.371715 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwjk" event={"ID":"06a04187-b3fc-4acc-87fb-dc37f2645071","Type":"ContainerStarted","Data":"7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb"} Feb 19 23:11:09 crc kubenswrapper[4771]: I0219 23:11:09.388136 4771 generic.go:334] "Generic (PLEG): container finished" podID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerID="7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb" exitCode=0 Feb 19 23:11:09 crc kubenswrapper[4771]: I0219 23:11:09.388261 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwjk" event={"ID":"06a04187-b3fc-4acc-87fb-dc37f2645071","Type":"ContainerDied","Data":"7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb"} Feb 19 23:11:10 crc kubenswrapper[4771]: I0219 23:11:10.398894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwjk" event={"ID":"06a04187-b3fc-4acc-87fb-dc37f2645071","Type":"ContainerStarted","Data":"6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06"} Feb 19 23:11:10 crc kubenswrapper[4771]: I0219 23:11:10.421403 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fnwjk" podStartSLOduration=2.952601166 podStartE2EDuration="5.421384931s" podCreationTimestamp="2026-02-19 23:11:05 +0000 UTC" firstStartedPulling="2026-02-19 23:11:07.360873602 +0000 UTC m=+6167.632316112" lastFinishedPulling="2026-02-19 23:11:09.829657367 +0000 UTC m=+6170.101099877" observedRunningTime="2026-02-19 
23:11:10.419631015 +0000 UTC m=+6170.691073485" watchObservedRunningTime="2026-02-19 23:11:10.421384931 +0000 UTC m=+6170.692827401" Feb 19 23:11:13 crc kubenswrapper[4771]: I0219 23:11:13.093233 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:11:13 crc kubenswrapper[4771]: I0219 23:11:13.096880 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.062604 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6d55-account-create-update-6w54c"] Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.076880 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-tgc56"] Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.089969 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6d55-account-create-update-6w54c"] Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.101728 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-tgc56"] Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.458683 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c084eb3-4488-4d20-b1fc-c92a2f42dd19" path="/var/lib/kubelet/pods/0c084eb3-4488-4d20-b1fc-c92a2f42dd19/volumes" Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.460240 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f1f2a3-c87c-49e6-9a87-7e622b30faac" path="/var/lib/kubelet/pods/29f1f2a3-c87c-49e6-9a87-7e622b30faac/volumes" Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.642901 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.801891 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:11:14 crc kubenswrapper[4771]: I0219 23:11:14.876892 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67b65bd49b-2w988"] Feb 19 23:11:15 crc kubenswrapper[4771]: I0219 23:11:15.437452 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:11:15 crc kubenswrapper[4771]: E0219 23:11:15.437821 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:11:15 crc kubenswrapper[4771]: I0219 23:11:15.461358 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67b65bd49b-2w988" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon-log" containerID="cri-o://dd48c29a601dba4deff2490fc19867ca5e31ed1cac63daf877fdf70dae8de0e3" gracePeriod=30 Feb 19 23:11:15 crc kubenswrapper[4771]: I0219 23:11:15.461410 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67b65bd49b-2w988" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon" containerID="cri-o://bbc5b958f999b0cf68b3753ba45fcd8a7baaa22f7b0c53b91938808f3b7330be" gracePeriod=30 Feb 19 23:11:16 crc kubenswrapper[4771]: I0219 23:11:16.305131 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:16 crc kubenswrapper[4771]: I0219 23:11:16.305415 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:16 crc kubenswrapper[4771]: 
I0219 23:11:16.359528 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:16 crc kubenswrapper[4771]: I0219 23:11:16.527601 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:16 crc kubenswrapper[4771]: I0219 23:11:16.605941 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fnwjk"] Feb 19 23:11:18 crc kubenswrapper[4771]: I0219 23:11:18.494229 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fnwjk" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerName="registry-server" containerID="cri-o://6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06" gracePeriod=2 Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.025284 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.046942 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-catalog-content\") pod \"06a04187-b3fc-4acc-87fb-dc37f2645071\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.047046 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-utilities\") pod \"06a04187-b3fc-4acc-87fb-dc37f2645071\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.047091 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4s9w\" (UniqueName: \"kubernetes.io/projected/06a04187-b3fc-4acc-87fb-dc37f2645071-kube-api-access-p4s9w\") pod \"06a04187-b3fc-4acc-87fb-dc37f2645071\" (UID: \"06a04187-b3fc-4acc-87fb-dc37f2645071\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.051974 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-utilities" (OuterVolumeSpecName: "utilities") pod "06a04187-b3fc-4acc-87fb-dc37f2645071" (UID: "06a04187-b3fc-4acc-87fb-dc37f2645071"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.093384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a04187-b3fc-4acc-87fb-dc37f2645071-kube-api-access-p4s9w" (OuterVolumeSpecName: "kube-api-access-p4s9w") pod "06a04187-b3fc-4acc-87fb-dc37f2645071" (UID: "06a04187-b3fc-4acc-87fb-dc37f2645071"). InnerVolumeSpecName "kube-api-access-p4s9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.106475 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06a04187-b3fc-4acc-87fb-dc37f2645071" (UID: "06a04187-b3fc-4acc-87fb-dc37f2645071"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.154046 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.154087 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a04187-b3fc-4acc-87fb-dc37f2645071-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.154098 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4s9w\" (UniqueName: \"kubernetes.io/projected/06a04187-b3fc-4acc-87fb-dc37f2645071-kube-api-access-p4s9w\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.461331 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.514105 4771 generic.go:334] "Generic (PLEG): container finished" podID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerID="1fa23553b38960f29447f0c894fcbfdf37dc5d0ec4059fa3c14bc40a7775dba9" exitCode=137 Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.514133 4771 generic.go:334] "Generic (PLEG): container finished" podID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerID="35f7f21d96ad8d5f77987c00207320f0852215d11240c000c471bdf3c697c1a2" exitCode=137 Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.514170 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59dc7576bf-f4hbt" event={"ID":"478e48a8-a429-4d81-aff7-6dbc6f6b9f54","Type":"ContainerDied","Data":"1fa23553b38960f29447f0c894fcbfdf37dc5d0ec4059fa3c14bc40a7775dba9"} Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.514195 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59dc7576bf-f4hbt" event={"ID":"478e48a8-a429-4d81-aff7-6dbc6f6b9f54","Type":"ContainerDied","Data":"35f7f21d96ad8d5f77987c00207320f0852215d11240c000c471bdf3c697c1a2"} Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.516848 4771 generic.go:334] "Generic (PLEG): container finished" podID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerID="569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f" exitCode=137 Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.516908 4771 generic.go:334] "Generic (PLEG): container finished" podID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerID="25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8" exitCode=137 Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.516940 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6687bb7c6f-hz6w4" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.516962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687bb7c6f-hz6w4" event={"ID":"598eaa33-32e4-411a-b4bc-f0df6cde32d3","Type":"ContainerDied","Data":"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f"} Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.516983 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687bb7c6f-hz6w4" event={"ID":"598eaa33-32e4-411a-b4bc-f0df6cde32d3","Type":"ContainerDied","Data":"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8"} Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.516991 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687bb7c6f-hz6w4" event={"ID":"598eaa33-32e4-411a-b4bc-f0df6cde32d3","Type":"ContainerDied","Data":"928e022933fcefe0f9435c2c328e1a510746ec2eb4989a4ccf0e50c4a4f9e833"} Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.517007 4771 scope.go:117] "RemoveContainer" containerID="569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.521438 4771 generic.go:334] "Generic (PLEG): container finished" podID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerID="6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06" exitCode=0 Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.521493 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fnwjk" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.521532 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwjk" event={"ID":"06a04187-b3fc-4acc-87fb-dc37f2645071","Type":"ContainerDied","Data":"6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06"} Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.521564 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fnwjk" event={"ID":"06a04187-b3fc-4acc-87fb-dc37f2645071","Type":"ContainerDied","Data":"118557c4d36d5e78643e96944214db3780415325a30e1a37cde55c6634520a5c"} Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.525693 4771 generic.go:334] "Generic (PLEG): container finished" podID="26d9927e-39c9-4bd7-b426-b639c188da29" containerID="bbc5b958f999b0cf68b3753ba45fcd8a7baaa22f7b0c53b91938808f3b7330be" exitCode=0 Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.525754 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b65bd49b-2w988" event={"ID":"26d9927e-39c9-4bd7-b426-b639c188da29","Type":"ContainerDied","Data":"bbc5b958f999b0cf68b3753ba45fcd8a7baaa22f7b0c53b91938808f3b7330be"} Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.562859 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598eaa33-32e4-411a-b4bc-f0df6cde32d3-logs\") pod \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.563215 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-config-data\") pod \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " Feb 19 23:11:19 crc 
kubenswrapper[4771]: I0219 23:11:19.563279 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-scripts\") pod \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.563956 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/598eaa33-32e4-411a-b4bc-f0df6cde32d3-horizon-secret-key\") pod \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.564167 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598eaa33-32e4-411a-b4bc-f0df6cde32d3-logs" (OuterVolumeSpecName: "logs") pod "598eaa33-32e4-411a-b4bc-f0df6cde32d3" (UID: "598eaa33-32e4-411a-b4bc-f0df6cde32d3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.565072 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p7bc\" (UniqueName: \"kubernetes.io/projected/598eaa33-32e4-411a-b4bc-f0df6cde32d3-kube-api-access-8p7bc\") pod \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\" (UID: \"598eaa33-32e4-411a-b4bc-f0df6cde32d3\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.566056 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598eaa33-32e4-411a-b4bc-f0df6cde32d3-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.570158 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598eaa33-32e4-411a-b4bc-f0df6cde32d3-kube-api-access-8p7bc" (OuterVolumeSpecName: "kube-api-access-8p7bc") pod "598eaa33-32e4-411a-b4bc-f0df6cde32d3" (UID: "598eaa33-32e4-411a-b4bc-f0df6cde32d3"). InnerVolumeSpecName "kube-api-access-8p7bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.572202 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598eaa33-32e4-411a-b4bc-f0df6cde32d3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "598eaa33-32e4-411a-b4bc-f0df6cde32d3" (UID: "598eaa33-32e4-411a-b4bc-f0df6cde32d3"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.580592 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fnwjk"] Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.588283 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fnwjk"] Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.596666 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-scripts" (OuterVolumeSpecName: "scripts") pod "598eaa33-32e4-411a-b4bc-f0df6cde32d3" (UID: "598eaa33-32e4-411a-b4bc-f0df6cde32d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.602384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-config-data" (OuterVolumeSpecName: "config-data") pod "598eaa33-32e4-411a-b4bc-f0df6cde32d3" (UID: "598eaa33-32e4-411a-b4bc-f0df6cde32d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.667514 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.667549 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/598eaa33-32e4-411a-b4bc-f0df6cde32d3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.667562 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p7bc\" (UniqueName: \"kubernetes.io/projected/598eaa33-32e4-411a-b4bc-f0df6cde32d3-kube-api-access-8p7bc\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.667573 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/598eaa33-32e4-411a-b4bc-f0df6cde32d3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.695778 4771 scope.go:117] "RemoveContainer" containerID="25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.796180 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.803140 4771 scope.go:117] "RemoveContainer" containerID="569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f" Feb 19 23:11:19 crc kubenswrapper[4771]: E0219 23:11:19.805947 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f\": container with ID starting with 569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f not found: ID does not exist" containerID="569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.805985 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f"} err="failed to get container status \"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f\": rpc error: code = NotFound desc = could not find container \"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f\": container with ID starting with 569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f not found: ID does not exist" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.806008 4771 scope.go:117] "RemoveContainer" containerID="25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8" Feb 19 23:11:19 crc kubenswrapper[4771]: E0219 23:11:19.806427 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8\": container with ID starting with 25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8 not found: ID does not exist" containerID="25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 
23:11:19.806474 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8"} err="failed to get container status \"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8\": rpc error: code = NotFound desc = could not find container \"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8\": container with ID starting with 25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8 not found: ID does not exist" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.806502 4771 scope.go:117] "RemoveContainer" containerID="569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.806871 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f"} err="failed to get container status \"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f\": rpc error: code = NotFound desc = could not find container \"569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f\": container with ID starting with 569f79036d49f2b94f3f0dce54a190eb50bfb12ce234a78a1c28e31eaef6ee2f not found: ID does not exist" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.806993 4771 scope.go:117] "RemoveContainer" containerID="25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.807630 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8"} err="failed to get container status \"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8\": rpc error: code = NotFound desc = could not find container \"25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8\": container with ID starting with 
25604742a264eae79c42d722885c6d8ccc70bb5546b4e10c8f6b08d1f2800bf8 not found: ID does not exist" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.807725 4771 scope.go:117] "RemoveContainer" containerID="6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.841278 4771 scope.go:117] "RemoveContainer" containerID="7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.870364 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-config-data\") pod \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.870447 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggkfv\" (UniqueName: \"kubernetes.io/projected/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-kube-api-access-ggkfv\") pod \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.870465 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-horizon-secret-key\") pod \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.870487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-logs\") pod \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.870511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-scripts\") pod \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\" (UID: \"478e48a8-a429-4d81-aff7-6dbc6f6b9f54\") " Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.871485 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6687bb7c6f-hz6w4"] Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.872703 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-logs" (OuterVolumeSpecName: "logs") pod "478e48a8-a429-4d81-aff7-6dbc6f6b9f54" (UID: "478e48a8-a429-4d81-aff7-6dbc6f6b9f54"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.877729 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-kube-api-access-ggkfv" (OuterVolumeSpecName: "kube-api-access-ggkfv") pod "478e48a8-a429-4d81-aff7-6dbc6f6b9f54" (UID: "478e48a8-a429-4d81-aff7-6dbc6f6b9f54"). InnerVolumeSpecName "kube-api-access-ggkfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.878148 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "478e48a8-a429-4d81-aff7-6dbc6f6b9f54" (UID: "478e48a8-a429-4d81-aff7-6dbc6f6b9f54"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.878822 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6687bb7c6f-hz6w4"] Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.880877 4771 scope.go:117] "RemoveContainer" containerID="efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.893253 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-scripts" (OuterVolumeSpecName: "scripts") pod "478e48a8-a429-4d81-aff7-6dbc6f6b9f54" (UID: "478e48a8-a429-4d81-aff7-6dbc6f6b9f54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.893993 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-config-data" (OuterVolumeSpecName: "config-data") pod "478e48a8-a429-4d81-aff7-6dbc6f6b9f54" (UID: "478e48a8-a429-4d81-aff7-6dbc6f6b9f54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.955127 4771 scope.go:117] "RemoveContainer" containerID="6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06" Feb 19 23:11:19 crc kubenswrapper[4771]: E0219 23:11:19.955530 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06\": container with ID starting with 6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06 not found: ID does not exist" containerID="6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.955584 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06"} err="failed to get container status \"6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06\": rpc error: code = NotFound desc = could not find container \"6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06\": container with ID starting with 6a8741a408b01b9c0c2d919d1eb5a5fd3fdccd316c5b819facafa98724454f06 not found: ID does not exist" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.955605 4771 scope.go:117] "RemoveContainer" containerID="7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb" Feb 19 23:11:19 crc kubenswrapper[4771]: E0219 23:11:19.955770 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb\": container with ID starting with 7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb not found: ID does not exist" containerID="7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.955790 
4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb"} err="failed to get container status \"7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb\": rpc error: code = NotFound desc = could not find container \"7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb\": container with ID starting with 7adf1adb3e951e9a2c8b7738ced995fecf400d0af39add60a62732ff2bb58cdb not found: ID does not exist" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.955803 4771 scope.go:117] "RemoveContainer" containerID="efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c" Feb 19 23:11:19 crc kubenswrapper[4771]: E0219 23:11:19.955970 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c\": container with ID starting with efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c not found: ID does not exist" containerID="efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.955991 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c"} err="failed to get container status \"efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c\": rpc error: code = NotFound desc = could not find container \"efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c\": container with ID starting with efdb5f247f9fb1769d322e17b040e19b0f338d84b79d43682db7463967f7199c not found: ID does not exist" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.971948 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-config-data\") on 
node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.971971 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.971982 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggkfv\" (UniqueName: \"kubernetes.io/projected/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-kube-api-access-ggkfv\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.971992 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:19 crc kubenswrapper[4771]: I0219 23:11:19.972001 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e48a8-a429-4d81-aff7-6dbc6f6b9f54-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:20 crc kubenswrapper[4771]: I0219 23:11:20.454133 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" path="/var/lib/kubelet/pods/06a04187-b3fc-4acc-87fb-dc37f2645071/volumes" Feb 19 23:11:20 crc kubenswrapper[4771]: I0219 23:11:20.455772 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" path="/var/lib/kubelet/pods/598eaa33-32e4-411a-b4bc-f0df6cde32d3/volumes" Feb 19 23:11:20 crc kubenswrapper[4771]: I0219 23:11:20.542998 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59dc7576bf-f4hbt" event={"ID":"478e48a8-a429-4d81-aff7-6dbc6f6b9f54","Type":"ContainerDied","Data":"9d0f1022c1c7ad914e2b88b6177c8eedec41b57dd26ba8615b1e4dc3741cfb4c"} Feb 19 23:11:20 crc kubenswrapper[4771]: I0219 23:11:20.543101 4771 scope.go:117] "RemoveContainer" 
containerID="1fa23553b38960f29447f0c894fcbfdf37dc5d0ec4059fa3c14bc40a7775dba9" Feb 19 23:11:20 crc kubenswrapper[4771]: I0219 23:11:20.543101 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59dc7576bf-f4hbt" Feb 19 23:11:20 crc kubenswrapper[4771]: I0219 23:11:20.587167 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59dc7576bf-f4hbt"] Feb 19 23:11:20 crc kubenswrapper[4771]: I0219 23:11:20.601010 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59dc7576bf-f4hbt"] Feb 19 23:11:20 crc kubenswrapper[4771]: I0219 23:11:20.797615 4771 scope.go:117] "RemoveContainer" containerID="35f7f21d96ad8d5f77987c00207320f0852215d11240c000c471bdf3c697c1a2" Feb 19 23:11:21 crc kubenswrapper[4771]: I0219 23:11:21.322260 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b65bd49b-2w988" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.127:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.127:8443: connect: connection refused" Feb 19 23:11:22 crc kubenswrapper[4771]: I0219 23:11:22.053378 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-s45z6"] Feb 19 23:11:22 crc kubenswrapper[4771]: I0219 23:11:22.066343 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-s45z6"] Feb 19 23:11:22 crc kubenswrapper[4771]: I0219 23:11:22.457536 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" path="/var/lib/kubelet/pods/478e48a8-a429-4d81-aff7-6dbc6f6b9f54/volumes" Feb 19 23:11:22 crc kubenswrapper[4771]: I0219 23:11:22.458343 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d33bd6-8a72-405a-b3ec-14bcce254f9e" path="/var/lib/kubelet/pods/87d33bd6-8a72-405a-b3ec-14bcce254f9e/volumes" Feb 19 23:11:29 crc 
kubenswrapper[4771]: I0219 23:11:29.438534 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:11:29 crc kubenswrapper[4771]: E0219 23:11:29.439596 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:11:31 crc kubenswrapper[4771]: I0219 23:11:31.324059 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b65bd49b-2w988" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.127:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.127:8443: connect: connection refused" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.456373 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-plzj4"] Feb 19 23:11:34 crc kubenswrapper[4771]: E0219 23:11:34.457651 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerName="registry-server" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.457680 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerName="registry-server" Feb 19 23:11:34 crc kubenswrapper[4771]: E0219 23:11:34.457719 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerName="extract-utilities" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.457732 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" 
containerName="extract-utilities" Feb 19 23:11:34 crc kubenswrapper[4771]: E0219 23:11:34.457761 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerName="horizon" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.457773 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerName="horizon" Feb 19 23:11:34 crc kubenswrapper[4771]: E0219 23:11:34.457810 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerName="horizon-log" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.457822 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerName="horizon-log" Feb 19 23:11:34 crc kubenswrapper[4771]: E0219 23:11:34.457846 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerName="horizon-log" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.457858 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerName="horizon-log" Feb 19 23:11:34 crc kubenswrapper[4771]: E0219 23:11:34.457883 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerName="horizon" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.457895 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerName="horizon" Feb 19 23:11:34 crc kubenswrapper[4771]: E0219 23:11:34.457916 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerName="extract-content" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.457929 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerName="extract-content" Feb 19 23:11:34 crc 
kubenswrapper[4771]: I0219 23:11:34.458360 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a04187-b3fc-4acc-87fb-dc37f2645071" containerName="registry-server" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.458404 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerName="horizon" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.458444 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="598eaa33-32e4-411a-b4bc-f0df6cde32d3" containerName="horizon-log" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.458475 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerName="horizon" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.458491 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="478e48a8-a429-4d81-aff7-6dbc6f6b9f54" containerName="horizon-log" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.461777 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plzj4"] Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.461914 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.575475 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-utilities\") pod \"redhat-operators-plzj4\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.575816 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmbp9\" (UniqueName: \"kubernetes.io/projected/d0a3c4aa-ab05-40e0-bc35-612fe046115b-kube-api-access-pmbp9\") pod \"redhat-operators-plzj4\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.575961 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-catalog-content\") pod \"redhat-operators-plzj4\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.677824 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmbp9\" (UniqueName: \"kubernetes.io/projected/d0a3c4aa-ab05-40e0-bc35-612fe046115b-kube-api-access-pmbp9\") pod \"redhat-operators-plzj4\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.678302 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-catalog-content\") pod \"redhat-operators-plzj4\" 
(UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.678784 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-utilities\") pod \"redhat-operators-plzj4\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.678825 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-catalog-content\") pod \"redhat-operators-plzj4\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.679130 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-utilities\") pod \"redhat-operators-plzj4\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.698044 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmbp9\" (UniqueName: \"kubernetes.io/projected/d0a3c4aa-ab05-40e0-bc35-612fe046115b-kube-api-access-pmbp9\") pod \"redhat-operators-plzj4\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:34 crc kubenswrapper[4771]: I0219 23:11:34.795492 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:35 crc kubenswrapper[4771]: I0219 23:11:35.259692 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plzj4"] Feb 19 23:11:35 crc kubenswrapper[4771]: I0219 23:11:35.728704 4771 generic.go:334] "Generic (PLEG): container finished" podID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerID="5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec" exitCode=0 Feb 19 23:11:35 crc kubenswrapper[4771]: I0219 23:11:35.728938 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plzj4" event={"ID":"d0a3c4aa-ab05-40e0-bc35-612fe046115b","Type":"ContainerDied","Data":"5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec"} Feb 19 23:11:35 crc kubenswrapper[4771]: I0219 23:11:35.728986 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plzj4" event={"ID":"d0a3c4aa-ab05-40e0-bc35-612fe046115b","Type":"ContainerStarted","Data":"a238001f7baa4421ff2a06f8419c1f565d96f3108f9b0cc38d7eabcf287c33ea"} Feb 19 23:11:36 crc kubenswrapper[4771]: I0219 23:11:36.645489 4771 scope.go:117] "RemoveContainer" containerID="ab7c09220036754f543c128fdff1af89f32df97a316d04996d4ef7f0c4065202" Feb 19 23:11:36 crc kubenswrapper[4771]: I0219 23:11:36.689641 4771 scope.go:117] "RemoveContainer" containerID="dfca0112d2c458918462e5dd50667567b8be4a8e86a92145d14e1629fcfba85c" Feb 19 23:11:36 crc kubenswrapper[4771]: I0219 23:11:36.744774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plzj4" event={"ID":"d0a3c4aa-ab05-40e0-bc35-612fe046115b","Type":"ContainerStarted","Data":"944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766"} Feb 19 23:11:36 crc kubenswrapper[4771]: I0219 23:11:36.757359 4771 scope.go:117] "RemoveContainer" containerID="f66be0da4eff9bb9f7b585d44765ebb24d7a93f1d44e293d07a3603f265a90a4" 
Feb 19 23:11:38 crc kubenswrapper[4771]: I0219 23:11:38.788445 4771 generic.go:334] "Generic (PLEG): container finished" podID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerID="944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766" exitCode=0 Feb 19 23:11:38 crc kubenswrapper[4771]: I0219 23:11:38.788560 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plzj4" event={"ID":"d0a3c4aa-ab05-40e0-bc35-612fe046115b","Type":"ContainerDied","Data":"944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766"} Feb 19 23:11:39 crc kubenswrapper[4771]: I0219 23:11:39.806784 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plzj4" event={"ID":"d0a3c4aa-ab05-40e0-bc35-612fe046115b","Type":"ContainerStarted","Data":"6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7"} Feb 19 23:11:39 crc kubenswrapper[4771]: I0219 23:11:39.836978 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-plzj4" podStartSLOduration=2.367078395 podStartE2EDuration="5.836954336s" podCreationTimestamp="2026-02-19 23:11:34 +0000 UTC" firstStartedPulling="2026-02-19 23:11:35.731263065 +0000 UTC m=+6196.002705535" lastFinishedPulling="2026-02-19 23:11:39.201138966 +0000 UTC m=+6199.472581476" observedRunningTime="2026-02-19 23:11:39.828941003 +0000 UTC m=+6200.100383553" watchObservedRunningTime="2026-02-19 23:11:39.836954336 +0000 UTC m=+6200.108396816" Feb 19 23:11:41 crc kubenswrapper[4771]: I0219 23:11:41.322061 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b65bd49b-2w988" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.127:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.127:8443: connect: connection refused" Feb 19 23:11:41 crc kubenswrapper[4771]: I0219 23:11:41.322200 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:11:43 crc kubenswrapper[4771]: I0219 23:11:43.437789 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:11:43 crc kubenswrapper[4771]: E0219 23:11:43.438987 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:11:44 crc kubenswrapper[4771]: I0219 23:11:44.796588 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:44 crc kubenswrapper[4771]: I0219 23:11:44.796891 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:45 crc kubenswrapper[4771]: I0219 23:11:45.859828 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-plzj4" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="registry-server" probeResult="failure" output=< Feb 19 23:11:45 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:11:45 crc kubenswrapper[4771]: > Feb 19 23:11:45 crc kubenswrapper[4771]: I0219 23:11:45.915336 4771 generic.go:334] "Generic (PLEG): container finished" podID="26d9927e-39c9-4bd7-b426-b639c188da29" containerID="dd48c29a601dba4deff2490fc19867ca5e31ed1cac63daf877fdf70dae8de0e3" exitCode=137 Feb 19 23:11:45 crc kubenswrapper[4771]: I0219 23:11:45.915402 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b65bd49b-2w988" 
event={"ID":"26d9927e-39c9-4bd7-b426-b639c188da29","Type":"ContainerDied","Data":"dd48c29a601dba4deff2490fc19867ca5e31ed1cac63daf877fdf70dae8de0e3"} Feb 19 23:11:45 crc kubenswrapper[4771]: I0219 23:11:45.915440 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b65bd49b-2w988" event={"ID":"26d9927e-39c9-4bd7-b426-b639c188da29","Type":"ContainerDied","Data":"e436860c5ddb013bdc45a7ea098dec1ddc1ab2b2e0f789507aa7d14cbaf02b2b"} Feb 19 23:11:45 crc kubenswrapper[4771]: I0219 23:11:45.915459 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e436860c5ddb013bdc45a7ea098dec1ddc1ab2b2e0f789507aa7d14cbaf02b2b" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.012790 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.075906 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-secret-key\") pod \"26d9927e-39c9-4bd7-b426-b639c188da29\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.076305 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-tls-certs\") pod \"26d9927e-39c9-4bd7-b426-b639c188da29\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.076443 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4ktm\" (UniqueName: \"kubernetes.io/projected/26d9927e-39c9-4bd7-b426-b639c188da29-kube-api-access-p4ktm\") pod \"26d9927e-39c9-4bd7-b426-b639c188da29\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 
23:11:46.076552 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-config-data\") pod \"26d9927e-39c9-4bd7-b426-b639c188da29\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.076698 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26d9927e-39c9-4bd7-b426-b639c188da29-logs\") pod \"26d9927e-39c9-4bd7-b426-b639c188da29\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.076928 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-combined-ca-bundle\") pod \"26d9927e-39c9-4bd7-b426-b639c188da29\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.077108 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-scripts\") pod \"26d9927e-39c9-4bd7-b426-b639c188da29\" (UID: \"26d9927e-39c9-4bd7-b426-b639c188da29\") " Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.079830 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d9927e-39c9-4bd7-b426-b639c188da29-logs" (OuterVolumeSpecName: "logs") pod "26d9927e-39c9-4bd7-b426-b639c188da29" (UID: "26d9927e-39c9-4bd7-b426-b639c188da29"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.083782 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d9927e-39c9-4bd7-b426-b639c188da29-kube-api-access-p4ktm" (OuterVolumeSpecName: "kube-api-access-p4ktm") pod "26d9927e-39c9-4bd7-b426-b639c188da29" (UID: "26d9927e-39c9-4bd7-b426-b639c188da29"). InnerVolumeSpecName "kube-api-access-p4ktm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.134203 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "26d9927e-39c9-4bd7-b426-b639c188da29" (UID: "26d9927e-39c9-4bd7-b426-b639c188da29"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.139064 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26d9927e-39c9-4bd7-b426-b639c188da29" (UID: "26d9927e-39c9-4bd7-b426-b639c188da29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.149220 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-scripts" (OuterVolumeSpecName: "scripts") pod "26d9927e-39c9-4bd7-b426-b639c188da29" (UID: "26d9927e-39c9-4bd7-b426-b639c188da29"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.155742 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-config-data" (OuterVolumeSpecName: "config-data") pod "26d9927e-39c9-4bd7-b426-b639c188da29" (UID: "26d9927e-39c9-4bd7-b426-b639c188da29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.164364 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "26d9927e-39c9-4bd7-b426-b639c188da29" (UID: "26d9927e-39c9-4bd7-b426-b639c188da29"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.180595 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.180627 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4ktm\" (UniqueName: \"kubernetes.io/projected/26d9927e-39c9-4bd7-b426-b639c188da29-kube-api-access-p4ktm\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.180644 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.180678 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26d9927e-39c9-4bd7-b426-b639c188da29-logs\") on node \"crc\" DevicePath \"\"" 
Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.180691 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.180702 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d9927e-39c9-4bd7-b426-b639c188da29-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.180713 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/26d9927e-39c9-4bd7-b426-b639c188da29-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.925292 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b65bd49b-2w988" Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.969746 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67b65bd49b-2w988"] Feb 19 23:11:46 crc kubenswrapper[4771]: I0219 23:11:46.981817 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67b65bd49b-2w988"] Feb 19 23:11:48 crc kubenswrapper[4771]: I0219 23:11:48.058290 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-ww2tk"] Feb 19 23:11:48 crc kubenswrapper[4771]: I0219 23:11:48.085666 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-ww2tk"] Feb 19 23:11:48 crc kubenswrapper[4771]: I0219 23:11:48.459955 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" path="/var/lib/kubelet/pods/26d9927e-39c9-4bd7-b426-b639c188da29/volumes" Feb 19 23:11:48 crc kubenswrapper[4771]: I0219 23:11:48.461369 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="c3337aa8-71a7-4f33-a69a-c46d3204007a" path="/var/lib/kubelet/pods/c3337aa8-71a7-4f33-a69a-c46d3204007a/volumes" Feb 19 23:11:49 crc kubenswrapper[4771]: I0219 23:11:49.063222 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3b4b-account-create-update-2fm94"] Feb 19 23:11:49 crc kubenswrapper[4771]: I0219 23:11:49.081246 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3b4b-account-create-update-2fm94"] Feb 19 23:11:50 crc kubenswrapper[4771]: I0219 23:11:50.457347 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf" path="/var/lib/kubelet/pods/4a51bd2d-0856-4bbb-b4d5-f9915c1e6cbf/volumes" Feb 19 23:11:54 crc kubenswrapper[4771]: I0219 23:11:54.438256 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:11:54 crc kubenswrapper[4771]: E0219 23:11:54.439058 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:11:54 crc kubenswrapper[4771]: I0219 23:11:54.857902 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:54 crc kubenswrapper[4771]: I0219 23:11:54.990386 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:55 crc kubenswrapper[4771]: I0219 23:11:55.099974 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plzj4"] Feb 19 23:11:55 crc kubenswrapper[4771]: I0219 
23:11:55.962454 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66b7bf74db-l2vb5"] Feb 19 23:11:55 crc kubenswrapper[4771]: E0219 23:11:55.967945 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon" Feb 19 23:11:55 crc kubenswrapper[4771]: I0219 23:11:55.968100 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon" Feb 19 23:11:55 crc kubenswrapper[4771]: E0219 23:11:55.968211 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon-log" Feb 19 23:11:55 crc kubenswrapper[4771]: I0219 23:11:55.968312 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon-log" Feb 19 23:11:55 crc kubenswrapper[4771]: I0219 23:11:55.968634 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon-log" Feb 19 23:11:55 crc kubenswrapper[4771]: I0219 23:11:55.968733 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d9927e-39c9-4bd7-b426-b639c188da29" containerName="horizon" Feb 19 23:11:55 crc kubenswrapper[4771]: I0219 23:11:55.970061 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:55 crc kubenswrapper[4771]: I0219 23:11:55.982516 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66b7bf74db-l2vb5"] Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.040051 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-plzj4" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="registry-server" containerID="cri-o://6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7" gracePeriod=2 Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.042002 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hmnhd"] Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.055027 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hmnhd"] Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.070547 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjg92\" (UniqueName: \"kubernetes.io/projected/46a4c64b-a2fd-4fac-814a-055008a0d27a-kube-api-access-mjg92\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.070821 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46a4c64b-a2fd-4fac-814a-055008a0d27a-scripts\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.070894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-horizon-secret-key\") pod 
\"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.070960 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46a4c64b-a2fd-4fac-814a-055008a0d27a-logs\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.071101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46a4c64b-a2fd-4fac-814a-055008a0d27a-config-data\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.071179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-horizon-tls-certs\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.071989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-combined-ca-bundle\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.172902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjg92\" (UniqueName: \"kubernetes.io/projected/46a4c64b-a2fd-4fac-814a-055008a0d27a-kube-api-access-mjg92\") pod 
\"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.173290 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46a4c64b-a2fd-4fac-814a-055008a0d27a-scripts\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.173322 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-horizon-secret-key\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.173347 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46a4c64b-a2fd-4fac-814a-055008a0d27a-logs\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.173439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46a4c64b-a2fd-4fac-814a-055008a0d27a-config-data\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.173472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-horizon-tls-certs\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 
23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.173505 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-combined-ca-bundle\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.177167 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46a4c64b-a2fd-4fac-814a-055008a0d27a-logs\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.178928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46a4c64b-a2fd-4fac-814a-055008a0d27a-config-data\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.179567 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46a4c64b-a2fd-4fac-814a-055008a0d27a-scripts\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.184753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-combined-ca-bundle\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.185162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-horizon-secret-key\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.186723 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46a4c64b-a2fd-4fac-814a-055008a0d27a-horizon-tls-certs\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.191752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjg92\" (UniqueName: \"kubernetes.io/projected/46a4c64b-a2fd-4fac-814a-055008a0d27a-kube-api-access-mjg92\") pod \"horizon-66b7bf74db-l2vb5\" (UID: \"46a4c64b-a2fd-4fac-814a-055008a0d27a\") " pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.310249 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.447469 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.461041 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43635992-5285-44e4-802b-43eaa8310d68" path="/var/lib/kubelet/pods/43635992-5285-44e4-802b-43eaa8310d68/volumes" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.587151 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-utilities\") pod \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.587276 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-catalog-content\") pod \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.587399 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmbp9\" (UniqueName: \"kubernetes.io/projected/d0a3c4aa-ab05-40e0-bc35-612fe046115b-kube-api-access-pmbp9\") pod \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\" (UID: \"d0a3c4aa-ab05-40e0-bc35-612fe046115b\") " Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.587973 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-utilities" (OuterVolumeSpecName: "utilities") pod "d0a3c4aa-ab05-40e0-bc35-612fe046115b" (UID: "d0a3c4aa-ab05-40e0-bc35-612fe046115b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.595279 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a3c4aa-ab05-40e0-bc35-612fe046115b-kube-api-access-pmbp9" (OuterVolumeSpecName: "kube-api-access-pmbp9") pod "d0a3c4aa-ab05-40e0-bc35-612fe046115b" (UID: "d0a3c4aa-ab05-40e0-bc35-612fe046115b"). InnerVolumeSpecName "kube-api-access-pmbp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.689174 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.689201 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmbp9\" (UniqueName: \"kubernetes.io/projected/d0a3c4aa-ab05-40e0-bc35-612fe046115b-kube-api-access-pmbp9\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.712265 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0a3c4aa-ab05-40e0-bc35-612fe046115b" (UID: "d0a3c4aa-ab05-40e0-bc35-612fe046115b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.791057 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a3c4aa-ab05-40e0-bc35-612fe046115b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:56 crc kubenswrapper[4771]: I0219 23:11:56.809430 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66b7bf74db-l2vb5"] Feb 19 23:11:56 crc kubenswrapper[4771]: W0219 23:11:56.814581 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46a4c64b_a2fd_4fac_814a_055008a0d27a.slice/crio-638f3e502159417f2793600651c2790eefbfc5b56aaa5265c9a498a17918026a WatchSource:0}: Error finding container 638f3e502159417f2793600651c2790eefbfc5b56aaa5265c9a498a17918026a: Status 404 returned error can't find the container with id 638f3e502159417f2793600651c2790eefbfc5b56aaa5265c9a498a17918026a Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.056464 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66b7bf74db-l2vb5" event={"ID":"46a4c64b-a2fd-4fac-814a-055008a0d27a","Type":"ContainerStarted","Data":"225bb39a2fbd751b43b73fdd378fc812ba07a168b9ecc0664dadd2d6eee764ee"} Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.056692 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66b7bf74db-l2vb5" event={"ID":"46a4c64b-a2fd-4fac-814a-055008a0d27a","Type":"ContainerStarted","Data":"638f3e502159417f2793600651c2790eefbfc5b56aaa5265c9a498a17918026a"} Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.073575 4771 generic.go:334] "Generic (PLEG): container finished" podID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerID="6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7" exitCode=0 Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.073625 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plzj4" event={"ID":"d0a3c4aa-ab05-40e0-bc35-612fe046115b","Type":"ContainerDied","Data":"6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7"} Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.073658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plzj4" event={"ID":"d0a3c4aa-ab05-40e0-bc35-612fe046115b","Type":"ContainerDied","Data":"a238001f7baa4421ff2a06f8419c1f565d96f3108f9b0cc38d7eabcf287c33ea"} Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.073681 4771 scope.go:117] "RemoveContainer" containerID="6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.073858 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plzj4" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.129290 4771 scope.go:117] "RemoveContainer" containerID="944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.147486 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plzj4"] Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.162917 4771 scope.go:117] "RemoveContainer" containerID="5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.163670 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-plzj4"] Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.294431 4771 scope.go:117] "RemoveContainer" containerID="6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7" Feb 19 23:11:57 crc kubenswrapper[4771]: E0219 23:11:57.294867 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7\": container with ID starting with 6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7 not found: ID does not exist" containerID="6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.294917 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7"} err="failed to get container status \"6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7\": rpc error: code = NotFound desc = could not find container \"6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7\": container with ID starting with 6f3d2366fc7ad5601a3ac16174598eb6e3b54319ac3ed718b4f8295cb404bdd7 not found: ID does not exist" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.294959 4771 scope.go:117] "RemoveContainer" containerID="944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766" Feb 19 23:11:57 crc kubenswrapper[4771]: E0219 23:11:57.295649 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766\": container with ID starting with 944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766 not found: ID does not exist" containerID="944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.295681 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766"} err="failed to get container status \"944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766\": rpc error: code = NotFound desc = could not find container \"944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766\": container with ID 
starting with 944a67b942482c9a12170056758c1bf6d9ea9c66ade81b0ebcba0a30a9af5766 not found: ID does not exist" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.295725 4771 scope.go:117] "RemoveContainer" containerID="5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec" Feb 19 23:11:57 crc kubenswrapper[4771]: E0219 23:11:57.296007 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec\": container with ID starting with 5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec not found: ID does not exist" containerID="5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.296204 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec"} err="failed to get container status \"5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec\": rpc error: code = NotFound desc = could not find container \"5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec\": container with ID starting with 5d6e0c9c6f41bbc916bfee2b29de13e7e909c1a1cd1adc7740a26acfbde54fec not found: ID does not exist" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.592518 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-hb27f"] Feb 19 23:11:57 crc kubenswrapper[4771]: E0219 23:11:57.593105 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="extract-utilities" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.593128 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="extract-utilities" Feb 19 23:11:57 crc kubenswrapper[4771]: E0219 23:11:57.593156 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="registry-server" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.593164 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="registry-server" Feb 19 23:11:57 crc kubenswrapper[4771]: E0219 23:11:57.593207 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="extract-content" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.593215 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="extract-content" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.593462 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" containerName="registry-server" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.594295 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-hb27f" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.606459 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-hb27f"] Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.619893 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6jbm\" (UniqueName: \"kubernetes.io/projected/7d75dbbd-a8bf-4250-a891-e8de80593ef2-kube-api-access-f6jbm\") pod \"heat-db-create-hb27f\" (UID: \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\") " pod="openstack/heat-db-create-hb27f" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.619989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d75dbbd-a8bf-4250-a891-e8de80593ef2-operator-scripts\") pod \"heat-db-create-hb27f\" (UID: \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\") " pod="openstack/heat-db-create-hb27f" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.678865 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-5d53-account-create-update-kwgwp"] Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.680351 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.688769 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.694312 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5d53-account-create-update-kwgwp"] Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.721753 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d75dbbd-a8bf-4250-a891-e8de80593ef2-operator-scripts\") pod \"heat-db-create-hb27f\" (UID: \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\") " pod="openstack/heat-db-create-hb27f" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.721855 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de27f6b-5621-4e18-81a6-85dbfdd89711-operator-scripts\") pod \"heat-5d53-account-create-update-kwgwp\" (UID: \"2de27f6b-5621-4e18-81a6-85dbfdd89711\") " pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.721905 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnth\" (UniqueName: \"kubernetes.io/projected/2de27f6b-5621-4e18-81a6-85dbfdd89711-kube-api-access-smnth\") pod \"heat-5d53-account-create-update-kwgwp\" (UID: \"2de27f6b-5621-4e18-81a6-85dbfdd89711\") " pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.721952 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6jbm\" (UniqueName: \"kubernetes.io/projected/7d75dbbd-a8bf-4250-a891-e8de80593ef2-kube-api-access-f6jbm\") pod \"heat-db-create-hb27f\" (UID: 
\"7d75dbbd-a8bf-4250-a891-e8de80593ef2\") " pod="openstack/heat-db-create-hb27f" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.722416 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d75dbbd-a8bf-4250-a891-e8de80593ef2-operator-scripts\") pod \"heat-db-create-hb27f\" (UID: \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\") " pod="openstack/heat-db-create-hb27f" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.740134 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6jbm\" (UniqueName: \"kubernetes.io/projected/7d75dbbd-a8bf-4250-a891-e8de80593ef2-kube-api-access-f6jbm\") pod \"heat-db-create-hb27f\" (UID: \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\") " pod="openstack/heat-db-create-hb27f" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.823267 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de27f6b-5621-4e18-81a6-85dbfdd89711-operator-scripts\") pod \"heat-5d53-account-create-update-kwgwp\" (UID: \"2de27f6b-5621-4e18-81a6-85dbfdd89711\") " pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.823546 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnth\" (UniqueName: \"kubernetes.io/projected/2de27f6b-5621-4e18-81a6-85dbfdd89711-kube-api-access-smnth\") pod \"heat-5d53-account-create-update-kwgwp\" (UID: \"2de27f6b-5621-4e18-81a6-85dbfdd89711\") " pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.824553 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de27f6b-5621-4e18-81a6-85dbfdd89711-operator-scripts\") pod \"heat-5d53-account-create-update-kwgwp\" (UID: 
\"2de27f6b-5621-4e18-81a6-85dbfdd89711\") " pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.841662 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnth\" (UniqueName: \"kubernetes.io/projected/2de27f6b-5621-4e18-81a6-85dbfdd89711-kube-api-access-smnth\") pod \"heat-5d53-account-create-update-kwgwp\" (UID: \"2de27f6b-5621-4e18-81a6-85dbfdd89711\") " pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.923322 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-hb27f" Feb 19 23:11:57 crc kubenswrapper[4771]: I0219 23:11:57.993408 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:11:58 crc kubenswrapper[4771]: I0219 23:11:58.091751 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66b7bf74db-l2vb5" event={"ID":"46a4c64b-a2fd-4fac-814a-055008a0d27a","Type":"ContainerStarted","Data":"758c9e25acf62009110ae687eb357de86df58fb16e482d9b818a66b827dc3a4c"} Feb 19 23:11:58 crc kubenswrapper[4771]: I0219 23:11:58.122834 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66b7bf74db-l2vb5" podStartSLOduration=3.122815366 podStartE2EDuration="3.122815366s" podCreationTimestamp="2026-02-19 23:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:11:58.119302883 +0000 UTC m=+6218.390745363" watchObservedRunningTime="2026-02-19 23:11:58.122815366 +0000 UTC m=+6218.394257826" Feb 19 23:11:58 crc kubenswrapper[4771]: I0219 23:11:58.421991 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-hb27f"] Feb 19 23:11:58 crc kubenswrapper[4771]: W0219 23:11:58.427713 4771 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d75dbbd_a8bf_4250_a891_e8de80593ef2.slice/crio-9e866796bcdf5ce5fd30652495dd37bafd83bc4ae45e8a85b691987888a82bb0 WatchSource:0}: Error finding container 9e866796bcdf5ce5fd30652495dd37bafd83bc4ae45e8a85b691987888a82bb0: Status 404 returned error can't find the container with id 9e866796bcdf5ce5fd30652495dd37bafd83bc4ae45e8a85b691987888a82bb0 Feb 19 23:11:58 crc kubenswrapper[4771]: I0219 23:11:58.457498 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a3c4aa-ab05-40e0-bc35-612fe046115b" path="/var/lib/kubelet/pods/d0a3c4aa-ab05-40e0-bc35-612fe046115b/volumes" Feb 19 23:11:58 crc kubenswrapper[4771]: I0219 23:11:58.635973 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5d53-account-create-update-kwgwp"] Feb 19 23:11:58 crc kubenswrapper[4771]: W0219 23:11:58.636989 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2de27f6b_5621_4e18_81a6_85dbfdd89711.slice/crio-2272e3d37a167129615d0f0ecb1b16264c9fff40952bd4add2f4c527fb70d269 WatchSource:0}: Error finding container 2272e3d37a167129615d0f0ecb1b16264c9fff40952bd4add2f4c527fb70d269: Status 404 returned error can't find the container with id 2272e3d37a167129615d0f0ecb1b16264c9fff40952bd4add2f4c527fb70d269 Feb 19 23:11:59 crc kubenswrapper[4771]: I0219 23:11:59.109149 4771 generic.go:334] "Generic (PLEG): container finished" podID="7d75dbbd-a8bf-4250-a891-e8de80593ef2" containerID="d8accff843352756205b6562f6a48140332f0f78f391ce21454b3ddf52ce4b4f" exitCode=0 Feb 19 23:11:59 crc kubenswrapper[4771]: I0219 23:11:59.109191 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-hb27f" event={"ID":"7d75dbbd-a8bf-4250-a891-e8de80593ef2","Type":"ContainerDied","Data":"d8accff843352756205b6562f6a48140332f0f78f391ce21454b3ddf52ce4b4f"} Feb 19 23:11:59 crc 
kubenswrapper[4771]: I0219 23:11:59.109450 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-hb27f" event={"ID":"7d75dbbd-a8bf-4250-a891-e8de80593ef2","Type":"ContainerStarted","Data":"9e866796bcdf5ce5fd30652495dd37bafd83bc4ae45e8a85b691987888a82bb0"} Feb 19 23:11:59 crc kubenswrapper[4771]: I0219 23:11:59.111556 4771 generic.go:334] "Generic (PLEG): container finished" podID="2de27f6b-5621-4e18-81a6-85dbfdd89711" containerID="13bf8aa751b83f481f26eb4700a682da97ae702a76baedff00b9214f321ef45b" exitCode=0 Feb 19 23:11:59 crc kubenswrapper[4771]: I0219 23:11:59.112461 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5d53-account-create-update-kwgwp" event={"ID":"2de27f6b-5621-4e18-81a6-85dbfdd89711","Type":"ContainerDied","Data":"13bf8aa751b83f481f26eb4700a682da97ae702a76baedff00b9214f321ef45b"} Feb 19 23:11:59 crc kubenswrapper[4771]: I0219 23:11:59.112491 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5d53-account-create-update-kwgwp" event={"ID":"2de27f6b-5621-4e18-81a6-85dbfdd89711","Type":"ContainerStarted","Data":"2272e3d37a167129615d0f0ecb1b16264c9fff40952bd4add2f4c527fb70d269"} Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.639493 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-hb27f" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.651589 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.679674 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6jbm\" (UniqueName: \"kubernetes.io/projected/7d75dbbd-a8bf-4250-a891-e8de80593ef2-kube-api-access-f6jbm\") pod \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\" (UID: \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\") " Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.679749 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de27f6b-5621-4e18-81a6-85dbfdd89711-operator-scripts\") pod \"2de27f6b-5621-4e18-81a6-85dbfdd89711\" (UID: \"2de27f6b-5621-4e18-81a6-85dbfdd89711\") " Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.679787 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smnth\" (UniqueName: \"kubernetes.io/projected/2de27f6b-5621-4e18-81a6-85dbfdd89711-kube-api-access-smnth\") pod \"2de27f6b-5621-4e18-81a6-85dbfdd89711\" (UID: \"2de27f6b-5621-4e18-81a6-85dbfdd89711\") " Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.679842 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d75dbbd-a8bf-4250-a891-e8de80593ef2-operator-scripts\") pod \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\" (UID: \"7d75dbbd-a8bf-4250-a891-e8de80593ef2\") " Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.681047 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d75dbbd-a8bf-4250-a891-e8de80593ef2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d75dbbd-a8bf-4250-a891-e8de80593ef2" (UID: "7d75dbbd-a8bf-4250-a891-e8de80593ef2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.683909 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de27f6b-5621-4e18-81a6-85dbfdd89711-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2de27f6b-5621-4e18-81a6-85dbfdd89711" (UID: "2de27f6b-5621-4e18-81a6-85dbfdd89711"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.708228 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d75dbbd-a8bf-4250-a891-e8de80593ef2-kube-api-access-f6jbm" (OuterVolumeSpecName: "kube-api-access-f6jbm") pod "7d75dbbd-a8bf-4250-a891-e8de80593ef2" (UID: "7d75dbbd-a8bf-4250-a891-e8de80593ef2"). InnerVolumeSpecName "kube-api-access-f6jbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.708299 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de27f6b-5621-4e18-81a6-85dbfdd89711-kube-api-access-smnth" (OuterVolumeSpecName: "kube-api-access-smnth") pod "2de27f6b-5621-4e18-81a6-85dbfdd89711" (UID: "2de27f6b-5621-4e18-81a6-85dbfdd89711"). InnerVolumeSpecName "kube-api-access-smnth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.782279 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6jbm\" (UniqueName: \"kubernetes.io/projected/7d75dbbd-a8bf-4250-a891-e8de80593ef2-kube-api-access-f6jbm\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.782791 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de27f6b-5621-4e18-81a6-85dbfdd89711-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.782854 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smnth\" (UniqueName: \"kubernetes.io/projected/2de27f6b-5621-4e18-81a6-85dbfdd89711-kube-api-access-smnth\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:00 crc kubenswrapper[4771]: I0219 23:12:00.782916 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d75dbbd-a8bf-4250-a891-e8de80593ef2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:01 crc kubenswrapper[4771]: I0219 23:12:01.206376 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-hb27f" Feb 19 23:12:01 crc kubenswrapper[4771]: I0219 23:12:01.207468 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-hb27f" event={"ID":"7d75dbbd-a8bf-4250-a891-e8de80593ef2","Type":"ContainerDied","Data":"9e866796bcdf5ce5fd30652495dd37bafd83bc4ae45e8a85b691987888a82bb0"} Feb 19 23:12:01 crc kubenswrapper[4771]: I0219 23:12:01.207526 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e866796bcdf5ce5fd30652495dd37bafd83bc4ae45e8a85b691987888a82bb0" Feb 19 23:12:01 crc kubenswrapper[4771]: I0219 23:12:01.210718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5d53-account-create-update-kwgwp" event={"ID":"2de27f6b-5621-4e18-81a6-85dbfdd89711","Type":"ContainerDied","Data":"2272e3d37a167129615d0f0ecb1b16264c9fff40952bd4add2f4c527fb70d269"} Feb 19 23:12:01 crc kubenswrapper[4771]: I0219 23:12:01.210748 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2272e3d37a167129615d0f0ecb1b16264c9fff40952bd4add2f4c527fb70d269" Feb 19 23:12:01 crc kubenswrapper[4771]: I0219 23:12:01.210813 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5d53-account-create-update-kwgwp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.800168 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-zshpp"] Feb 19 23:12:02 crc kubenswrapper[4771]: E0219 23:12:02.801266 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de27f6b-5621-4e18-81a6-85dbfdd89711" containerName="mariadb-account-create-update" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.801281 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de27f6b-5621-4e18-81a6-85dbfdd89711" containerName="mariadb-account-create-update" Feb 19 23:12:02 crc kubenswrapper[4771]: E0219 23:12:02.801302 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d75dbbd-a8bf-4250-a891-e8de80593ef2" containerName="mariadb-database-create" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.801308 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d75dbbd-a8bf-4250-a891-e8de80593ef2" containerName="mariadb-database-create" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.801521 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d75dbbd-a8bf-4250-a891-e8de80593ef2" containerName="mariadb-database-create" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.801534 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de27f6b-5621-4e18-81a6-85dbfdd89711" containerName="mariadb-account-create-update" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.802138 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.805978 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-mv6zh" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.806666 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.818712 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zshpp"] Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.829348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxhq\" (UniqueName: \"kubernetes.io/projected/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-kube-api-access-vpxhq\") pod \"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.829703 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-config-data\") pod \"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.829807 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-combined-ca-bundle\") pod \"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.932295 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxhq\" (UniqueName: \"kubernetes.io/projected/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-kube-api-access-vpxhq\") pod 
\"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.932389 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-config-data\") pod \"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.932417 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-combined-ca-bundle\") pod \"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.941534 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-combined-ca-bundle\") pod \"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.941673 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-config-data\") pod \"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:02 crc kubenswrapper[4771]: I0219 23:12:02.955590 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxhq\" (UniqueName: \"kubernetes.io/projected/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-kube-api-access-vpxhq\") pod \"heat-db-sync-zshpp\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:03 crc kubenswrapper[4771]: I0219 23:12:03.160128 
4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:03 crc kubenswrapper[4771]: W0219 23:12:03.626235 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a7b1685_c41d_48b5_94d8_5b4b87420bf2.slice/crio-8f70eb58de893520e293cfb8bad16ca227a301ee039a01f43fcfeb4afa787e2c WatchSource:0}: Error finding container 8f70eb58de893520e293cfb8bad16ca227a301ee039a01f43fcfeb4afa787e2c: Status 404 returned error can't find the container with id 8f70eb58de893520e293cfb8bad16ca227a301ee039a01f43fcfeb4afa787e2c Feb 19 23:12:03 crc kubenswrapper[4771]: I0219 23:12:03.628488 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zshpp"] Feb 19 23:12:04 crc kubenswrapper[4771]: I0219 23:12:04.249932 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zshpp" event={"ID":"0a7b1685-c41d-48b5-94d8-5b4b87420bf2","Type":"ContainerStarted","Data":"8f70eb58de893520e293cfb8bad16ca227a301ee039a01f43fcfeb4afa787e2c"} Feb 19 23:12:06 crc kubenswrapper[4771]: I0219 23:12:06.310831 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:12:06 crc kubenswrapper[4771]: I0219 23:12:06.311281 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:12:09 crc kubenswrapper[4771]: I0219 23:12:09.438893 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:12:09 crc kubenswrapper[4771]: E0219 23:12:09.439576 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:12:12 crc kubenswrapper[4771]: I0219 23:12:12.337180 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zshpp" event={"ID":"0a7b1685-c41d-48b5-94d8-5b4b87420bf2","Type":"ContainerStarted","Data":"859442df92b24a8dcf3f85864479d8775c41021804d905e2f731a7f22f9d2716"} Feb 19 23:12:12 crc kubenswrapper[4771]: I0219 23:12:12.381124 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-zshpp" podStartSLOduration=2.61838137 podStartE2EDuration="10.381098736s" podCreationTimestamp="2026-02-19 23:12:02 +0000 UTC" firstStartedPulling="2026-02-19 23:12:03.629534279 +0000 UTC m=+6223.900976759" lastFinishedPulling="2026-02-19 23:12:11.392251615 +0000 UTC m=+6231.663694125" observedRunningTime="2026-02-19 23:12:12.360989509 +0000 UTC m=+6232.632432019" watchObservedRunningTime="2026-02-19 23:12:12.381098736 +0000 UTC m=+6232.652541236" Feb 19 23:12:14 crc kubenswrapper[4771]: I0219 23:12:14.360407 4771 generic.go:334] "Generic (PLEG): container finished" podID="0a7b1685-c41d-48b5-94d8-5b4b87420bf2" containerID="859442df92b24a8dcf3f85864479d8775c41021804d905e2f731a7f22f9d2716" exitCode=0 Feb 19 23:12:14 crc kubenswrapper[4771]: I0219 23:12:14.360517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zshpp" event={"ID":"0a7b1685-c41d-48b5-94d8-5b4b87420bf2","Type":"ContainerDied","Data":"859442df92b24a8dcf3f85864479d8775c41021804d905e2f731a7f22f9d2716"} Feb 19 23:12:15 crc kubenswrapper[4771]: I0219 23:12:15.898511 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:15 crc kubenswrapper[4771]: I0219 23:12:15.971622 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-config-data\") pod \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " Feb 19 23:12:15 crc kubenswrapper[4771]: I0219 23:12:15.971765 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-combined-ca-bundle\") pod \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " Feb 19 23:12:15 crc kubenswrapper[4771]: I0219 23:12:15.971978 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpxhq\" (UniqueName: \"kubernetes.io/projected/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-kube-api-access-vpxhq\") pod \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\" (UID: \"0a7b1685-c41d-48b5-94d8-5b4b87420bf2\") " Feb 19 23:12:15 crc kubenswrapper[4771]: I0219 23:12:15.980366 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-kube-api-access-vpxhq" (OuterVolumeSpecName: "kube-api-access-vpxhq") pod "0a7b1685-c41d-48b5-94d8-5b4b87420bf2" (UID: "0a7b1685-c41d-48b5-94d8-5b4b87420bf2"). InnerVolumeSpecName "kube-api-access-vpxhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:12:16 crc kubenswrapper[4771]: I0219 23:12:16.026642 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a7b1685-c41d-48b5-94d8-5b4b87420bf2" (UID: "0a7b1685-c41d-48b5-94d8-5b4b87420bf2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:12:16 crc kubenswrapper[4771]: I0219 23:12:16.073776 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpxhq\" (UniqueName: \"kubernetes.io/projected/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-kube-api-access-vpxhq\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:16 crc kubenswrapper[4771]: I0219 23:12:16.073803 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:16 crc kubenswrapper[4771]: I0219 23:12:16.101474 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-config-data" (OuterVolumeSpecName: "config-data") pod "0a7b1685-c41d-48b5-94d8-5b4b87420bf2" (UID: "0a7b1685-c41d-48b5-94d8-5b4b87420bf2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:12:16 crc kubenswrapper[4771]: I0219 23:12:16.175502 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7b1685-c41d-48b5-94d8-5b4b87420bf2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:16 crc kubenswrapper[4771]: I0219 23:12:16.389914 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zshpp" event={"ID":"0a7b1685-c41d-48b5-94d8-5b4b87420bf2","Type":"ContainerDied","Data":"8f70eb58de893520e293cfb8bad16ca227a301ee039a01f43fcfeb4afa787e2c"} Feb 19 23:12:16 crc kubenswrapper[4771]: I0219 23:12:16.389957 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f70eb58de893520e293cfb8bad16ca227a301ee039a01f43fcfeb4afa787e2c" Feb 19 23:12:16 crc kubenswrapper[4771]: I0219 23:12:16.389982 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zshpp" Feb 19 23:12:16 crc kubenswrapper[4771]: E0219 23:12:16.663898 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a7b1685_c41d_48b5_94d8_5b4b87420bf2.slice\": RecentStats: unable to find data in memory cache]" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.535783 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-587498f77b-6skgf"] Feb 19 23:12:17 crc kubenswrapper[4771]: E0219 23:12:17.536875 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7b1685-c41d-48b5-94d8-5b4b87420bf2" containerName="heat-db-sync" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.536891 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7b1685-c41d-48b5-94d8-5b4b87420bf2" containerName="heat-db-sync" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.537169 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7b1685-c41d-48b5-94d8-5b4b87420bf2" containerName="heat-db-sync" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.540193 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.544833 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.544939 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.544953 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-mv6zh" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.568776 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-587498f77b-6skgf"] Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.613431 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-combined-ca-bundle\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.613492 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gkh\" (UniqueName: \"kubernetes.io/projected/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-kube-api-access-66gkh\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.613548 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data-custom\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 
crc kubenswrapper[4771]: I0219 23:12:17.613745 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.716632 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data-custom\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.716794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.716858 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-combined-ca-bundle\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.716884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gkh\" (UniqueName: \"kubernetes.io/projected/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-kube-api-access-66gkh\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: 
I0219 23:12:17.724745 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7c7555f98f-9tfdn"] Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.729107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.729378 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.731874 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-combined-ca-bundle\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.733642 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.746745 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data-custom\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.747590 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gkh\" (UniqueName: \"kubernetes.io/projected/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-kube-api-access-66gkh\") pod \"heat-engine-587498f77b-6skgf\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " 
pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.749081 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-8457f4bfb4-cck5v"] Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.750683 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.754611 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.757909 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-8457f4bfb4-cck5v"] Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.772612 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7c7555f98f-9tfdn"] Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.818924 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data-custom\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.818994 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-combined-ca-bundle\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.819048 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data\") pod 
\"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.819077 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data-custom\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.819101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-combined-ca-bundle\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.819162 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srwzv\" (UniqueName: \"kubernetes.io/projected/a094a30e-0984-49ce-a81c-04b26696af81-kube-api-access-srwzv\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.819184 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.819207 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cd96\" (UniqueName: 
\"kubernetes.io/projected/53816944-33b8-44bd-9eab-b55559eab459-kube-api-access-4cd96\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.864709 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.921688 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srwzv\" (UniqueName: \"kubernetes.io/projected/a094a30e-0984-49ce-a81c-04b26696af81-kube-api-access-srwzv\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.921736 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.921765 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cd96\" (UniqueName: \"kubernetes.io/projected/53816944-33b8-44bd-9eab-b55559eab459-kube-api-access-4cd96\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.921833 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data-custom\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc 
kubenswrapper[4771]: I0219 23:12:17.921876 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-combined-ca-bundle\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.921898 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.921926 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data-custom\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.921950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-combined-ca-bundle\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.928314 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.931695 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-combined-ca-bundle\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.934704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data-custom\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.938965 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data-custom\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.939890 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-combined-ca-bundle\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.941608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cd96\" (UniqueName: \"kubernetes.io/projected/53816944-33b8-44bd-9eab-b55559eab459-kube-api-access-4cd96\") pod \"heat-cfnapi-8457f4bfb4-cck5v\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") " pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.943898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-srwzv\" (UniqueName: \"kubernetes.io/projected/a094a30e-0984-49ce-a81c-04b26696af81-kube-api-access-srwzv\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:17 crc kubenswrapper[4771]: I0219 23:12:17.944126 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data\") pod \"heat-api-7c7555f98f-9tfdn\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") " pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:18 crc kubenswrapper[4771]: I0219 23:12:18.165976 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:18 crc kubenswrapper[4771]: I0219 23:12:18.168697 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:12:18 crc kubenswrapper[4771]: I0219 23:12:18.175590 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:18 crc kubenswrapper[4771]: I0219 23:12:18.362462 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-587498f77b-6skgf"] Feb 19 23:12:18 crc kubenswrapper[4771]: I0219 23:12:18.420068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-587498f77b-6skgf" event={"ID":"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486","Type":"ContainerStarted","Data":"04553c9af8e55b234021971e19d0a62e5426880c3a554bf40947c813c89e0655"} Feb 19 23:12:18 crc kubenswrapper[4771]: I0219 23:12:18.652126 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-8457f4bfb4-cck5v"] Feb 19 23:12:18 crc kubenswrapper[4771]: I0219 23:12:18.753984 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7c7555f98f-9tfdn"] Feb 19 23:12:18 crc kubenswrapper[4771]: W0219 23:12:18.766333 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda094a30e_0984_49ce_a81c_04b26696af81.slice/crio-2fc3a62554b7522755c979d1818f2db4ed07309c898a5488154ab2dcb562e369 WatchSource:0}: Error finding container 2fc3a62554b7522755c979d1818f2db4ed07309c898a5488154ab2dcb562e369: Status 404 returned error can't find the container with id 2fc3a62554b7522755c979d1818f2db4ed07309c898a5488154ab2dcb562e369 Feb 19 23:12:19 crc kubenswrapper[4771]: I0219 23:12:19.431039 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" event={"ID":"53816944-33b8-44bd-9eab-b55559eab459","Type":"ContainerStarted","Data":"78226749652cf91a2d041898eddc10108d22b06ae097251cac88eb4aba0d29e4"} Feb 19 23:12:19 crc kubenswrapper[4771]: I0219 23:12:19.432471 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c7555f98f-9tfdn" 
event={"ID":"a094a30e-0984-49ce-a81c-04b26696af81","Type":"ContainerStarted","Data":"2fc3a62554b7522755c979d1818f2db4ed07309c898a5488154ab2dcb562e369"} Feb 19 23:12:19 crc kubenswrapper[4771]: I0219 23:12:19.440245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-587498f77b-6skgf" event={"ID":"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486","Type":"ContainerStarted","Data":"661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe"} Feb 19 23:12:19 crc kubenswrapper[4771]: I0219 23:12:19.463903 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-587498f77b-6skgf" podStartSLOduration=2.463882854 podStartE2EDuration="2.463882854s" podCreationTimestamp="2026-02-19 23:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:12:19.456176449 +0000 UTC m=+6239.727618939" watchObservedRunningTime="2026-02-19 23:12:19.463882854 +0000 UTC m=+6239.735325324" Feb 19 23:12:19 crc kubenswrapper[4771]: I0219 23:12:19.975288 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-66b7bf74db-l2vb5" Feb 19 23:12:20 crc kubenswrapper[4771]: I0219 23:12:20.038728 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f449ffc58-sz7vv"] Feb 19 23:12:20 crc kubenswrapper[4771]: I0219 23:12:20.038997 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f449ffc58-sz7vv" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon-log" containerID="cri-o://477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9" gracePeriod=30 Feb 19 23:12:20 crc kubenswrapper[4771]: I0219 23:12:20.039126 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f449ffc58-sz7vv" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon" 
containerID="cri-o://3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea" gracePeriod=30 Feb 19 23:12:20 crc kubenswrapper[4771]: I0219 23:12:20.446913 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:12:20 crc kubenswrapper[4771]: E0219 23:12:20.447595 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:12:20 crc kubenswrapper[4771]: I0219 23:12:20.457911 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:21 crc kubenswrapper[4771]: I0219 23:12:21.458367 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" event={"ID":"53816944-33b8-44bd-9eab-b55559eab459","Type":"ContainerStarted","Data":"23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc"} Feb 19 23:12:21 crc kubenswrapper[4771]: I0219 23:12:21.459109 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" Feb 19 23:12:21 crc kubenswrapper[4771]: I0219 23:12:21.462255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c7555f98f-9tfdn" event={"ID":"a094a30e-0984-49ce-a81c-04b26696af81","Type":"ContainerStarted","Data":"3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75"} Feb 19 23:12:21 crc kubenswrapper[4771]: I0219 23:12:21.462288 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7c7555f98f-9tfdn" Feb 19 23:12:21 crc kubenswrapper[4771]: I0219 23:12:21.480274 4771 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" podStartSLOduration=1.9834179760000001 podStartE2EDuration="4.480258659s" podCreationTimestamp="2026-02-19 23:12:17 +0000 UTC" firstStartedPulling="2026-02-19 23:12:18.659969533 +0000 UTC m=+6238.931412003" lastFinishedPulling="2026-02-19 23:12:21.156810216 +0000 UTC m=+6241.428252686" observedRunningTime="2026-02-19 23:12:21.472591194 +0000 UTC m=+6241.744033674" watchObservedRunningTime="2026-02-19 23:12:21.480258659 +0000 UTC m=+6241.751701129" Feb 19 23:12:21 crc kubenswrapper[4771]: I0219 23:12:21.493034 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7c7555f98f-9tfdn" podStartSLOduration=2.102306047 podStartE2EDuration="4.493001879s" podCreationTimestamp="2026-02-19 23:12:17 +0000 UTC" firstStartedPulling="2026-02-19 23:12:18.768109707 +0000 UTC m=+6239.039552167" lastFinishedPulling="2026-02-19 23:12:21.158805539 +0000 UTC m=+6241.430247999" observedRunningTime="2026-02-19 23:12:21.485947771 +0000 UTC m=+6241.757390251" watchObservedRunningTime="2026-02-19 23:12:21.493001879 +0000 UTC m=+6241.764444349" Feb 19 23:12:23 crc kubenswrapper[4771]: I0219 23:12:23.215472 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f449ffc58-sz7vv" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.128:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:33594->10.217.1.128:8443: read: connection reset by peer" Feb 19 23:12:23 crc kubenswrapper[4771]: I0219 23:12:23.484603 4771 generic.go:334] "Generic (PLEG): container finished" podID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerID="3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea" exitCode=0 Feb 19 23:12:23 crc kubenswrapper[4771]: I0219 23:12:23.484684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-f449ffc58-sz7vv" event={"ID":"46d25b3d-46b9-4605-8c5d-4fe736c63322","Type":"ContainerDied","Data":"3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea"} Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.522340 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-f88dbf5b-scj6q"] Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.524340 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.540277 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-79b99f8cb9-fzb84"] Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.541638 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.569456 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6978b8f7d9-tc6qt"] Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.571184 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-f88dbf5b-scj6q"] Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.571295 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-79b99f8cb9-fzb84"] Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.571298 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.581737 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6978b8f7d9-tc6qt"] Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.627574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hptr2\" (UniqueName: \"kubernetes.io/projected/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-kube-api-access-hptr2\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.627609 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.627637 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-combined-ca-bundle\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.627653 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-combined-ca-bundle\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.627674 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.627802 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmhnr\" (UniqueName: \"kubernetes.io/projected/9648f306-42b2-4e90-8ac4-2ac86f716a38-kube-api-access-hmhnr\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.628007 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data-custom\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.628079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-config-data\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.628204 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-combined-ca-bundle\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.628239 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data-custom\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.628425 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mf9f\" (UniqueName: \"kubernetes.io/projected/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-kube-api-access-9mf9f\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.628483 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-config-data-custom\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.729791 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hptr2\" (UniqueName: \"kubernetes.io/projected/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-kube-api-access-hptr2\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.729836 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: 
I0219 23:12:25.729866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-combined-ca-bundle\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.729884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-combined-ca-bundle\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.729907 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.729928 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmhnr\" (UniqueName: \"kubernetes.io/projected/9648f306-42b2-4e90-8ac4-2ac86f716a38-kube-api-access-hmhnr\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.729967 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data-custom\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.729985 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-config-data\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.730034 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-combined-ca-bundle\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.730055 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data-custom\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.730089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mf9f\" (UniqueName: \"kubernetes.io/projected/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-kube-api-access-9mf9f\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.730109 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-config-data-custom\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.738095 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-combined-ca-bundle\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.738855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data-custom\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.739323 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-config-data-custom\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.739730 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.740765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-combined-ca-bundle\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.742073 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-config-data\") pod \"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.742627 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-combined-ca-bundle\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.744228 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data-custom\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.745913 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.748890 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hptr2\" (UniqueName: \"kubernetes.io/projected/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-kube-api-access-hptr2\") pod \"heat-cfnapi-6978b8f7d9-tc6qt\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") " pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.754208 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mf9f\" (UniqueName: \"kubernetes.io/projected/1eaa80a7-4011-400a-b29b-d7f76e69ecaa-kube-api-access-9mf9f\") pod 
\"heat-engine-f88dbf5b-scj6q\" (UID: \"1eaa80a7-4011-400a-b29b-d7f76e69ecaa\") " pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.757592 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmhnr\" (UniqueName: \"kubernetes.io/projected/9648f306-42b2-4e90-8ac4-2ac86f716a38-kube-api-access-hmhnr\") pod \"heat-api-79b99f8cb9-fzb84\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") " pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.903836 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.916013 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79b99f8cb9-fzb84" Feb 19 23:12:25 crc kubenswrapper[4771]: I0219 23:12:25.930140 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.422925 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-f88dbf5b-scj6q"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.518409 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-f88dbf5b-scj6q" event={"ID":"1eaa80a7-4011-400a-b29b-d7f76e69ecaa","Type":"ContainerStarted","Data":"13895fc80b64d3c8c01a2ba639acf55c34a14d0ba7681b2482095e8f580d79db"} Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.546344 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7c7555f98f-9tfdn"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.549471 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7c7555f98f-9tfdn" podUID="a094a30e-0984-49ce-a81c-04b26696af81" containerName="heat-api" 
containerID="cri-o://3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75" gracePeriod=60 Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.571859 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-8457f4bfb4-cck5v"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.572143 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" podUID="53816944-33b8-44bd-9eab-b55559eab459" containerName="heat-cfnapi" containerID="cri-o://23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc" gracePeriod=60 Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.603243 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" podUID="53816944-33b8-44bd-9eab-b55559eab459" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.139:8000/healthcheck\": EOF" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.630113 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-79b99f8cb9-fzb84"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.646927 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6d5f47dcf5-c4wds"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.651210 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.656123 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.656375 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.705279 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-659b48b6d7-rrwt7"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.706820 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.715848 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.716050 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6978b8f7d9-tc6qt"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.716367 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.727209 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6d5f47dcf5-c4wds"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.753536 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-659b48b6d7-rrwt7"] Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.799806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-combined-ca-bundle\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 
19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.799863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-config-data\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.799890 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-public-tls-certs\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.799908 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-config-data-custom\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.799929 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-config-data-custom\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.799947 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-internal-tls-certs\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " 
pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.799975 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4q9f\" (UniqueName: \"kubernetes.io/projected/67720512-861d-4b44-b21f-78a3ed2ec49b-kube-api-access-q4q9f\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.800035 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7hn5\" (UniqueName: \"kubernetes.io/projected/26834139-7112-45a0-bd80-ab038140ff2e-kube-api-access-d7hn5\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.800055 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-public-tls-certs\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.800073 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-combined-ca-bundle\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.800094 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-internal-tls-certs\") pod 
\"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.800162 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-config-data\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901452 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-combined-ca-bundle\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901501 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-config-data\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-public-tls-certs\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901546 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-config-data-custom\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: 
\"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901563 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-config-data-custom\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901580 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-internal-tls-certs\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901607 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4q9f\" (UniqueName: \"kubernetes.io/projected/67720512-861d-4b44-b21f-78a3ed2ec49b-kube-api-access-q4q9f\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901638 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7hn5\" (UniqueName: \"kubernetes.io/projected/26834139-7112-45a0-bd80-ab038140ff2e-kube-api-access-d7hn5\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901653 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-public-tls-certs\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: 
\"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901674 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-combined-ca-bundle\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-internal-tls-certs\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.901753 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-config-data\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.912963 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-public-tls-certs\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.914838 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-combined-ca-bundle\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " 
pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.915449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-internal-tls-certs\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.916947 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-public-tls-certs\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.918063 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-config-data\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.925454 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-combined-ca-bundle\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.926047 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-config-data-custom\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.926868 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-config-data\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.935546 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67720512-861d-4b44-b21f-78a3ed2ec49b-internal-tls-certs\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.937792 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26834139-7112-45a0-bd80-ab038140ff2e-config-data-custom\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.971818 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7hn5\" (UniqueName: \"kubernetes.io/projected/26834139-7112-45a0-bd80-ab038140ff2e-kube-api-access-d7hn5\") pod \"heat-api-6d5f47dcf5-c4wds\" (UID: \"26834139-7112-45a0-bd80-ab038140ff2e\") " pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:26 crc kubenswrapper[4771]: I0219 23:12:26.978752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4q9f\" (UniqueName: \"kubernetes.io/projected/67720512-861d-4b44-b21f-78a3ed2ec49b-kube-api-access-q4q9f\") pod \"heat-cfnapi-659b48b6d7-rrwt7\" (UID: \"67720512-861d-4b44-b21f-78a3ed2ec49b\") " pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.030549 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6d5f47dcf5-c4wds" Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.196489 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.517724 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6d5f47dcf5-c4wds"] Feb 19 23:12:27 crc kubenswrapper[4771]: W0219 23:12:27.521167 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26834139_7112_45a0_bd80_ab038140ff2e.slice/crio-26a22475b48529695e9cdea5a95c636eab3fdd579714d69bc45ff786b9b87234 WatchSource:0}: Error finding container 26a22475b48529695e9cdea5a95c636eab3fdd579714d69bc45ff786b9b87234: Status 404 returned error can't find the container with id 26a22475b48529695e9cdea5a95c636eab3fdd579714d69bc45ff786b9b87234 Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.538708 4771 generic.go:334] "Generic (PLEG): container finished" podID="9648f306-42b2-4e90-8ac4-2ac86f716a38" containerID="2deba4ab7a4b51959ff29756d8c2bfeefb8f318778e1615cfb273212bb8a3991" exitCode=1 Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.538947 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79b99f8cb9-fzb84" event={"ID":"9648f306-42b2-4e90-8ac4-2ac86f716a38","Type":"ContainerDied","Data":"2deba4ab7a4b51959ff29756d8c2bfeefb8f318778e1615cfb273212bb8a3991"} Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.538972 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79b99f8cb9-fzb84" event={"ID":"9648f306-42b2-4e90-8ac4-2ac86f716a38","Type":"ContainerStarted","Data":"270ba89324a658614feb54559793d395caf7f2bb2e257fc8885c07ecfdf9a0d3"} Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.539561 4771 scope.go:117] "RemoveContainer" 
containerID="2deba4ab7a4b51959ff29756d8c2bfeefb8f318778e1615cfb273212bb8a3991" Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.546382 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-f88dbf5b-scj6q" event={"ID":"1eaa80a7-4011-400a-b29b-d7f76e69ecaa","Type":"ContainerStarted","Data":"ed0e177ae1d4c02f769f1297a9d9796d513285f189fde011a91fa5d9df30159b"} Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.547422 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.550493 4771 generic.go:334] "Generic (PLEG): container finished" podID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" containerID="c94662ae4c9b6adca8ab93e1fce08655ec3df0f697935e0e1bdb16f3d2529918" exitCode=1 Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.550537 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" event={"ID":"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d","Type":"ContainerDied","Data":"c94662ae4c9b6adca8ab93e1fce08655ec3df0f697935e0e1bdb16f3d2529918"} Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.550564 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" event={"ID":"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d","Type":"ContainerStarted","Data":"c7c7a7c208a5c1a697b1964964cf97ffe7a1e6e5184921a2f8591d261f1396f3"} Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.551417 4771 scope.go:117] "RemoveContainer" containerID="c94662ae4c9b6adca8ab93e1fce08655ec3df0f697935e0e1bdb16f3d2529918" Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.630766 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-f88dbf5b-scj6q" podStartSLOduration=2.630748614 podStartE2EDuration="2.630748614s" podCreationTimestamp="2026-02-19 23:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:12:27.623635024 +0000 UTC m=+6247.895077494" watchObservedRunningTime="2026-02-19 23:12:27.630748614 +0000 UTC m=+6247.902191084"
Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.741948 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-659b48b6d7-rrwt7"]
Feb 19 23:12:27 crc kubenswrapper[4771]: I0219 23:12:27.982070 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7c7555f98f-9tfdn"
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.560697 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" event={"ID":"67720512-861d-4b44-b21f-78a3ed2ec49b","Type":"ContainerStarted","Data":"b856f1bd2d4a042ad5e021b16b398964ed75868074686d94c5a89f1a1c4126d6"}
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.560748 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" event={"ID":"67720512-861d-4b44-b21f-78a3ed2ec49b","Type":"ContainerStarted","Data":"2bb86077b3104a7d54f2ba57037b07283369b70f113d1bfe3dcdf236fa21dfff"}
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.560863 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-659b48b6d7-rrwt7"
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.562796 4771 generic.go:334] "Generic (PLEG): container finished" podID="9648f306-42b2-4e90-8ac4-2ac86f716a38" containerID="b0109a6144bca47fe54906e9a4d587fd9d07ec0378da3b7348d390ed9909d985" exitCode=1
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.562931 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79b99f8cb9-fzb84" event={"ID":"9648f306-42b2-4e90-8ac4-2ac86f716a38","Type":"ContainerDied","Data":"b0109a6144bca47fe54906e9a4d587fd9d07ec0378da3b7348d390ed9909d985"}
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.563085 4771 scope.go:117] "RemoveContainer" containerID="2deba4ab7a4b51959ff29756d8c2bfeefb8f318778e1615cfb273212bb8a3991"
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.563682 4771 scope.go:117] "RemoveContainer" containerID="b0109a6144bca47fe54906e9a4d587fd9d07ec0378da3b7348d390ed9909d985"
Feb 19 23:12:28 crc kubenswrapper[4771]: E0219 23:12:28.563993 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-79b99f8cb9-fzb84_openstack(9648f306-42b2-4e90-8ac4-2ac86f716a38)\"" pod="openstack/heat-api-79b99f8cb9-fzb84" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38"
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.580471 4771 generic.go:334] "Generic (PLEG): container finished" podID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" containerID="0e66531d293114ecaba6693009344d5a2af40c444dccefe8e59241fb749310ed" exitCode=1
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.580585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" event={"ID":"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d","Type":"ContainerDied","Data":"0e66531d293114ecaba6693009344d5a2af40c444dccefe8e59241fb749310ed"}
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.581850 4771 scope.go:117] "RemoveContainer" containerID="0e66531d293114ecaba6693009344d5a2af40c444dccefe8e59241fb749310ed"
Feb 19 23:12:28 crc kubenswrapper[4771]: E0219 23:12:28.582965 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6978b8f7d9-tc6qt_openstack(b48ba9f8-0c7e-49ab-841d-0cb838f3e24d)\"" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d"
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.583393 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/heat-api-6d5f47dcf5-c4wds" event={"ID":"26834139-7112-45a0-bd80-ab038140ff2e","Type":"ContainerStarted","Data":"5a02871a36ca9475d83a66462dd529cd1fe29ec259aa97319da4751d5bb6c2bb"}
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.583430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d5f47dcf5-c4wds" event={"ID":"26834139-7112-45a0-bd80-ab038140ff2e","Type":"ContainerStarted","Data":"26a22475b48529695e9cdea5a95c636eab3fdd579714d69bc45ff786b9b87234"}
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.583787 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6d5f47dcf5-c4wds"
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.616450 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-659b48b6d7-rrwt7" podStartSLOduration=2.616425601 podStartE2EDuration="2.616425601s" podCreationTimestamp="2026-02-19 23:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:12:28.589436842 +0000 UTC m=+6248.860879322" watchObservedRunningTime="2026-02-19 23:12:28.616425601 +0000 UTC m=+6248.887868061"
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.650458 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6d5f47dcf5-c4wds" podStartSLOduration=2.6504420680000003 podStartE2EDuration="2.650442068s" podCreationTimestamp="2026-02-19 23:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:12:28.626841849 +0000 UTC m=+6248.898284349" watchObservedRunningTime="2026-02-19 23:12:28.650442068 +0000 UTC m=+6248.921884538"
Feb 19 23:12:28 crc kubenswrapper[4771]: I0219 23:12:28.653395 4771 scope.go:117] "RemoveContainer" containerID="c94662ae4c9b6adca8ab93e1fce08655ec3df0f697935e0e1bdb16f3d2529918"
Feb 19 23:12:29 crc kubenswrapper[4771]: I0219 23:12:29.594986 4771 scope.go:117] "RemoveContainer" containerID="b0109a6144bca47fe54906e9a4d587fd9d07ec0378da3b7348d390ed9909d985"
Feb 19 23:12:29 crc kubenswrapper[4771]: E0219 23:12:29.595761 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-79b99f8cb9-fzb84_openstack(9648f306-42b2-4e90-8ac4-2ac86f716a38)\"" pod="openstack/heat-api-79b99f8cb9-fzb84" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38"
Feb 19 23:12:29 crc kubenswrapper[4771]: I0219 23:12:29.595808 4771 scope.go:117] "RemoveContainer" containerID="0e66531d293114ecaba6693009344d5a2af40c444dccefe8e59241fb749310ed"
Feb 19 23:12:29 crc kubenswrapper[4771]: E0219 23:12:29.596720 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6978b8f7d9-tc6qt_openstack(b48ba9f8-0c7e-49ab-841d-0cb838f3e24d)\"" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d"
Feb 19 23:12:30 crc kubenswrapper[4771]: I0219 23:12:30.916605 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-79b99f8cb9-fzb84"
Feb 19 23:12:30 crc kubenswrapper[4771]: I0219 23:12:30.916717 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-79b99f8cb9-fzb84"
Feb 19 23:12:30 crc kubenswrapper[4771]: I0219 23:12:30.917975 4771 scope.go:117] "RemoveContainer" containerID="b0109a6144bca47fe54906e9a4d587fd9d07ec0378da3b7348d390ed9909d985"
Feb 19 23:12:30 crc kubenswrapper[4771]: E0219 23:12:30.918550 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with
CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-79b99f8cb9-fzb84_openstack(9648f306-42b2-4e90-8ac4-2ac86f716a38)\"" pod="openstack/heat-api-79b99f8cb9-fzb84" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38"
Feb 19 23:12:30 crc kubenswrapper[4771]: I0219 23:12:30.930303 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt"
Feb 19 23:12:30 crc kubenswrapper[4771]: I0219 23:12:30.930377 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt"
Feb 19 23:12:30 crc kubenswrapper[4771]: I0219 23:12:30.931562 4771 scope.go:117] "RemoveContainer" containerID="0e66531d293114ecaba6693009344d5a2af40c444dccefe8e59241fb749310ed"
Feb 19 23:12:30 crc kubenswrapper[4771]: E0219 23:12:30.932119 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6978b8f7d9-tc6qt_openstack(b48ba9f8-0c7e-49ab-841d-0cb838f3e24d)\"" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d"
Feb 19 23:12:31 crc kubenswrapper[4771]: I0219 23:12:31.435491 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f449ffc58-sz7vv" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.128:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.128:8443: connect: connection refused"
Feb 19 23:12:31 crc kubenswrapper[4771]: I0219 23:12:31.977774 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" podUID="53816944-33b8-44bd-9eab-b55559eab459" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.139:8000/healthcheck\": read tcp 10.217.0.2:35904->10.217.1.139:8000: read: connection reset by peer"
Feb 19 23:12:31 crc kubenswrapper[4771]: I0219 23:12:31.980726 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7c7555f98f-9tfdn" podUID="a094a30e-0984-49ce-a81c-04b26696af81" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.138:8004/healthcheck\": read tcp 10.217.0.2:47668->10.217.1.138:8004: read: connection reset by peer"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.443964 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277"
Feb 19 23:12:32 crc kubenswrapper[4771]: E0219 23:12:32.449473 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.562708 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c7555f98f-9tfdn"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.571383 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.626335 4771 generic.go:334] "Generic (PLEG): container finished" podID="53816944-33b8-44bd-9eab-b55559eab459" containerID="23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc" exitCode=0
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.626383 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" event={"ID":"53816944-33b8-44bd-9eab-b55559eab459","Type":"ContainerDied","Data":"23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc"}
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.626420 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.626439 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8457f4bfb4-cck5v" event={"ID":"53816944-33b8-44bd-9eab-b55559eab459","Type":"ContainerDied","Data":"78226749652cf91a2d041898eddc10108d22b06ae097251cac88eb4aba0d29e4"}
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.626471 4771 scope.go:117] "RemoveContainer" containerID="23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.629127 4771 generic.go:334] "Generic (PLEG): container finished" podID="a094a30e-0984-49ce-a81c-04b26696af81" containerID="3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75" exitCode=0
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.629163 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c7555f98f-9tfdn" event={"ID":"a094a30e-0984-49ce-a81c-04b26696af81","Type":"ContainerDied","Data":"3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75"}
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.629183 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c7555f98f-9tfdn" event={"ID":"a094a30e-0984-49ce-a81c-04b26696af81","Type":"ContainerDied","Data":"2fc3a62554b7522755c979d1818f2db4ed07309c898a5488154ab2dcb562e369"}
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.629220 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c7555f98f-9tfdn"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.658296 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data-custom\") pod \"53816944-33b8-44bd-9eab-b55559eab459\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") "
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.658436 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data\") pod \"a094a30e-0984-49ce-a81c-04b26696af81\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") "
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.658517 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-combined-ca-bundle\") pod \"53816944-33b8-44bd-9eab-b55559eab459\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") "
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.658552 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srwzv\" (UniqueName: \"kubernetes.io/projected/a094a30e-0984-49ce-a81c-04b26696af81-kube-api-access-srwzv\") pod \"a094a30e-0984-49ce-a81c-04b26696af81\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") "
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.658604 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName:
\"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data-custom\") pod \"a094a30e-0984-49ce-a81c-04b26696af81\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") "
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.658661 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data\") pod \"53816944-33b8-44bd-9eab-b55559eab459\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") "
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.658738 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-combined-ca-bundle\") pod \"a094a30e-0984-49ce-a81c-04b26696af81\" (UID: \"a094a30e-0984-49ce-a81c-04b26696af81\") "
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.658835 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cd96\" (UniqueName: \"kubernetes.io/projected/53816944-33b8-44bd-9eab-b55559eab459-kube-api-access-4cd96\") pod \"53816944-33b8-44bd-9eab-b55559eab459\" (UID: \"53816944-33b8-44bd-9eab-b55559eab459\") "
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.663257 4771 scope.go:117] "RemoveContainer" containerID="23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc"
Feb 19 23:12:32 crc kubenswrapper[4771]: E0219 23:12:32.663962 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc\": container with ID starting with 23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc not found: ID does not exist" containerID="23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.664010 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc"} err="failed to get container status \"23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc\": rpc error: code = NotFound desc = could not find container \"23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc\": container with ID starting with 23e28f164f0481cdfdda7c5d2d218e53da4789865cd48c9448d7e44f273d27cc not found: ID does not exist"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.664064 4771 scope.go:117] "RemoveContainer" containerID="3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.667170 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "53816944-33b8-44bd-9eab-b55559eab459" (UID: "53816944-33b8-44bd-9eab-b55559eab459"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.671516 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a094a30e-0984-49ce-a81c-04b26696af81-kube-api-access-srwzv" (OuterVolumeSpecName: "kube-api-access-srwzv") pod "a094a30e-0984-49ce-a81c-04b26696af81" (UID: "a094a30e-0984-49ce-a81c-04b26696af81"). InnerVolumeSpecName "kube-api-access-srwzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.672145 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a094a30e-0984-49ce-a81c-04b26696af81" (UID: "a094a30e-0984-49ce-a81c-04b26696af81"). InnerVolumeSpecName "config-data-custom".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.672442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53816944-33b8-44bd-9eab-b55559eab459-kube-api-access-4cd96" (OuterVolumeSpecName: "kube-api-access-4cd96") pod "53816944-33b8-44bd-9eab-b55559eab459" (UID: "53816944-33b8-44bd-9eab-b55559eab459"). InnerVolumeSpecName "kube-api-access-4cd96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.696485 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53816944-33b8-44bd-9eab-b55559eab459" (UID: "53816944-33b8-44bd-9eab-b55559eab459"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.719126 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a094a30e-0984-49ce-a81c-04b26696af81" (UID: "a094a30e-0984-49ce-a81c-04b26696af81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.740797 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data" (OuterVolumeSpecName: "config-data") pod "53816944-33b8-44bd-9eab-b55559eab459" (UID: "53816944-33b8-44bd-9eab-b55559eab459"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.741778 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data" (OuterVolumeSpecName: "config-data") pod "a094a30e-0984-49ce-a81c-04b26696af81" (UID: "a094a30e-0984-49ce-a81c-04b26696af81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.754372 4771 scope.go:117] "RemoveContainer" containerID="3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75"
Feb 19 23:12:32 crc kubenswrapper[4771]: E0219 23:12:32.754824 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75\": container with ID starting with 3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75 not found: ID does not exist" containerID="3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.754855 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75"} err="failed to get container status \"3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75\": rpc error: code = NotFound desc = could not find container \"3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75\": container with ID starting with 3b615936bdc1341a29dc6753b1844facb1ba68f5765962c4c8122febb8fbaf75 not found: ID does not exist"
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.761263 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19
23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.761283 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cd96\" (UniqueName: \"kubernetes.io/projected/53816944-33b8-44bd-9eab-b55559eab459-kube-api-access-4cd96\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.761292 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.761303 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.761311 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.761320 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srwzv\" (UniqueName: \"kubernetes.io/projected/a094a30e-0984-49ce-a81c-04b26696af81-kube-api-access-srwzv\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.761328 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a094a30e-0984-49ce-a81c-04b26696af81-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:32 crc kubenswrapper[4771]: I0219 23:12:32.761335 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53816944-33b8-44bd-9eab-b55559eab459-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:33 crc kubenswrapper[4771]: I0219 23:12:33.002327 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-8457f4bfb4-cck5v"]
Feb 19 23:12:33 crc kubenswrapper[4771]: I0219 23:12:33.022555 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-8457f4bfb4-cck5v"]
Feb 19 23:12:33 crc kubenswrapper[4771]: I0219 23:12:33.035949 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7c7555f98f-9tfdn"]
Feb 19 23:12:33 crc kubenswrapper[4771]: I0219 23:12:33.047918 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7c7555f98f-9tfdn"]
Feb 19 23:12:34 crc kubenswrapper[4771]: I0219 23:12:34.450942 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53816944-33b8-44bd-9eab-b55559eab459" path="/var/lib/kubelet/pods/53816944-33b8-44bd-9eab-b55559eab459/volumes"
Feb 19 23:12:34 crc kubenswrapper[4771]: I0219 23:12:34.452206 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a094a30e-0984-49ce-a81c-04b26696af81" path="/var/lib/kubelet/pods/a094a30e-0984-49ce-a81c-04b26696af81/volumes"
Feb 19 23:12:37 crc kubenswrapper[4771]: I0219 23:12:37.024114 4771 scope.go:117] "RemoveContainer" containerID="5ed32cd77ab23cb02780e49be7657b442d9bf822d05d7d1a501ba6a7b0c888ad"
Feb 19 23:12:37 crc kubenswrapper[4771]: I0219 23:12:37.082611 4771 scope.go:117] "RemoveContainer" containerID="5d7cdd7a93d149e393813df4dbcc8ff10ecd9b2dde28e26d870319c355a60911"
Feb 19 23:12:37 crc kubenswrapper[4771]: I0219 23:12:37.122267 4771 scope.go:117] "RemoveContainer" containerID="0e3f909bb24d5c1718653a43e2504ffcec274d377440092ff761f9985d7387c4"
Feb 19 23:12:37 crc kubenswrapper[4771]: I0219 23:12:37.916643 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-587498f77b-6skgf"
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.256156 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6d5f47dcf5-c4wds"
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219
23:12:38.349293 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-79b99f8cb9-fzb84"]
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.419855 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-659b48b6d7-rrwt7"
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.529595 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6978b8f7d9-tc6qt"]
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.708937 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79b99f8cb9-fzb84" event={"ID":"9648f306-42b2-4e90-8ac4-2ac86f716a38","Type":"ContainerDied","Data":"270ba89324a658614feb54559793d395caf7f2bb2e257fc8885c07ecfdf9a0d3"}
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.709205 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="270ba89324a658614feb54559793d395caf7f2bb2e257fc8885c07ecfdf9a0d3"
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.766272 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79b99f8cb9-fzb84"
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.910012 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt"
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.925801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmhnr\" (UniqueName: \"kubernetes.io/projected/9648f306-42b2-4e90-8ac4-2ac86f716a38-kube-api-access-hmhnr\") pod \"9648f306-42b2-4e90-8ac4-2ac86f716a38\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") "
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.925875 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data-custom\") pod \"9648f306-42b2-4e90-8ac4-2ac86f716a38\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") "
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.925940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data\") pod \"9648f306-42b2-4e90-8ac4-2ac86f716a38\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") "
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.926012 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-combined-ca-bundle\") pod \"9648f306-42b2-4e90-8ac4-2ac86f716a38\" (UID: \"9648f306-42b2-4e90-8ac4-2ac86f716a38\") "
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.934404 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9648f306-42b2-4e90-8ac4-2ac86f716a38-kube-api-access-hmhnr" (OuterVolumeSpecName: "kube-api-access-hmhnr") pod "9648f306-42b2-4e90-8ac4-2ac86f716a38" (UID: "9648f306-42b2-4e90-8ac4-2ac86f716a38"). InnerVolumeSpecName "kube-api-access-hmhnr".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.934876 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9648f306-42b2-4e90-8ac4-2ac86f716a38" (UID: "9648f306-42b2-4e90-8ac4-2ac86f716a38"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:38 crc kubenswrapper[4771]: I0219 23:12:38.998513 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9648f306-42b2-4e90-8ac4-2ac86f716a38" (UID: "9648f306-42b2-4e90-8ac4-2ac86f716a38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.007070 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data" (OuterVolumeSpecName: "config-data") pod "9648f306-42b2-4e90-8ac4-2ac86f716a38" (UID: "9648f306-42b2-4e90-8ac4-2ac86f716a38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.030842 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hptr2\" (UniqueName: \"kubernetes.io/projected/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-kube-api-access-hptr2\") pod \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") "
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.030912 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data\") pod \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") "
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.031066 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data-custom\") pod \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") "
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.031157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-combined-ca-bundle\") pod \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\" (UID: \"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d\") "
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.031588 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmhnr\" (UniqueName: \"kubernetes.io/projected/9648f306-42b2-4e90-8ac4-2ac86f716a38-kube-api-access-hmhnr\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.031605 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName:
\"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.031614 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.031622 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9648f306-42b2-4e90-8ac4-2ac86f716a38-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.033892 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-kube-api-access-hptr2" (OuterVolumeSpecName: "kube-api-access-hptr2") pod "b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" (UID: "b48ba9f8-0c7e-49ab-841d-0cb838f3e24d"). InnerVolumeSpecName "kube-api-access-hptr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.034014 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" (UID: "b48ba9f8-0c7e-49ab-841d-0cb838f3e24d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.065992 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" (UID: "b48ba9f8-0c7e-49ab-841d-0cb838f3e24d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.092968 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data" (OuterVolumeSpecName: "config-data") pod "b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" (UID: "b48ba9f8-0c7e-49ab-841d-0cb838f3e24d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.133252 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.133687 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.133783 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hptr2\" (UniqueName: \"kubernetes.io/projected/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-kube-api-access-hptr2\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.133861 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.720757 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79b99f8cb9-fzb84"
Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.720935 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.721193 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6978b8f7d9-tc6qt" event={"ID":"b48ba9f8-0c7e-49ab-841d-0cb838f3e24d","Type":"ContainerDied","Data":"c7c7a7c208a5c1a697b1964964cf97ffe7a1e6e5184921a2f8591d261f1396f3"} Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.721232 4771 scope.go:117] "RemoveContainer" containerID="0e66531d293114ecaba6693009344d5a2af40c444dccefe8e59241fb749310ed" Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.823579 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-79b99f8cb9-fzb84"] Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.834618 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-79b99f8cb9-fzb84"] Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.844108 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6978b8f7d9-tc6qt"] Feb 19 23:12:39 crc kubenswrapper[4771]: I0219 23:12:39.852033 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6978b8f7d9-tc6qt"] Feb 19 23:12:40 crc kubenswrapper[4771]: I0219 23:12:40.451519 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38" path="/var/lib/kubelet/pods/9648f306-42b2-4e90-8ac4-2ac86f716a38/volumes" Feb 19 23:12:40 crc kubenswrapper[4771]: I0219 23:12:40.452347 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" path="/var/lib/kubelet/pods/b48ba9f8-0c7e-49ab-841d-0cb838f3e24d/volumes" Feb 19 23:12:41 crc kubenswrapper[4771]: I0219 23:12:41.435430 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f449ffc58-sz7vv" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.1.128:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.128:8443: connect: connection refused" Feb 19 23:12:41 crc kubenswrapper[4771]: I0219 23:12:41.435555 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:12:44 crc kubenswrapper[4771]: I0219 23:12:44.461330 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:12:44 crc kubenswrapper[4771]: E0219 23:12:44.461963 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:12:45 crc kubenswrapper[4771]: I0219 23:12:45.959904 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-f88dbf5b-scj6q" Feb 19 23:12:46 crc kubenswrapper[4771]: I0219 23:12:46.028592 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-587498f77b-6skgf"] Feb 19 23:12:46 crc kubenswrapper[4771]: I0219 23:12:46.028835 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-587498f77b-6skgf" podUID="c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" containerName="heat-engine" containerID="cri-o://661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe" gracePeriod=60 Feb 19 23:12:47 crc kubenswrapper[4771]: E0219 23:12:47.867533 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 23:12:47 crc kubenswrapper[4771]: E0219 23:12:47.872388 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 23:12:47 crc kubenswrapper[4771]: E0219 23:12:47.877218 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 23:12:47 crc kubenswrapper[4771]: E0219 23:12:47.877265 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-587498f77b-6skgf" podUID="c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" containerName="heat-engine" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.525367 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.702659 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-combined-ca-bundle\") pod \"46d25b3d-46b9-4605-8c5d-4fe736c63322\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.702760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-config-data\") pod \"46d25b3d-46b9-4605-8c5d-4fe736c63322\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.702800 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nztsc\" (UniqueName: \"kubernetes.io/projected/46d25b3d-46b9-4605-8c5d-4fe736c63322-kube-api-access-nztsc\") pod \"46d25b3d-46b9-4605-8c5d-4fe736c63322\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.702876 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-scripts\") pod \"46d25b3d-46b9-4605-8c5d-4fe736c63322\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.703069 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-tls-certs\") pod \"46d25b3d-46b9-4605-8c5d-4fe736c63322\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.703149 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/46d25b3d-46b9-4605-8c5d-4fe736c63322-logs\") pod \"46d25b3d-46b9-4605-8c5d-4fe736c63322\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.703181 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-secret-key\") pod \"46d25b3d-46b9-4605-8c5d-4fe736c63322\" (UID: \"46d25b3d-46b9-4605-8c5d-4fe736c63322\") " Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.706609 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d25b3d-46b9-4605-8c5d-4fe736c63322-logs" (OuterVolumeSpecName: "logs") pod "46d25b3d-46b9-4605-8c5d-4fe736c63322" (UID: "46d25b3d-46b9-4605-8c5d-4fe736c63322"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.723434 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d25b3d-46b9-4605-8c5d-4fe736c63322-kube-api-access-nztsc" (OuterVolumeSpecName: "kube-api-access-nztsc") pod "46d25b3d-46b9-4605-8c5d-4fe736c63322" (UID: "46d25b3d-46b9-4605-8c5d-4fe736c63322"). InnerVolumeSpecName "kube-api-access-nztsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.723458 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "46d25b3d-46b9-4605-8c5d-4fe736c63322" (UID: "46d25b3d-46b9-4605-8c5d-4fe736c63322"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.754845 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-config-data" (OuterVolumeSpecName: "config-data") pod "46d25b3d-46b9-4605-8c5d-4fe736c63322" (UID: "46d25b3d-46b9-4605-8c5d-4fe736c63322"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.756134 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-scripts" (OuterVolumeSpecName: "scripts") pod "46d25b3d-46b9-4605-8c5d-4fe736c63322" (UID: "46d25b3d-46b9-4605-8c5d-4fe736c63322"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.764919 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46d25b3d-46b9-4605-8c5d-4fe736c63322" (UID: "46d25b3d-46b9-4605-8c5d-4fe736c63322"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.779787 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "46d25b3d-46b9-4605-8c5d-4fe736c63322" (UID: "46d25b3d-46b9-4605-8c5d-4fe736c63322"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.805930 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.805966 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.805980 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46d25b3d-46b9-4605-8c5d-4fe736c63322-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.805991 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.806003 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d25b3d-46b9-4605-8c5d-4fe736c63322-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.806013 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46d25b3d-46b9-4605-8c5d-4fe736c63322-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.806039 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nztsc\" (UniqueName: \"kubernetes.io/projected/46d25b3d-46b9-4605-8c5d-4fe736c63322-kube-api-access-nztsc\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.853920 4771 generic.go:334] 
"Generic (PLEG): container finished" podID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerID="477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9" exitCode=137 Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.853967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f449ffc58-sz7vv" event={"ID":"46d25b3d-46b9-4605-8c5d-4fe736c63322","Type":"ContainerDied","Data":"477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9"} Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.853993 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f449ffc58-sz7vv" event={"ID":"46d25b3d-46b9-4605-8c5d-4fe736c63322","Type":"ContainerDied","Data":"c75999ad8ae249c69e762d59901f594d4c477ece290c3d29099f58ee6789d0fe"} Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.854012 4771 scope.go:117] "RemoveContainer" containerID="3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.854163 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f449ffc58-sz7vv" Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.894880 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f449ffc58-sz7vv"] Feb 19 23:12:50 crc kubenswrapper[4771]: I0219 23:12:50.902556 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f449ffc58-sz7vv"] Feb 19 23:12:51 crc kubenswrapper[4771]: I0219 23:12:51.078003 4771 scope.go:117] "RemoveContainer" containerID="477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9" Feb 19 23:12:51 crc kubenswrapper[4771]: I0219 23:12:51.106209 4771 scope.go:117] "RemoveContainer" containerID="3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea" Feb 19 23:12:51 crc kubenswrapper[4771]: E0219 23:12:51.106682 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea\": container with ID starting with 3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea not found: ID does not exist" containerID="3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea" Feb 19 23:12:51 crc kubenswrapper[4771]: I0219 23:12:51.106731 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea"} err="failed to get container status \"3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea\": rpc error: code = NotFound desc = could not find container \"3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea\": container with ID starting with 3737589143db3725231317019f56e449a5b6bda11f1ae0d2f854e0c2e6cef6ea not found: ID does not exist" Feb 19 23:12:51 crc kubenswrapper[4771]: I0219 23:12:51.106757 4771 scope.go:117] "RemoveContainer" containerID="477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9" Feb 19 23:12:51 crc 
kubenswrapper[4771]: E0219 23:12:51.107102 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9\": container with ID starting with 477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9 not found: ID does not exist" containerID="477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9" Feb 19 23:12:51 crc kubenswrapper[4771]: I0219 23:12:51.107137 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9"} err="failed to get container status \"477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9\": rpc error: code = NotFound desc = could not find container \"477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9\": container with ID starting with 477dda3c077c9d7881d36f8315c48b3adeaf1608c84ff4ca071512a0c4869ec9 not found: ID does not exist" Feb 19 23:12:52 crc kubenswrapper[4771]: I0219 23:12:52.448451 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" path="/var/lib/kubelet/pods/46d25b3d-46b9-4605-8c5d-4fe736c63322/volumes" Feb 19 23:12:55 crc kubenswrapper[4771]: I0219 23:12:55.445389 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:12:55 crc kubenswrapper[4771]: E0219 23:12:55.446667 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:12:56 crc 
kubenswrapper[4771]: I0219 23:12:56.892959 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:56 crc kubenswrapper[4771]: I0219 23:12:56.919203 4771 generic.go:334] "Generic (PLEG): container finished" podID="c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" containerID="661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe" exitCode=0 Feb 19 23:12:56 crc kubenswrapper[4771]: I0219 23:12:56.919261 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-587498f77b-6skgf" Feb 19 23:12:56 crc kubenswrapper[4771]: I0219 23:12:56.919247 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-587498f77b-6skgf" event={"ID":"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486","Type":"ContainerDied","Data":"661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe"} Feb 19 23:12:56 crc kubenswrapper[4771]: I0219 23:12:56.919455 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-587498f77b-6skgf" event={"ID":"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486","Type":"ContainerDied","Data":"04553c9af8e55b234021971e19d0a62e5426880c3a554bf40947c813c89e0655"} Feb 19 23:12:56 crc kubenswrapper[4771]: I0219 23:12:56.919473 4771 scope.go:117] "RemoveContainer" containerID="661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe" Feb 19 23:12:56 crc kubenswrapper[4771]: I0219 23:12:56.960951 4771 scope.go:117] "RemoveContainer" containerID="661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe" Feb 19 23:12:56 crc kubenswrapper[4771]: E0219 23:12:56.961570 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe\": container with ID starting with 661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe not found: ID does not exist" 
containerID="661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe" Feb 19 23:12:56 crc kubenswrapper[4771]: I0219 23:12:56.961604 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe"} err="failed to get container status \"661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe\": rpc error: code = NotFound desc = could not find container \"661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe\": container with ID starting with 661d5e9aa78ea2d437cb1288c63d1b88ff3331168c64c171efd5266b17ea14fe not found: ID does not exist" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.040032 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data-custom\") pod \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.040222 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66gkh\" (UniqueName: \"kubernetes.io/projected/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-kube-api-access-66gkh\") pod \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.040322 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-combined-ca-bundle\") pod \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.040362 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data\") pod \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\" (UID: \"c434b2cf-ccb1-4c24-8102-0c0ce5ffc486\") " Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.046370 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" (UID: "c434b2cf-ccb1-4c24-8102-0c0ce5ffc486"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.046961 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-kube-api-access-66gkh" (OuterVolumeSpecName: "kube-api-access-66gkh") pod "c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" (UID: "c434b2cf-ccb1-4c24-8102-0c0ce5ffc486"). InnerVolumeSpecName "kube-api-access-66gkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.074892 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" (UID: "c434b2cf-ccb1-4c24-8102-0c0ce5ffc486"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.088306 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data" (OuterVolumeSpecName: "config-data") pod "c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" (UID: "c434b2cf-ccb1-4c24-8102-0c0ce5ffc486"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.143258 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.143294 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66gkh\" (UniqueName: \"kubernetes.io/projected/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-kube-api-access-66gkh\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.143307 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.143317 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.271215 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-587498f77b-6skgf"] Feb 19 23:12:57 crc kubenswrapper[4771]: I0219 23:12:57.279037 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-587498f77b-6skgf"] Feb 19 23:12:58 crc kubenswrapper[4771]: I0219 23:12:58.454782 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" path="/var/lib/kubelet/pods/c434b2cf-ccb1-4c24-8102-0c0ce5ffc486/volumes" Feb 19 23:13:05 crc kubenswrapper[4771]: I0219 23:13:05.067027 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7af4-account-create-update-m742s"] Feb 19 23:13:05 crc kubenswrapper[4771]: I0219 23:13:05.080471 4771 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-db-create-qs9x4"] Feb 19 23:13:05 crc kubenswrapper[4771]: I0219 23:13:05.095462 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qs9x4"] Feb 19 23:13:05 crc kubenswrapper[4771]: I0219 23:13:05.105266 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7af4-account-create-update-m742s"] Feb 19 23:13:06 crc kubenswrapper[4771]: I0219 23:13:06.437353 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:13:06 crc kubenswrapper[4771]: E0219 23:13:06.437832 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:13:06 crc kubenswrapper[4771]: I0219 23:13:06.448876 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e14bb44-70cd-4826-a969-3cb4cf02e9b9" path="/var/lib/kubelet/pods/1e14bb44-70cd-4826-a969-3cb4cf02e9b9/volumes" Feb 19 23:13:06 crc kubenswrapper[4771]: I0219 23:13:06.450039 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0523b6-312f-4159-8965-7bdbec2a4604" path="/var/lib/kubelet/pods/bd0523b6-312f-4159-8965-7bdbec2a4604/volumes" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.400806 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp"] Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401596 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53816944-33b8-44bd-9eab-b55559eab459" containerName="heat-cfnapi" Feb 19 23:13:11 crc 
kubenswrapper[4771]: I0219 23:13:11.401607 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="53816944-33b8-44bd-9eab-b55559eab459" containerName="heat-cfnapi" Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401624 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" containerName="heat-cfnapi" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401630 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" containerName="heat-cfnapi" Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401642 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" containerName="heat-cfnapi" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401649 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" containerName="heat-cfnapi" Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401660 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401666 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401673 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a094a30e-0984-49ce-a81c-04b26696af81" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401678 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a094a30e-0984-49ce-a81c-04b26696af81" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401695 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401701 4771 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401713 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon-log" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401718 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon-log" Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401728 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" containerName="heat-engine" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401734 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" containerName="heat-engine" Feb 19 23:13:11 crc kubenswrapper[4771]: E0219 23:13:11.401747 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401752 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401907 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon-log" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401930 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" containerName="heat-cfnapi" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401938 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c434b2cf-ccb1-4c24-8102-0c0ce5ffc486" containerName="heat-engine" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401950 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a094a30e-0984-49ce-a81c-04b26696af81" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401960 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401970 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="53816944-33b8-44bd-9eab-b55559eab459" containerName="heat-cfnapi" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.401980 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d25b3d-46b9-4605-8c5d-4fe736c63322" containerName="horizon" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.402355 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48ba9f8-0c7e-49ab-841d-0cb838f3e24d" containerName="heat-cfnapi" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.402369 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9648f306-42b2-4e90-8ac4-2ac86f716a38" containerName="heat-api" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.403303 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.405327 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.423982 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp"] Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.590003 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.590446 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.590581 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvrrh\" (UniqueName: \"kubernetes.io/projected/9fed3ba9-7458-44d3-a87a-9d9a5957c680-kube-api-access-xvrrh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: 
I0219 23:13:11.693866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.694049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.694105 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvrrh\" (UniqueName: \"kubernetes.io/projected/9fed3ba9-7458-44d3-a87a-9d9a5957c680-kube-api-access-xvrrh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.694545 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.694856 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:11 crc kubenswrapper[4771]: I0219 23:13:11.725955 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvrrh\" (UniqueName: \"kubernetes.io/projected/9fed3ba9-7458-44d3-a87a-9d9a5957c680-kube-api-access-xvrrh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:12 crc kubenswrapper[4771]: I0219 23:13:12.020586 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:12 crc kubenswrapper[4771]: I0219 23:13:12.563135 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp"] Feb 19 23:13:13 crc kubenswrapper[4771]: I0219 23:13:13.082875 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" event={"ID":"9fed3ba9-7458-44d3-a87a-9d9a5957c680","Type":"ContainerStarted","Data":"0ef0db10043b98810603c0aa3ae01a053cba82d7d0457ca8b1d76d0539bb6ee9"} Feb 19 23:13:13 crc kubenswrapper[4771]: I0219 23:13:13.083324 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" event={"ID":"9fed3ba9-7458-44d3-a87a-9d9a5957c680","Type":"ContainerStarted","Data":"661fffca891ca305e74fda6779953e0c6168373f6eb07a3c1d9ce6fab45131e8"} Feb 19 23:13:14 crc kubenswrapper[4771]: I0219 23:13:14.097825 4771 
generic.go:334] "Generic (PLEG): container finished" podID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerID="0ef0db10043b98810603c0aa3ae01a053cba82d7d0457ca8b1d76d0539bb6ee9" exitCode=0 Feb 19 23:13:14 crc kubenswrapper[4771]: I0219 23:13:14.097998 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" event={"ID":"9fed3ba9-7458-44d3-a87a-9d9a5957c680","Type":"ContainerDied","Data":"0ef0db10043b98810603c0aa3ae01a053cba82d7d0457ca8b1d76d0539bb6ee9"} Feb 19 23:13:16 crc kubenswrapper[4771]: I0219 23:13:16.044403 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9b2lj"] Feb 19 23:13:16 crc kubenswrapper[4771]: I0219 23:13:16.055046 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9b2lj"] Feb 19 23:13:16 crc kubenswrapper[4771]: I0219 23:13:16.151624 4771 generic.go:334] "Generic (PLEG): container finished" podID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerID="f8f46969afd492abd4e5eefcf869405803ca7789b454c181920b5f96448e98fc" exitCode=0 Feb 19 23:13:16 crc kubenswrapper[4771]: I0219 23:13:16.151662 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" event={"ID":"9fed3ba9-7458-44d3-a87a-9d9a5957c680","Type":"ContainerDied","Data":"f8f46969afd492abd4e5eefcf869405803ca7789b454c181920b5f96448e98fc"} Feb 19 23:13:16 crc kubenswrapper[4771]: I0219 23:13:16.448072 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee88110-ba9d-4076-b6e1-412a08c88b2b" path="/var/lib/kubelet/pods/1ee88110-ba9d-4076-b6e1-412a08c88b2b/volumes" Feb 19 23:13:17 crc kubenswrapper[4771]: I0219 23:13:17.161866 4771 generic.go:334] "Generic (PLEG): container finished" podID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerID="aa1c1d2730bd53abae105c0667b5eb6cdf3d1fc0deab3ae65304f9c62bbdf8e8" exitCode=0 Feb 19 23:13:17 crc 
kubenswrapper[4771]: I0219 23:13:17.161958 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" event={"ID":"9fed3ba9-7458-44d3-a87a-9d9a5957c680","Type":"ContainerDied","Data":"aa1c1d2730bd53abae105c0667b5eb6cdf3d1fc0deab3ae65304f9c62bbdf8e8"} Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.437517 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:13:18 crc kubenswrapper[4771]: E0219 23:13:18.438217 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.738283 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.869572 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvrrh\" (UniqueName: \"kubernetes.io/projected/9fed3ba9-7458-44d3-a87a-9d9a5957c680-kube-api-access-xvrrh\") pod \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.869658 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-bundle\") pod \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.869743 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-util\") pod \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\" (UID: \"9fed3ba9-7458-44d3-a87a-9d9a5957c680\") " Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.871494 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-bundle" (OuterVolumeSpecName: "bundle") pod "9fed3ba9-7458-44d3-a87a-9d9a5957c680" (UID: "9fed3ba9-7458-44d3-a87a-9d9a5957c680"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.889753 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fed3ba9-7458-44d3-a87a-9d9a5957c680-kube-api-access-xvrrh" (OuterVolumeSpecName: "kube-api-access-xvrrh") pod "9fed3ba9-7458-44d3-a87a-9d9a5957c680" (UID: "9fed3ba9-7458-44d3-a87a-9d9a5957c680"). InnerVolumeSpecName "kube-api-access-xvrrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.890917 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-util" (OuterVolumeSpecName: "util") pod "9fed3ba9-7458-44d3-a87a-9d9a5957c680" (UID: "9fed3ba9-7458-44d3-a87a-9d9a5957c680"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.972568 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvrrh\" (UniqueName: \"kubernetes.io/projected/9fed3ba9-7458-44d3-a87a-9d9a5957c680-kube-api-access-xvrrh\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.972600 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:18 crc kubenswrapper[4771]: I0219 23:13:18.972609 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fed3ba9-7458-44d3-a87a-9d9a5957c680-util\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:19 crc kubenswrapper[4771]: I0219 23:13:19.187154 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" event={"ID":"9fed3ba9-7458-44d3-a87a-9d9a5957c680","Type":"ContainerDied","Data":"661fffca891ca305e74fda6779953e0c6168373f6eb07a3c1d9ce6fab45131e8"} Feb 19 23:13:19 crc kubenswrapper[4771]: I0219 23:13:19.187200 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="661fffca891ca305e74fda6779953e0c6168373f6eb07a3c1d9ce6fab45131e8" Feb 19 23:13:19 crc kubenswrapper[4771]: I0219 23:13:19.187210 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.327977 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88"] Feb 19 23:13:29 crc kubenswrapper[4771]: E0219 23:13:29.330577 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerName="pull" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.330601 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerName="pull" Feb 19 23:13:29 crc kubenswrapper[4771]: E0219 23:13:29.330620 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerName="extract" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.330626 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerName="extract" Feb 19 23:13:29 crc kubenswrapper[4771]: E0219 23:13:29.330644 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerName="util" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.330650 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerName="util" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.330847 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fed3ba9-7458-44d3-a87a-9d9a5957c680" containerName="extract" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.331560 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.335988 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-m4qlj" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.336462 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.336588 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.353033 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.404181 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m2hk\" (UniqueName: \"kubernetes.io/projected/bc803aa4-495b-490a-9ef3-90862fc268eb-kube-api-access-7m2hk\") pod \"obo-prometheus-operator-68bc856cb9-jfg88\" (UID: \"bc803aa4-495b-490a-9ef3-90862fc268eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.404568 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.405765 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.409107 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-h6zdr" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.409305 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.433780 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.436307 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.445623 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.494492 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.506946 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d026131-d93f-47a1-b7c2-b99751fd63c8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-jr98w\" (UID: \"1d026131-d93f-47a1-b7c2-b99751fd63c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.507199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d2f571a-67fe-4339-9648-ed9924d18c22-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-55zbx\" (UID: \"7d2f571a-67fe-4339-9648-ed9924d18c22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.507376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d026131-d93f-47a1-b7c2-b99751fd63c8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-jr98w\" (UID: \"1d026131-d93f-47a1-b7c2-b99751fd63c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.507475 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d2f571a-67fe-4339-9648-ed9924d18c22-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-55zbx\" (UID: \"7d2f571a-67fe-4339-9648-ed9924d18c22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.507588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m2hk\" (UniqueName: \"kubernetes.io/projected/bc803aa4-495b-490a-9ef3-90862fc268eb-kube-api-access-7m2hk\") pod \"obo-prometheus-operator-68bc856cb9-jfg88\" (UID: \"bc803aa4-495b-490a-9ef3-90862fc268eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.563733 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m2hk\" (UniqueName: \"kubernetes.io/projected/bc803aa4-495b-490a-9ef3-90862fc268eb-kube-api-access-7m2hk\") pod 
\"obo-prometheus-operator-68bc856cb9-jfg88\" (UID: \"bc803aa4-495b-490a-9ef3-90862fc268eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.609257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d026131-d93f-47a1-b7c2-b99751fd63c8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-jr98w\" (UID: \"1d026131-d93f-47a1-b7c2-b99751fd63c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.609545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d2f571a-67fe-4339-9648-ed9924d18c22-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-55zbx\" (UID: \"7d2f571a-67fe-4339-9648-ed9924d18c22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.609715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d026131-d93f-47a1-b7c2-b99751fd63c8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-jr98w\" (UID: \"1d026131-d93f-47a1-b7c2-b99751fd63c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.609827 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d2f571a-67fe-4339-9648-ed9924d18c22-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-55zbx\" (UID: \"7d2f571a-67fe-4339-9648-ed9924d18c22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" Feb 19 23:13:29 
crc kubenswrapper[4771]: I0219 23:13:29.629578 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d026131-d93f-47a1-b7c2-b99751fd63c8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-jr98w\" (UID: \"1d026131-d93f-47a1-b7c2-b99751fd63c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.629583 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d2f571a-67fe-4339-9648-ed9924d18c22-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-55zbx\" (UID: \"7d2f571a-67fe-4339-9648-ed9924d18c22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.630097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d2f571a-67fe-4339-9648-ed9924d18c22-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-55zbx\" (UID: \"7d2f571a-67fe-4339-9648-ed9924d18c22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.634704 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-k884s"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.636485 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.647641 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.648056 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d026131-d93f-47a1-b7c2-b99751fd63c8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-666489549b-jr98w\" (UID: \"1d026131-d93f-47a1-b7c2-b99751fd63c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.651465 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.665736 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-qjzzm" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.689056 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-k884s"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.711216 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9tk\" (UniqueName: \"kubernetes.io/projected/08c1eeab-24f7-499e-a217-df70c76dbd50-kube-api-access-rz9tk\") pod \"observability-operator-59bdc8b94-k884s\" (UID: \"08c1eeab-24f7-499e-a217-df70c76dbd50\") " pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.711271 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/08c1eeab-24f7-499e-a217-df70c76dbd50-observability-operator-tls\") pod \"observability-operator-59bdc8b94-k884s\" (UID: \"08c1eeab-24f7-499e-a217-df70c76dbd50\") " pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.736176 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.759553 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.813098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9tk\" (UniqueName: \"kubernetes.io/projected/08c1eeab-24f7-499e-a217-df70c76dbd50-kube-api-access-rz9tk\") pod \"observability-operator-59bdc8b94-k884s\" (UID: \"08c1eeab-24f7-499e-a217-df70c76dbd50\") " pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.813166 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/08c1eeab-24f7-499e-a217-df70c76dbd50-observability-operator-tls\") pod \"observability-operator-59bdc8b94-k884s\" (UID: \"08c1eeab-24f7-499e-a217-df70c76dbd50\") " pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.821666 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/08c1eeab-24f7-499e-a217-df70c76dbd50-observability-operator-tls\") pod \"observability-operator-59bdc8b94-k884s\" (UID: \"08c1eeab-24f7-499e-a217-df70c76dbd50\") " 
pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.851088 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-bb52q"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.852488 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.854090 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz9tk\" (UniqueName: \"kubernetes.io/projected/08c1eeab-24f7-499e-a217-df70c76dbd50-kube-api-access-rz9tk\") pod \"observability-operator-59bdc8b94-k884s\" (UID: \"08c1eeab-24f7-499e-a217-df70c76dbd50\") " pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.854580 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zt4bg" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.873869 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-bb52q"] Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.914786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/629c66a3-218a-43ff-884b-f6490aba1936-openshift-service-ca\") pod \"perses-operator-5bf474d74f-bb52q\" (UID: \"629c66a3-218a-43ff-884b-f6490aba1936\") " pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:29 crc kubenswrapper[4771]: I0219 23:13:29.914949 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnbmb\" (UniqueName: \"kubernetes.io/projected/629c66a3-218a-43ff-884b-f6490aba1936-kube-api-access-mnbmb\") pod \"perses-operator-5bf474d74f-bb52q\" 
(UID: \"629c66a3-218a-43ff-884b-f6490aba1936\") " pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.018203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnbmb\" (UniqueName: \"kubernetes.io/projected/629c66a3-218a-43ff-884b-f6490aba1936-kube-api-access-mnbmb\") pod \"perses-operator-5bf474d74f-bb52q\" (UID: \"629c66a3-218a-43ff-884b-f6490aba1936\") " pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.018550 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/629c66a3-218a-43ff-884b-f6490aba1936-openshift-service-ca\") pod \"perses-operator-5bf474d74f-bb52q\" (UID: \"629c66a3-218a-43ff-884b-f6490aba1936\") " pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.019496 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/629c66a3-218a-43ff-884b-f6490aba1936-openshift-service-ca\") pod \"perses-operator-5bf474d74f-bb52q\" (UID: \"629c66a3-218a-43ff-884b-f6490aba1936\") " pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.040421 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnbmb\" (UniqueName: \"kubernetes.io/projected/629c66a3-218a-43ff-884b-f6490aba1936-kube-api-access-mnbmb\") pod \"perses-operator-5bf474d74f-bb52q\" (UID: \"629c66a3-218a-43ff-884b-f6490aba1936\") " pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.127644 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.170733 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.460271 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:13:30 crc kubenswrapper[4771]: E0219 23:13:30.460777 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.706258 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w"] Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.727916 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88"] Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.860907 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx"] Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.963205 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-bb52q"] Feb 19 23:13:30 crc kubenswrapper[4771]: I0219 23:13:30.993460 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-k884s"] Feb 19 23:13:31 crc kubenswrapper[4771]: W0219 23:13:31.005330 4771 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08c1eeab_24f7_499e_a217_df70c76dbd50.slice/crio-2580a8a485ee02ea091fb47633a42d52c670729521c35647920bda5e191e59f6 WatchSource:0}: Error finding container 2580a8a485ee02ea091fb47633a42d52c670729521c35647920bda5e191e59f6: Status 404 returned error can't find the container with id 2580a8a485ee02ea091fb47633a42d52c670729521c35647920bda5e191e59f6 Feb 19 23:13:31 crc kubenswrapper[4771]: I0219 23:13:31.364643 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-k884s" event={"ID":"08c1eeab-24f7-499e-a217-df70c76dbd50","Type":"ContainerStarted","Data":"2580a8a485ee02ea091fb47633a42d52c670729521c35647920bda5e191e59f6"} Feb 19 23:13:31 crc kubenswrapper[4771]: I0219 23:13:31.366467 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-bb52q" event={"ID":"629c66a3-218a-43ff-884b-f6490aba1936","Type":"ContainerStarted","Data":"fa2935a89235fda1e78c831ceefa9dcfeeefc0627ea0bdbb0a12b23f0e6e2fdf"} Feb 19 23:13:31 crc kubenswrapper[4771]: I0219 23:13:31.368098 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88" event={"ID":"bc803aa4-495b-490a-9ef3-90862fc268eb","Type":"ContainerStarted","Data":"50d7686eb403b9e6b9ade1656c9a850aff73555b1d4310188154f709feb152df"} Feb 19 23:13:31 crc kubenswrapper[4771]: I0219 23:13:31.369910 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" event={"ID":"1d026131-d93f-47a1-b7c2-b99751fd63c8","Type":"ContainerStarted","Data":"37c59b9c19f2e00b20ab594b2e4ef6ff240269d2c5d974386eac331313f6c9a2"} Feb 19 23:13:31 crc kubenswrapper[4771]: I0219 23:13:31.371119 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" event={"ID":"7d2f571a-67fe-4339-9648-ed9924d18c22","Type":"ContainerStarted","Data":"aba6eec836aea34b2815eee91a63a8cf8fa7e0bd8fc614ef8184f63812dbedcb"} Feb 19 23:13:37 crc kubenswrapper[4771]: I0219 23:13:37.307961 4771 scope.go:117] "RemoveContainer" containerID="4c7b00cc6e781261fe9e6a4d33c2fdc91d83e5358e2167acd65e58b05c780c58" Feb 19 23:13:41 crc kubenswrapper[4771]: I0219 23:13:41.024812 4771 scope.go:117] "RemoveContainer" containerID="c510b293aab102e9fa4bbbbc25e57eca69a3d792fde7155abea6e09958950d06" Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.077265 4771 scope.go:117] "RemoveContainer" containerID="169e31fe6ae33f7eff5e60f2e3cc613f3278f50cacfe0ca7d04b383c1c185bd8" Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.570742 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-k884s" event={"ID":"08c1eeab-24f7-499e-a217-df70c76dbd50","Type":"ContainerStarted","Data":"c40d2df4ed1b2230144fabfeed9884527a76d81483850ba626956c77ce151841"} Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.571067 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.572473 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-bb52q" event={"ID":"629c66a3-218a-43ff-884b-f6490aba1936","Type":"ContainerStarted","Data":"e81d933b00b99044a8d80a8aa28447fa802f449e2b6f7af3ef413e203418d4db"} Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.572637 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.574676 4771 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-k884s container/operator 
namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.1.149:8081/healthz\": dial tcp 10.217.1.149:8081: connect: connection refused" start-of-body= Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.574725 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-k884s" podUID="08c1eeab-24f7-499e-a217-df70c76dbd50" containerName="operator" probeResult="failure" output="Get \"http://10.217.1.149:8081/healthz\": dial tcp 10.217.1.149:8081: connect: connection refused" Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.602654 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-k884s" podStartSLOduration=2.521505279 podStartE2EDuration="13.602636929s" podCreationTimestamp="2026-02-19 23:13:29 +0000 UTC" firstStartedPulling="2026-02-19 23:13:31.007980049 +0000 UTC m=+6311.279422519" lastFinishedPulling="2026-02-19 23:13:42.089111699 +0000 UTC m=+6322.360554169" observedRunningTime="2026-02-19 23:13:42.593906547 +0000 UTC m=+6322.865349027" watchObservedRunningTime="2026-02-19 23:13:42.602636929 +0000 UTC m=+6322.874079399" Feb 19 23:13:42 crc kubenswrapper[4771]: I0219 23:13:42.620715 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-bb52q" podStartSLOduration=2.5046839309999998 podStartE2EDuration="13.620699401s" podCreationTimestamp="2026-02-19 23:13:29 +0000 UTC" firstStartedPulling="2026-02-19 23:13:30.97387085 +0000 UTC m=+6311.245313320" lastFinishedPulling="2026-02-19 23:13:42.08988632 +0000 UTC m=+6322.361328790" observedRunningTime="2026-02-19 23:13:42.617452784 +0000 UTC m=+6322.888895274" watchObservedRunningTime="2026-02-19 23:13:42.620699401 +0000 UTC m=+6322.892141871" Feb 19 23:13:43 crc kubenswrapper[4771]: I0219 23:13:43.582058 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88" event={"ID":"bc803aa4-495b-490a-9ef3-90862fc268eb","Type":"ContainerStarted","Data":"689d0f4e47a69df23a3c18f7149eeb6bd9a1ed9f0649dc5fc902d56c961647a5"} Feb 19 23:13:43 crc kubenswrapper[4771]: I0219 23:13:43.583397 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" event={"ID":"7d2f571a-67fe-4339-9648-ed9924d18c22","Type":"ContainerStarted","Data":"90ae4aec6d29844dea43f7d9c8b33eff9f61e1f710e9731230fa7debf71b8247"} Feb 19 23:13:43 crc kubenswrapper[4771]: I0219 23:13:43.584927 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" event={"ID":"1d026131-d93f-47a1-b7c2-b99751fd63c8","Type":"ContainerStarted","Data":"589d6af4a0954d23cd43ce8e1542eea75a24c7122556c090bbde2e59bd3abacd"} Feb 19 23:13:43 crc kubenswrapper[4771]: I0219 23:13:43.588759 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-k884s" Feb 19 23:13:43 crc kubenswrapper[4771]: I0219 23:13:43.617958 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jfg88" podStartSLOduration=3.226313668 podStartE2EDuration="14.617917435s" podCreationTimestamp="2026-02-19 23:13:29 +0000 UTC" firstStartedPulling="2026-02-19 23:13:30.730321697 +0000 UTC m=+6311.001764157" lastFinishedPulling="2026-02-19 23:13:42.121925454 +0000 UTC m=+6322.393367924" observedRunningTime="2026-02-19 23:13:43.608749941 +0000 UTC m=+6323.880192411" watchObservedRunningTime="2026-02-19 23:13:43.617917435 +0000 UTC m=+6323.889359905" Feb 19 23:13:43 crc kubenswrapper[4771]: I0219 23:13:43.639508 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-55zbx" podStartSLOduration=3.386198212 podStartE2EDuration="14.639490561s" podCreationTimestamp="2026-02-19 23:13:29 +0000 UTC" firstStartedPulling="2026-02-19 23:13:30.870137385 +0000 UTC m=+6311.141579855" lastFinishedPulling="2026-02-19 23:13:42.123429734 +0000 UTC m=+6322.394872204" observedRunningTime="2026-02-19 23:13:43.630569632 +0000 UTC m=+6323.902012122" watchObservedRunningTime="2026-02-19 23:13:43.639490561 +0000 UTC m=+6323.910933031" Feb 19 23:13:43 crc kubenswrapper[4771]: I0219 23:13:43.698033 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-666489549b-jr98w" podStartSLOduration=3.340539144 podStartE2EDuration="14.697996721s" podCreationTimestamp="2026-02-19 23:13:29 +0000 UTC" firstStartedPulling="2026-02-19 23:13:30.730645786 +0000 UTC m=+6311.002088256" lastFinishedPulling="2026-02-19 23:13:42.088103363 +0000 UTC m=+6322.359545833" observedRunningTime="2026-02-19 23:13:43.693731956 +0000 UTC m=+6323.965174436" watchObservedRunningTime="2026-02-19 23:13:43.697996721 +0000 UTC m=+6323.969439191" Feb 19 23:13:45 crc kubenswrapper[4771]: I0219 23:13:45.437516 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:13:46 crc kubenswrapper[4771]: I0219 23:13:46.623557 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"686e92da4ca4cf49dcd362c98b7bf5eea623f83136fe222a54561367030f55e3"} Feb 19 23:13:50 crc kubenswrapper[4771]: I0219 23:13:50.174635 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-bb52q" Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.833869 4771 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.834730 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="200f4663-56b8-47de-a506-2345c7d42ef9" containerName="openstackclient" containerID="cri-o://98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968" gracePeriod=2 Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.845150 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.895933 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:51 crc kubenswrapper[4771]: E0219 23:13:51.896520 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200f4663-56b8-47de-a506-2345c7d42ef9" containerName="openstackclient" Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.896542 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="200f4663-56b8-47de-a506-2345c7d42ef9" containerName="openstackclient" Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.896888 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="200f4663-56b8-47de-a506-2345c7d42ef9" containerName="openstackclient" Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.897732 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.901620 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="200f4663-56b8-47de-a506-2345c7d42ef9" podUID="9f01e4f7-94ba-4f4e-8279-806223142e0f" Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.908248 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.918146 4771 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f01e4f7-94ba-4f4e-8279-806223142e0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T23:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T23:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T23:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T23:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:e1e8f9b33b9cbd07e1c9984d894a3237e9469672fb9b346889a34ba3276298e4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4gln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T23:13:51Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Feb 19 23:13:51 crc kubenswrapper[4771]: I0219 23:13:51.978033 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:52 crc kubenswrapper[4771]: E0219 23:13:51.990040 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-w4gln openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-w4gln]: context canceled" pod="openstack/openstackclient" 
podUID="9f01e4f7-94ba-4f4e-8279-806223142e0f" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:51.994436 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4gln\" (UniqueName: \"kubernetes.io/projected/9f01e4f7-94ba-4f4e-8279-806223142e0f-kube-api-access-w4gln\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:51.994476 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:51.994499 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:51.994585 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.028586 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.062562 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.063920 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.098598 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4gln\" (UniqueName: \"kubernetes.io/projected/9f01e4f7-94ba-4f4e-8279-806223142e0f-kube-api-access-w4gln\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.098647 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.098667 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.098707 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.099258 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9f01e4f7-94ba-4f4e-8279-806223142e0f" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.101220 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: E0219 23:13:52.105341 4771 projected.go:194] Error preparing data for projected volume kube-api-access-w4gln for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (9f01e4f7-94ba-4f4e-8279-806223142e0f) does not match the UID in record. The object might have been deleted and then recreated Feb 19 23:13:52 crc kubenswrapper[4771]: E0219 23:13:52.105404 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f01e4f7-94ba-4f4e-8279-806223142e0f-kube-api-access-w4gln podName:9f01e4f7-94ba-4f4e-8279-806223142e0f nodeName:}" failed. No retries permitted until 2026-02-19 23:13:52.605389862 +0000 UTC m=+6332.876832332 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-w4gln" (UniqueName: "kubernetes.io/projected/9f01e4f7-94ba-4f4e-8279-806223142e0f-kube-api-access-w4gln") pod "openstackclient" (UID: "9f01e4f7-94ba-4f4e-8279-806223142e0f") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (9f01e4f7-94ba-4f4e-8279-806223142e0f) does not match the UID in record. The object might have been deleted and then recreated Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.105851 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.124096 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.125470 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.129500 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-f2qbk" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.130105 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.140272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.146918 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.200348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnx6l\" (UniqueName: \"kubernetes.io/projected/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-kube-api-access-nnx6l\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.200723 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config-secret\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.200766 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.200813 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.302365 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbszl\" (UniqueName: \"kubernetes.io/projected/b936b940-5d5a-4b62-8cd3-acff548506db-kube-api-access-bbszl\") pod \"kube-state-metrics-0\" (UID: \"b936b940-5d5a-4b62-8cd3-acff548506db\") " pod="openstack/kube-state-metrics-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.302491 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config-secret\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.302566 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.302617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.302665 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnx6l\" (UniqueName: \"kubernetes.io/projected/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-kube-api-access-nnx6l\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.303849 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.306483 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config-secret\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.307294 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.324526 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnx6l\" (UniqueName: \"kubernetes.io/projected/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-kube-api-access-nnx6l\") pod \"openstackclient\" (UID: 
\"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.399108 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.405130 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbszl\" (UniqueName: \"kubernetes.io/projected/b936b940-5d5a-4b62-8cd3-acff548506db-kube-api-access-bbszl\") pod \"kube-state-metrics-0\" (UID: \"b936b940-5d5a-4b62-8cd3-acff548506db\") " pod="openstack/kube-state-metrics-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.431728 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbszl\" (UniqueName: \"kubernetes.io/projected/b936b940-5d5a-4b62-8cd3-acff548506db-kube-api-access-bbszl\") pod \"kube-state-metrics-0\" (UID: \"b936b940-5d5a-4b62-8cd3-acff548506db\") " pod="openstack/kube-state-metrics-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.542555 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.611519 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4gln\" (UniqueName: \"kubernetes.io/projected/9f01e4f7-94ba-4f4e-8279-806223142e0f-kube-api-access-w4gln\") pod \"openstackclient\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: E0219 23:13:52.615059 4771 projected.go:194] Error preparing data for projected volume kube-api-access-w4gln for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (9f01e4f7-94ba-4f4e-8279-806223142e0f) does not match the UID in record. 
The object might have been deleted and then recreated Feb 19 23:13:52 crc kubenswrapper[4771]: E0219 23:13:52.615114 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f01e4f7-94ba-4f4e-8279-806223142e0f-kube-api-access-w4gln podName:9f01e4f7-94ba-4f4e-8279-806223142e0f nodeName:}" failed. No retries permitted until 2026-02-19 23:13:53.61509922 +0000 UTC m=+6333.886541690 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-w4gln" (UniqueName: "kubernetes.io/projected/9f01e4f7-94ba-4f4e-8279-806223142e0f-kube-api-access-w4gln") pod "openstackclient" (UID: "9f01e4f7-94ba-4f4e-8279-806223142e0f") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (9f01e4f7-94ba-4f4e-8279-806223142e0f) does not match the UID in record. The object might have been deleted and then recreated Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.670199 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.672345 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.679591 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.679941 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.680148 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.682046 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.688705 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-pl892" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.693231 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.708733 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.715469 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9f01e4f7-94ba-4f4e-8279-806223142e0f" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.751782 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.754722 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9f01e4f7-94ba-4f4e-8279-806223142e0f" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.815934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.816291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bfa02b55-b912-42a2-948f-69d38c0a6532-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.816362 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwf26\" (UniqueName: \"kubernetes.io/projected/bfa02b55-b912-42a2-948f-69d38c0a6532-kube-api-access-lwf26\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.816434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc 
kubenswrapper[4771]: I0219 23:13:52.816523 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/bfa02b55-b912-42a2-948f-69d38c0a6532-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.816657 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bfa02b55-b912-42a2-948f-69d38c0a6532-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.816800 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918126 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config-secret\") pod \"9f01e4f7-94ba-4f4e-8279-806223142e0f\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918253 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config\") pod \"9f01e4f7-94ba-4f4e-8279-806223142e0f\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " Feb 19 23:13:52 crc 
kubenswrapper[4771]: I0219 23:13:52.918420 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-combined-ca-bundle\") pod \"9f01e4f7-94ba-4f4e-8279-806223142e0f\" (UID: \"9f01e4f7-94ba-4f4e-8279-806223142e0f\") " Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918680 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/bfa02b55-b912-42a2-948f-69d38c0a6532-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918709 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bfa02b55-b912-42a2-948f-69d38c0a6532-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918777 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918829 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bfa02b55-b912-42a2-948f-69d38c0a6532-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwf26\" (UniqueName: \"kubernetes.io/projected/bfa02b55-b912-42a2-948f-69d38c0a6532-kube-api-access-lwf26\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.918954 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4gln\" (UniqueName: \"kubernetes.io/projected/9f01e4f7-94ba-4f4e-8279-806223142e0f-kube-api-access-w4gln\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.920125 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9f01e4f7-94ba-4f4e-8279-806223142e0f" (UID: "9f01e4f7-94ba-4f4e-8279-806223142e0f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.920375 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/bfa02b55-b912-42a2-948f-69d38c0a6532-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.944517 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f01e4f7-94ba-4f4e-8279-806223142e0f" (UID: "9f01e4f7-94ba-4f4e-8279-806223142e0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.944596 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9f01e4f7-94ba-4f4e-8279-806223142e0f" (UID: "9f01e4f7-94ba-4f4e-8279-806223142e0f"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.944713 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bfa02b55-b912-42a2-948f-69d38c0a6532-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.945045 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.945437 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bfa02b55-b912-42a2-948f-69d38c0a6532-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.947093 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 23:13:52.947159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bfa02b55-b912-42a2-948f-69d38c0a6532-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:52 crc kubenswrapper[4771]: I0219 
23:13:52.949693 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwf26\" (UniqueName: \"kubernetes.io/projected/bfa02b55-b912-42a2-948f-69d38c0a6532-kube-api-access-lwf26\") pod \"alertmanager-metric-storage-0\" (UID: \"bfa02b55-b912-42a2-948f-69d38c0a6532\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.020593 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.020627 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.020641 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9f01e4f7-94ba-4f4e-8279-806223142e0f-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.030644 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.052999 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.293763 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.296495 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.302035 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.302070 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.304841 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.305135 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fgwfl" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.305255 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.305274 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.307683 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.307866 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.309666 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.340083 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.429180 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxv8\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-kube-api-access-fnxv8\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.429699 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.429871 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.429906 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.429927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.429983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.430004 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.430049 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.430109 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.430127 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535302 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535576 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535598 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535716 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535753 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535873 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.535965 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fnxv8\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-kube-api-access-fnxv8\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.537674 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.541470 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.542182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.544862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.546171 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.546929 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.547736 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.550992 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.551051 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/49abca73e33cf3563ba99cec27df5d404cbd23e5b70983fd7923117bd3eb244c/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.551121 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.557149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxv8\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-kube-api-access-fnxv8\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.612350 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"prometheus-metric-storage-0\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.672172 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.734178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b936b940-5d5a-4b62-8cd3-acff548506db","Type":"ContainerStarted","Data":"005483e24a77f8b134f2ffbfe719af5ab88671a649e4f9c8ac9c66aaa679ada8"} Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.735959 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.739901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c78b5c1b-7d24-43fe-bb20-5e7930b8d437","Type":"ContainerStarted","Data":"04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a"} Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.739961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c78b5c1b-7d24-43fe-bb20-5e7930b8d437","Type":"ContainerStarted","Data":"ba5817e3ad305ea284bc6bd4e615969166dca1899d01df97ab1dcd15aa5b0c33"} Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.787321 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.787305709 podStartE2EDuration="2.787305709s" podCreationTimestamp="2026-02-19 23:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:13:53.783094827 +0000 UTC m=+6334.054537297" watchObservedRunningTime="2026-02-19 23:13:53.787305709 +0000 UTC m=+6334.058748179" Feb 19 23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.791427 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9f01e4f7-94ba-4f4e-8279-806223142e0f" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" Feb 19 
23:13:53 crc kubenswrapper[4771]: I0219 23:13:53.851209 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.292230 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.437419 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.454987 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f01e4f7-94ba-4f4e-8279-806223142e0f" path="/var/lib/kubelet/pods/9f01e4f7-94ba-4f4e-8279-806223142e0f/volumes" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.567265 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config-secret\") pod \"200f4663-56b8-47de-a506-2345c7d42ef9\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.567406 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-combined-ca-bundle\") pod \"200f4663-56b8-47de-a506-2345c7d42ef9\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.567466 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vdkm\" (UniqueName: \"kubernetes.io/projected/200f4663-56b8-47de-a506-2345c7d42ef9-kube-api-access-8vdkm\") pod \"200f4663-56b8-47de-a506-2345c7d42ef9\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.567627 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config\") pod \"200f4663-56b8-47de-a506-2345c7d42ef9\" (UID: \"200f4663-56b8-47de-a506-2345c7d42ef9\") " Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.590165 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200f4663-56b8-47de-a506-2345c7d42ef9-kube-api-access-8vdkm" (OuterVolumeSpecName: "kube-api-access-8vdkm") pod "200f4663-56b8-47de-a506-2345c7d42ef9" (UID: "200f4663-56b8-47de-a506-2345c7d42ef9"). InnerVolumeSpecName "kube-api-access-8vdkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.601450 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "200f4663-56b8-47de-a506-2345c7d42ef9" (UID: "200f4663-56b8-47de-a506-2345c7d42ef9"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.608462 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "200f4663-56b8-47de-a506-2345c7d42ef9" (UID: "200f4663-56b8-47de-a506-2345c7d42ef9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.636349 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "200f4663-56b8-47de-a506-2345c7d42ef9" (UID: "200f4663-56b8-47de-a506-2345c7d42ef9"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.669620 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.669650 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vdkm\" (UniqueName: \"kubernetes.io/projected/200f4663-56b8-47de-a506-2345c7d42ef9-kube-api-access-8vdkm\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.669661 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.669670 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/200f4663-56b8-47de-a506-2345c7d42ef9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.753904 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bfa02b55-b912-42a2-948f-69d38c0a6532","Type":"ContainerStarted","Data":"5853cc762509831732453d94956d5ea14d6fa5e5551d546a27b4e452706bd60d"} Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.755347 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerStarted","Data":"661de3c7d810df717c230f42b237f39d464d739f647a0698b130c50354364836"} Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.756993 4771 generic.go:334] "Generic (PLEG): container finished" podID="200f4663-56b8-47de-a506-2345c7d42ef9" 
containerID="98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968" exitCode=137 Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.757077 4771 scope.go:117] "RemoveContainer" containerID="98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.757193 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.762330 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b936b940-5d5a-4b62-8cd3-acff548506db","Type":"ContainerStarted","Data":"5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9"} Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.762437 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.793210 4771 scope.go:117] "RemoveContainer" containerID="98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968" Feb 19 23:13:54 crc kubenswrapper[4771]: E0219 23:13:54.796603 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968\": container with ID starting with 98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968 not found: ID does not exist" containerID="98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.796712 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968"} err="failed to get container status \"98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968\": rpc error: code = NotFound desc = could not find container 
\"98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968\": container with ID starting with 98e1927f4e2856b80540be375d847c062313fc4e366c2bdae5588ed3077eb968 not found: ID does not exist" Feb 19 23:13:54 crc kubenswrapper[4771]: I0219 23:13:54.799125 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.123519543 podStartE2EDuration="2.799106513s" podCreationTimestamp="2026-02-19 23:13:52 +0000 UTC" firstStartedPulling="2026-02-19 23:13:53.35216766 +0000 UTC m=+6333.623610130" lastFinishedPulling="2026-02-19 23:13:54.02775463 +0000 UTC m=+6334.299197100" observedRunningTime="2026-02-19 23:13:54.79113356 +0000 UTC m=+6335.062576040" watchObservedRunningTime="2026-02-19 23:13:54.799106513 +0000 UTC m=+6335.070548983" Feb 19 23:13:56 crc kubenswrapper[4771]: I0219 23:13:56.447940 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="200f4663-56b8-47de-a506-2345c7d42ef9" path="/var/lib/kubelet/pods/200f4663-56b8-47de-a506-2345c7d42ef9/volumes" Feb 19 23:14:02 crc kubenswrapper[4771]: I0219 23:14:02.546946 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 23:14:02 crc kubenswrapper[4771]: I0219 23:14:02.859228 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerStarted","Data":"d57dbb0af100fefa6dc0c38ff5846dfbf4167f7540a78ed74aba2a66451084f5"} Feb 19 23:14:02 crc kubenswrapper[4771]: I0219 23:14:02.861692 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bfa02b55-b912-42a2-948f-69d38c0a6532","Type":"ContainerStarted","Data":"b733513a693f08d02c4824ea3b7cb5ff71448021b75e6316aa029899c9b812ba"} Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.694902 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-dqfl4"] Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.697975 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.710965 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqfl4"] Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.786400 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-utilities\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.786594 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxk9k\" (UniqueName: \"kubernetes.io/projected/d55a68d9-4c48-4729-b7b9-3c2e970643b3-kube-api-access-cxk9k\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.786829 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-catalog-content\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.887757 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-utilities\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") 
" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.887904 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxk9k\" (UniqueName: \"kubernetes.io/projected/d55a68d9-4c48-4729-b7b9-3c2e970643b3-kube-api-access-cxk9k\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.887961 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-catalog-content\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.888339 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-utilities\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.888504 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-catalog-content\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:07 crc kubenswrapper[4771]: I0219 23:14:07.912727 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxk9k\" (UniqueName: \"kubernetes.io/projected/d55a68d9-4c48-4729-b7b9-3c2e970643b3-kube-api-access-cxk9k\") pod \"certified-operators-dqfl4\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " 
pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:08 crc kubenswrapper[4771]: I0219 23:14:08.026198 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:08 crc kubenswrapper[4771]: W0219 23:14:08.531831 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55a68d9_4c48_4729_b7b9_3c2e970643b3.slice/crio-f6bba52322d51571550051012a24085c6abe8e29e33f6b834b9d2f1b739a5c1d WatchSource:0}: Error finding container f6bba52322d51571550051012a24085c6abe8e29e33f6b834b9d2f1b739a5c1d: Status 404 returned error can't find the container with id f6bba52322d51571550051012a24085c6abe8e29e33f6b834b9d2f1b739a5c1d Feb 19 23:14:08 crc kubenswrapper[4771]: I0219 23:14:08.538330 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqfl4"] Feb 19 23:14:08 crc kubenswrapper[4771]: I0219 23:14:08.934283 4771 generic.go:334] "Generic (PLEG): container finished" podID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerID="bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c" exitCode=0 Feb 19 23:14:08 crc kubenswrapper[4771]: I0219 23:14:08.934379 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqfl4" event={"ID":"d55a68d9-4c48-4729-b7b9-3c2e970643b3","Type":"ContainerDied","Data":"bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c"} Feb 19 23:14:08 crc kubenswrapper[4771]: I0219 23:14:08.934580 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqfl4" event={"ID":"d55a68d9-4c48-4729-b7b9-3c2e970643b3","Type":"ContainerStarted","Data":"f6bba52322d51571550051012a24085c6abe8e29e33f6b834b9d2f1b739a5c1d"} Feb 19 23:14:08 crc kubenswrapper[4771]: I0219 23:14:08.936995 4771 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 19 23:14:09 crc kubenswrapper[4771]: I0219 23:14:09.949602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqfl4" event={"ID":"d55a68d9-4c48-4729-b7b9-3c2e970643b3","Type":"ContainerStarted","Data":"08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c"} Feb 19 23:14:09 crc kubenswrapper[4771]: I0219 23:14:09.951444 4771 generic.go:334] "Generic (PLEG): container finished" podID="bfa02b55-b912-42a2-948f-69d38c0a6532" containerID="b733513a693f08d02c4824ea3b7cb5ff71448021b75e6316aa029899c9b812ba" exitCode=0 Feb 19 23:14:09 crc kubenswrapper[4771]: I0219 23:14:09.951562 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bfa02b55-b912-42a2-948f-69d38c0a6532","Type":"ContainerDied","Data":"b733513a693f08d02c4824ea3b7cb5ff71448021b75e6316aa029899c9b812ba"} Feb 19 23:14:10 crc kubenswrapper[4771]: I0219 23:14:10.966434 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerID="d57dbb0af100fefa6dc0c38ff5846dfbf4167f7540a78ed74aba2a66451084f5" exitCode=0 Feb 19 23:14:10 crc kubenswrapper[4771]: I0219 23:14:10.966497 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerDied","Data":"d57dbb0af100fefa6dc0c38ff5846dfbf4167f7540a78ed74aba2a66451084f5"} Feb 19 23:14:11 crc kubenswrapper[4771]: I0219 23:14:11.977856 4771 generic.go:334] "Generic (PLEG): container finished" podID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerID="08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c" exitCode=0 Feb 19 23:14:11 crc kubenswrapper[4771]: I0219 23:14:11.977911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqfl4" 
event={"ID":"d55a68d9-4c48-4729-b7b9-3c2e970643b3","Type":"ContainerDied","Data":"08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c"} Feb 19 23:14:12 crc kubenswrapper[4771]: I0219 23:14:12.991608 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bfa02b55-b912-42a2-948f-69d38c0a6532","Type":"ContainerStarted","Data":"da2be2ba1bc4e485e2f3a3fea074bbd63dfe90b96891c753c41b8351906ab15c"} Feb 19 23:14:14 crc kubenswrapper[4771]: I0219 23:14:14.006720 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqfl4" event={"ID":"d55a68d9-4c48-4729-b7b9-3c2e970643b3","Type":"ContainerStarted","Data":"67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8"} Feb 19 23:14:14 crc kubenswrapper[4771]: I0219 23:14:14.033369 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dqfl4" podStartSLOduration=3.029473037 podStartE2EDuration="7.033321965s" podCreationTimestamp="2026-02-19 23:14:07 +0000 UTC" firstStartedPulling="2026-02-19 23:14:08.936770527 +0000 UTC m=+6349.208212997" lastFinishedPulling="2026-02-19 23:14:12.940619455 +0000 UTC m=+6353.212061925" observedRunningTime="2026-02-19 23:14:14.029977126 +0000 UTC m=+6354.301419646" watchObservedRunningTime="2026-02-19 23:14:14.033321965 +0000 UTC m=+6354.304764435" Feb 19 23:14:17 crc kubenswrapper[4771]: I0219 23:14:17.034930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"bfa02b55-b912-42a2-948f-69d38c0a6532","Type":"ContainerStarted","Data":"851436fbbcaf19ef434287e782f83ed6ad848e90238bba46c976e08720299795"} Feb 19 23:14:17 crc kubenswrapper[4771]: I0219 23:14:17.082355 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.439973164 podStartE2EDuration="25.082335589s" 
podCreationTimestamp="2026-02-19 23:13:52 +0000 UTC" firstStartedPulling="2026-02-19 23:13:53.963185508 +0000 UTC m=+6334.234627978" lastFinishedPulling="2026-02-19 23:14:12.605547933 +0000 UTC m=+6352.876990403" observedRunningTime="2026-02-19 23:14:17.073107002 +0000 UTC m=+6357.344549482" watchObservedRunningTime="2026-02-19 23:14:17.082335589 +0000 UTC m=+6357.353778059" Feb 19 23:14:18 crc kubenswrapper[4771]: I0219 23:14:18.026693 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:18 crc kubenswrapper[4771]: I0219 23:14:18.026763 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:18 crc kubenswrapper[4771]: I0219 23:14:18.050075 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:14:18 crc kubenswrapper[4771]: I0219 23:14:18.054274 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:14:19 crc kubenswrapper[4771]: I0219 23:14:19.071719 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerStarted","Data":"7cb01e9b7d014c7c0707d2ac69f5dedc9b8c34f35cde4fdafcbf592a37abd6cf"} Feb 19 23:14:19 crc kubenswrapper[4771]: I0219 23:14:19.095948 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dqfl4" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="registry-server" probeResult="failure" output=< Feb 19 23:14:19 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:14:19 crc kubenswrapper[4771]: > Feb 19 23:14:22 crc kubenswrapper[4771]: I0219 23:14:22.054817 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-2da2-account-create-update-rjbps"] Feb 19 23:14:22 crc kubenswrapper[4771]: I0219 23:14:22.071800 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kkr59"] Feb 19 23:14:22 crc kubenswrapper[4771]: I0219 23:14:22.084570 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kkr59"] Feb 19 23:14:22 crc kubenswrapper[4771]: I0219 23:14:22.095986 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2da2-account-create-update-rjbps"] Feb 19 23:14:22 crc kubenswrapper[4771]: I0219 23:14:22.450591 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367ee05c-cc9f-4e74-b756-4192639e94a9" path="/var/lib/kubelet/pods/367ee05c-cc9f-4e74-b756-4192639e94a9/volumes" Feb 19 23:14:22 crc kubenswrapper[4771]: I0219 23:14:22.451405 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d0887b-38d5-4c3d-8ade-150f8c40e3e1" path="/var/lib/kubelet/pods/45d0887b-38d5-4c3d-8ade-150f8c40e3e1/volumes" Feb 19 23:14:24 crc kubenswrapper[4771]: I0219 23:14:24.134428 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerStarted","Data":"b322ebb00eea3bf7450b9143116636bb4642ef82795f9055c3939402513a9518"} Feb 19 23:14:27 crc kubenswrapper[4771]: I0219 23:14:27.188254 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerStarted","Data":"1c70752a80593758ff43726b8982f0f78d1fe906bc58dae4957ff1b91fecafa0"} Feb 19 23:14:27 crc kubenswrapper[4771]: I0219 23:14:27.236387 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=2.796704599 podStartE2EDuration="35.236355613s" podCreationTimestamp="2026-02-19 23:13:52 +0000 UTC" 
firstStartedPulling="2026-02-19 23:13:54.289815486 +0000 UTC m=+6334.561257956" lastFinishedPulling="2026-02-19 23:14:26.7294665 +0000 UTC m=+6367.000908970" observedRunningTime="2026-02-19 23:14:27.216772061 +0000 UTC m=+6367.488214621" watchObservedRunningTime="2026-02-19 23:14:27.236355613 +0000 UTC m=+6367.507798123" Feb 19 23:14:28 crc kubenswrapper[4771]: I0219 23:14:28.151307 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:28 crc kubenswrapper[4771]: I0219 23:14:28.228979 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:28 crc kubenswrapper[4771]: I0219 23:14:28.411015 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqfl4"] Feb 19 23:14:28 crc kubenswrapper[4771]: I0219 23:14:28.673125 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:29 crc kubenswrapper[4771]: I0219 23:14:29.225994 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dqfl4" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="registry-server" containerID="cri-o://67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8" gracePeriod=2 Feb 19 23:14:29 crc kubenswrapper[4771]: I0219 23:14:29.852258 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:29 crc kubenswrapper[4771]: I0219 23:14:29.939142 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxk9k\" (UniqueName: \"kubernetes.io/projected/d55a68d9-4c48-4729-b7b9-3c2e970643b3-kube-api-access-cxk9k\") pod \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " Feb 19 23:14:29 crc kubenswrapper[4771]: I0219 23:14:29.939668 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-catalog-content\") pod \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " Feb 19 23:14:29 crc kubenswrapper[4771]: I0219 23:14:29.939987 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-utilities\") pod \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\" (UID: \"d55a68d9-4c48-4729-b7b9-3c2e970643b3\") " Feb 19 23:14:29 crc kubenswrapper[4771]: I0219 23:14:29.941671 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-utilities" (OuterVolumeSpecName: "utilities") pod "d55a68d9-4c48-4729-b7b9-3c2e970643b3" (UID: "d55a68d9-4c48-4729-b7b9-3c2e970643b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:29 crc kubenswrapper[4771]: I0219 23:14:29.959290 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55a68d9-4c48-4729-b7b9-3c2e970643b3-kube-api-access-cxk9k" (OuterVolumeSpecName: "kube-api-access-cxk9k") pod "d55a68d9-4c48-4729-b7b9-3c2e970643b3" (UID: "d55a68d9-4c48-4729-b7b9-3c2e970643b3"). InnerVolumeSpecName "kube-api-access-cxk9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.025028 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d55a68d9-4c48-4729-b7b9-3c2e970643b3" (UID: "d55a68d9-4c48-4729-b7b9-3c2e970643b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.035810 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tg6g8"] Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.042668 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxk9k\" (UniqueName: \"kubernetes.io/projected/d55a68d9-4c48-4729-b7b9-3c2e970643b3-kube-api-access-cxk9k\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.042711 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.042725 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55a68d9-4c48-4729-b7b9-3c2e970643b3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.046372 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tg6g8"] Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.248319 4771 generic.go:334] "Generic (PLEG): container finished" podID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerID="67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8" exitCode=0 Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.248392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-dqfl4" event={"ID":"d55a68d9-4c48-4729-b7b9-3c2e970643b3","Type":"ContainerDied","Data":"67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8"} Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.248416 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqfl4" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.248434 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqfl4" event={"ID":"d55a68d9-4c48-4729-b7b9-3c2e970643b3","Type":"ContainerDied","Data":"f6bba52322d51571550051012a24085c6abe8e29e33f6b834b9d2f1b739a5c1d"} Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.248467 4771 scope.go:117] "RemoveContainer" containerID="67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.299940 4771 scope.go:117] "RemoveContainer" containerID="08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.306550 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqfl4"] Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.318663 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dqfl4"] Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.328359 4771 scope.go:117] "RemoveContainer" containerID="bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c" Feb 19 23:14:30 crc kubenswrapper[4771]: E0219 23:14:30.347867 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55a68d9_4c48_4729_b7b9_3c2e970643b3.slice\": RecentStats: unable to find data in memory cache]" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 
23:14:30.383726 4771 scope.go:117] "RemoveContainer" containerID="67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8" Feb 19 23:14:30 crc kubenswrapper[4771]: E0219 23:14:30.384340 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8\": container with ID starting with 67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8 not found: ID does not exist" containerID="67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.384380 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8"} err="failed to get container status \"67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8\": rpc error: code = NotFound desc = could not find container \"67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8\": container with ID starting with 67f3ad64c61ebadc07efa07df6be8fd53ea0d560756b6802b2a58740cbfc47e8 not found: ID does not exist" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.384410 4771 scope.go:117] "RemoveContainer" containerID="08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c" Feb 19 23:14:30 crc kubenswrapper[4771]: E0219 23:14:30.384764 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c\": container with ID starting with 08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c not found: ID does not exist" containerID="08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.384803 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c"} err="failed to get container status \"08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c\": rpc error: code = NotFound desc = could not find container \"08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c\": container with ID starting with 08775d22acd2f3b14a7fbf8dbff6f3dfe0ea85dbc5e3b759bb8e2813bbafd02c not found: ID does not exist" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.384830 4771 scope.go:117] "RemoveContainer" containerID="bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c" Feb 19 23:14:30 crc kubenswrapper[4771]: E0219 23:14:30.385136 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c\": container with ID starting with bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c not found: ID does not exist" containerID="bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.385164 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c"} err="failed to get container status \"bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c\": rpc error: code = NotFound desc = could not find container \"bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c\": container with ID starting with bb871832f71c8110568fbcc6173a4601f2d35d7c1b7d054b9ce06071aef2aa8c not found: ID does not exist" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 23:14:30.455319 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fe0c371-c296-414d-958a-7db5115e6f69" path="/var/lib/kubelet/pods/0fe0c371-c296-414d-958a-7db5115e6f69/volumes" Feb 19 23:14:30 crc kubenswrapper[4771]: I0219 
23:14:30.455976 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" path="/var/lib/kubelet/pods/d55a68d9-4c48-4729-b7b9-3c2e970643b3/volumes" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.505794 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:14:33 crc kubenswrapper[4771]: E0219 23:14:33.506728 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="extract-utilities" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.506744 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="extract-utilities" Feb 19 23:14:33 crc kubenswrapper[4771]: E0219 23:14:33.506761 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="registry-server" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.506767 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="registry-server" Feb 19 23:14:33 crc kubenswrapper[4771]: E0219 23:14:33.506778 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="extract-content" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.506784 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="extract-content" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.506971 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55a68d9-4c48-4729-b7b9-3c2e970643b3" containerName="registry-server" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.509263 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.515252 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.515470 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.520993 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.623858 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.623910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.623947 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-scripts\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.624154 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-log-httpd\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " 
pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.624256 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-run-httpd\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.624344 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhxpt\" (UniqueName: \"kubernetes.io/projected/cd66242b-d76e-4210-868b-97a78a3c93a3-kube-api-access-qhxpt\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.624410 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-config-data\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.725931 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.725972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.725997 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-scripts\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.726128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-log-httpd\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.726171 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-run-httpd\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.726217 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhxpt\" (UniqueName: \"kubernetes.io/projected/cd66242b-d76e-4210-868b-97a78a3c93a3-kube-api-access-qhxpt\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.726241 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-config-data\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.726762 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-log-httpd\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc 
kubenswrapper[4771]: I0219 23:14:33.726970 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-run-httpd\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.731878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.737883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-scripts\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.738509 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.739226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-config-data\") pod \"ceilometer-0\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.758105 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhxpt\" (UniqueName: \"kubernetes.io/projected/cd66242b-d76e-4210-868b-97a78a3c93a3-kube-api-access-qhxpt\") pod \"ceilometer-0\" (UID: 
\"cd66242b-d76e-4210-868b-97a78a3c93a3\") " pod="openstack/ceilometer-0" Feb 19 23:14:33 crc kubenswrapper[4771]: I0219 23:14:33.835444 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:14:34 crc kubenswrapper[4771]: I0219 23:14:34.340840 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:14:34 crc kubenswrapper[4771]: W0219 23:14:34.347147 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd66242b_d76e_4210_868b_97a78a3c93a3.slice/crio-01dc72a56d477dbdccbda97f0887a6c37c3b2db6691fb4346d8298702a2b2d68 WatchSource:0}: Error finding container 01dc72a56d477dbdccbda97f0887a6c37c3b2db6691fb4346d8298702a2b2d68: Status 404 returned error can't find the container with id 01dc72a56d477dbdccbda97f0887a6c37c3b2db6691fb4346d8298702a2b2d68 Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.311160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerStarted","Data":"01dc72a56d477dbdccbda97f0887a6c37c3b2db6691fb4346d8298702a2b2d68"} Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.665370 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hnk4q"] Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.667532 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.679250 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnk4q"] Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.770065 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-catalog-content\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.770305 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-utilities\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.770561 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjvm\" (UniqueName: \"kubernetes.io/projected/375555d2-d8a8-4968-aaff-979d372e4862-kube-api-access-qqjvm\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.872366 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjvm\" (UniqueName: \"kubernetes.io/projected/375555d2-d8a8-4968-aaff-979d372e4862-kube-api-access-qqjvm\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.872567 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-catalog-content\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.872633 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-utilities\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.873085 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-catalog-content\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.873215 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-utilities\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:35 crc kubenswrapper[4771]: I0219 23:14:35.905353 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjvm\" (UniqueName: \"kubernetes.io/projected/375555d2-d8a8-4968-aaff-979d372e4862-kube-api-access-qqjvm\") pod \"redhat-marketplace-hnk4q\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:36 crc kubenswrapper[4771]: I0219 23:14:36.023842 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:36 crc kubenswrapper[4771]: I0219 23:14:36.356535 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerStarted","Data":"db16fdf7b031bfe8805ef313014646a977198661c66f25c462d287a5d5fbf064"} Feb 19 23:14:36 crc kubenswrapper[4771]: I0219 23:14:36.514217 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnk4q"] Feb 19 23:14:37 crc kubenswrapper[4771]: I0219 23:14:37.386698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerStarted","Data":"111ab8bb94ccbbb4862754acfe668de288d22e0308afa73e3ff6719cc9941de8"} Feb 19 23:14:37 crc kubenswrapper[4771]: I0219 23:14:37.387000 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerStarted","Data":"b3613a957f67a39240971d786a3be85f30d14c5c3db7b7133d5d98c2c7de3549"} Feb 19 23:14:37 crc kubenswrapper[4771]: I0219 23:14:37.388581 4771 generic.go:334] "Generic (PLEG): container finished" podID="375555d2-d8a8-4968-aaff-979d372e4862" containerID="d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b" exitCode=0 Feb 19 23:14:37 crc kubenswrapper[4771]: I0219 23:14:37.388650 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnk4q" event={"ID":"375555d2-d8a8-4968-aaff-979d372e4862","Type":"ContainerDied","Data":"d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b"} Feb 19 23:14:37 crc kubenswrapper[4771]: I0219 23:14:37.388678 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnk4q" 
event={"ID":"375555d2-d8a8-4968-aaff-979d372e4862","Type":"ContainerStarted","Data":"0fd913b1ddc6e0852ba3cafd0a22f124c880c374580dffc77b6d901acf7db26b"} Feb 19 23:14:38 crc kubenswrapper[4771]: I0219 23:14:38.400176 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnk4q" event={"ID":"375555d2-d8a8-4968-aaff-979d372e4862","Type":"ContainerStarted","Data":"d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613"} Feb 19 23:14:38 crc kubenswrapper[4771]: I0219 23:14:38.672973 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:38 crc kubenswrapper[4771]: I0219 23:14:38.675378 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:39 crc kubenswrapper[4771]: I0219 23:14:39.424380 4771 generic.go:334] "Generic (PLEG): container finished" podID="375555d2-d8a8-4968-aaff-979d372e4862" containerID="d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613" exitCode=0 Feb 19 23:14:39 crc kubenswrapper[4771]: I0219 23:14:39.424474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnk4q" event={"ID":"375555d2-d8a8-4968-aaff-979d372e4862","Type":"ContainerDied","Data":"d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613"} Feb 19 23:14:39 crc kubenswrapper[4771]: I0219 23:14:39.439098 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerStarted","Data":"6f375f55917a4c6010a4a01b88605c49c88300e815878b404b21e8b3cce1030e"} Feb 19 23:14:39 crc kubenswrapper[4771]: I0219 23:14:39.439148 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 23:14:39 crc kubenswrapper[4771]: I0219 23:14:39.440924 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:39 crc kubenswrapper[4771]: I0219 23:14:39.538262 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.51190539 podStartE2EDuration="6.538242478s" podCreationTimestamp="2026-02-19 23:14:33 +0000 UTC" firstStartedPulling="2026-02-19 23:14:34.352300106 +0000 UTC m=+6374.623742616" lastFinishedPulling="2026-02-19 23:14:38.378637234 +0000 UTC m=+6378.650079704" observedRunningTime="2026-02-19 23:14:39.477313384 +0000 UTC m=+6379.748755864" watchObservedRunningTime="2026-02-19 23:14:39.538242478 +0000 UTC m=+6379.809684948" Feb 19 23:14:40 crc kubenswrapper[4771]: I0219 23:14:40.470477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnk4q" event={"ID":"375555d2-d8a8-4968-aaff-979d372e4862","Type":"ContainerStarted","Data":"fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3"} Feb 19 23:14:40 crc kubenswrapper[4771]: I0219 23:14:40.499081 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hnk4q" podStartSLOduration=3.056792304 podStartE2EDuration="5.499067173s" podCreationTimestamp="2026-02-19 23:14:35 +0000 UTC" firstStartedPulling="2026-02-19 23:14:37.390576383 +0000 UTC m=+6377.662018853" lastFinishedPulling="2026-02-19 23:14:39.832851252 +0000 UTC m=+6380.104293722" observedRunningTime="2026-02-19 23:14:40.489121787 +0000 UTC m=+6380.760564257" watchObservedRunningTime="2026-02-19 23:14:40.499067173 +0000 UTC m=+6380.770509633" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.713897 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.714387 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" 
containerName="openstackclient" containerID="cri-o://04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a" gracePeriod=2 Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.717956 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" podUID="be5018a6-07d3-4cc0-9215-dd2551ef18b1" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.722744 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.731915 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 23:14:41 crc kubenswrapper[4771]: E0219 23:14:41.732341 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" containerName="openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.732361 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" containerName="openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.732571 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" containerName="openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.733268 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.749656 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.808549 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be5018a6-07d3-4cc0-9215-dd2551ef18b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.808621 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td589\" (UniqueName: \"kubernetes.io/projected/be5018a6-07d3-4cc0-9215-dd2551ef18b1-kube-api-access-td589\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.808706 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be5018a6-07d3-4cc0-9215-dd2551ef18b1-openstack-config\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.808797 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5018a6-07d3-4cc0-9215-dd2551ef18b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.911174 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/be5018a6-07d3-4cc0-9215-dd2551ef18b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.911256 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td589\" (UniqueName: \"kubernetes.io/projected/be5018a6-07d3-4cc0-9215-dd2551ef18b1-kube-api-access-td589\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.911303 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be5018a6-07d3-4cc0-9215-dd2551ef18b1-openstack-config\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.911365 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5018a6-07d3-4cc0-9215-dd2551ef18b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.913736 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be5018a6-07d3-4cc0-9215-dd2551ef18b1-openstack-config\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.934429 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be5018a6-07d3-4cc0-9215-dd2551ef18b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") 
" pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.934559 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5018a6-07d3-4cc0-9215-dd2551ef18b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:41 crc kubenswrapper[4771]: I0219 23:14:41.944566 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td589\" (UniqueName: \"kubernetes.io/projected/be5018a6-07d3-4cc0-9215-dd2551ef18b1-kube-api-access-td589\") pod \"openstackclient\" (UID: \"be5018a6-07d3-4cc0-9215-dd2551ef18b1\") " pod="openstack/openstackclient" Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.073211 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.342260 4771 scope.go:117] "RemoveContainer" containerID="921dea9be940b67f1f05a4e7409d4893f28e8bdb13c4d62fe5e88341b2ee3244" Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.364221 4771 scope.go:117] "RemoveContainer" containerID="4e5a179519a014ab779faf8f4789a6e0e9f877abdf333ebf92736f211faa5dae" Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.427472 4771 scope.go:117] "RemoveContainer" containerID="547c91ba61d2195d904a75511570c54b6870927108472ba5802873f1e9f07143" Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.686183 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 23:14:42 crc kubenswrapper[4771]: W0219 23:14:42.691598 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe5018a6_07d3_4cc0_9215_dd2551ef18b1.slice/crio-1f1898012345d576265661ed52ac63d84b95d58362cb0445f19692c2100cab1d WatchSource:0}: Error finding container 
1f1898012345d576265661ed52ac63d84b95d58362cb0445f19692c2100cab1d: Status 404 returned error can't find the container with id 1f1898012345d576265661ed52ac63d84b95d58362cb0445f19692c2100cab1d Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.919036 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.919285 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="prometheus" containerID="cri-o://7cb01e9b7d014c7c0707d2ac69f5dedc9b8c34f35cde4fdafcbf592a37abd6cf" gracePeriod=600 Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.919662 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="thanos-sidecar" containerID="cri-o://1c70752a80593758ff43726b8982f0f78d1fe906bc58dae4957ff1b91fecafa0" gracePeriod=600 Feb 19 23:14:42 crc kubenswrapper[4771]: I0219 23:14:42.919709 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="config-reloader" containerID="cri-o://b322ebb00eea3bf7450b9143116636bb4642ef82795f9055c3939402513a9518" gracePeriod=600 Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.509519 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"be5018a6-07d3-4cc0-9215-dd2551ef18b1","Type":"ContainerStarted","Data":"5e0bef130828bf6ce2db109ce89fbe44a4c26c7e3add7b5eba743f718049c9fd"} Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.509897 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"be5018a6-07d3-4cc0-9215-dd2551ef18b1","Type":"ContainerStarted","Data":"1f1898012345d576265661ed52ac63d84b95d58362cb0445f19692c2100cab1d"} Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.514926 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerID="1c70752a80593758ff43726b8982f0f78d1fe906bc58dae4957ff1b91fecafa0" exitCode=0 Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.514956 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerID="b322ebb00eea3bf7450b9143116636bb4642ef82795f9055c3939402513a9518" exitCode=0 Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.514968 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerID="7cb01e9b7d014c7c0707d2ac69f5dedc9b8c34f35cde4fdafcbf592a37abd6cf" exitCode=0 Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.514990 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerDied","Data":"1c70752a80593758ff43726b8982f0f78d1fe906bc58dae4957ff1b91fecafa0"} Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.515033 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerDied","Data":"b322ebb00eea3bf7450b9143116636bb4642ef82795f9055c3939402513a9518"} Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.515048 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerDied","Data":"7cb01e9b7d014c7c0707d2ac69f5dedc9b8c34f35cde4fdafcbf592a37abd6cf"} Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.528815 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstackclient" podStartSLOduration=2.528792991 podStartE2EDuration="2.528792991s" podCreationTimestamp="2026-02-19 23:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:14:43.523793619 +0000 UTC m=+6383.795236089" watchObservedRunningTime="2026-02-19 23:14:43.528792991 +0000 UTC m=+6383.800235461" Feb 19 23:14:43 crc kubenswrapper[4771]: I0219 23:14:43.672855 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.155:9090/-/ready\": dial tcp 10.217.1.155:9090: connect: connection refused" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.037375 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.042923 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.074905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-web-config\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.074996 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-tls-assets\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.075237 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.075268 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.075292 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-thanos-prometheus-http-client-file\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.075348 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-2\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.075373 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-0\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.075464 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config-out\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.075499 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnxv8\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-kube-api-access-fnxv8\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.075549 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-1\") pod \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\" (UID: \"1e348612-2ed2-4bfe-a089-d529aa53fb2e\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.076583 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.081664 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.081949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.112005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config-out" (OuterVolumeSpecName: "config-out") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.113386 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-kube-api-access-fnxv8" (OuterVolumeSpecName: "kube-api-access-fnxv8") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "kube-api-access-fnxv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.114070 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.133404 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config" (OuterVolumeSpecName: "config") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.139175 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.161786 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "pvc-9fb925b7-7184-456f-9972-08997e1df2ac". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.178631 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnx6l\" (UniqueName: \"kubernetes.io/projected/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-kube-api-access-nnx6l\") pod \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.178735 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config\") pod \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.178938 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config-secret\") pod \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.178985 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-combined-ca-bundle\") pod \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\" (UID: \"c78b5c1b-7d24-43fe-bb20-5e7930b8d437\") " 
Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179396 4771 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config-out\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179415 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnxv8\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-kube-api-access-fnxv8\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179425 4771 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179435 4771 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e348612-2ed2-4bfe-a089-d529aa53fb2e-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179454 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") on node \"crc\" " Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179464 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179475 4771 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 19 
23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179484 4771 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.179493 4771 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e348612-2ed2-4bfe-a089-d529aa53fb2e-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.192205 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-web-config" (OuterVolumeSpecName: "web-config") pod "1e348612-2ed2-4bfe-a089-d529aa53fb2e" (UID: "1e348612-2ed2-4bfe-a089-d529aa53fb2e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.195170 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-kube-api-access-nnx6l" (OuterVolumeSpecName: "kube-api-access-nnx6l") pod "c78b5c1b-7d24-43fe-bb20-5e7930b8d437" (UID: "c78b5c1b-7d24-43fe-bb20-5e7930b8d437"). InnerVolumeSpecName "kube-api-access-nnx6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.230594 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c78b5c1b-7d24-43fe-bb20-5e7930b8d437" (UID: "c78b5c1b-7d24-43fe-bb20-5e7930b8d437"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.245595 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c78b5c1b-7d24-43fe-bb20-5e7930b8d437" (UID: "c78b5c1b-7d24-43fe-bb20-5e7930b8d437"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.260595 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c78b5c1b-7d24-43fe-bb20-5e7930b8d437" (UID: "c78b5c1b-7d24-43fe-bb20-5e7930b8d437"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.267606 4771 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.267774 4771 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9fb925b7-7184-456f-9972-08997e1df2ac" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac") on node "crc" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.281070 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.281104 4771 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e348612-2ed2-4bfe-a089-d529aa53fb2e-web-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.281135 4771 reconciler_common.go:293] "Volume detached for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.281147 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.281157 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.281165 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnx6l\" (UniqueName: \"kubernetes.io/projected/c78b5c1b-7d24-43fe-bb20-5e7930b8d437-kube-api-access-nnx6l\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:44 crc 
kubenswrapper[4771]: I0219 23:14:44.448387 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" path="/var/lib/kubelet/pods/c78b5c1b-7d24-43fe-bb20-5e7930b8d437/volumes" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.527774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e348612-2ed2-4bfe-a089-d529aa53fb2e","Type":"ContainerDied","Data":"661de3c7d810df717c230f42b237f39d464d739f647a0698b130c50354364836"} Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.527829 4771 scope.go:117] "RemoveContainer" containerID="1c70752a80593758ff43726b8982f0f78d1fe906bc58dae4957ff1b91fecafa0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.527904 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.530221 4771 generic.go:334] "Generic (PLEG): container finished" podID="c78b5c1b-7d24-43fe-bb20-5e7930b8d437" containerID="04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a" exitCode=137 Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.530339 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.555969 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.563031 4771 scope.go:117] "RemoveContainer" containerID="b322ebb00eea3bf7450b9143116636bb4642ef82795f9055c3939402513a9518" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.564740 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.590270 4771 scope.go:117] "RemoveContainer" containerID="7cb01e9b7d014c7c0707d2ac69f5dedc9b8c34f35cde4fdafcbf592a37abd6cf" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.597640 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:14:44 crc kubenswrapper[4771]: E0219 23:14:44.598052 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="thanos-sidecar" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.598066 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="thanos-sidecar" Feb 19 23:14:44 crc kubenswrapper[4771]: E0219 23:14:44.598092 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="config-reloader" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.598098 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="config-reloader" Feb 19 23:14:44 crc kubenswrapper[4771]: E0219 23:14:44.598122 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="prometheus" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.598128 4771 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="prometheus" Feb 19 23:14:44 crc kubenswrapper[4771]: E0219 23:14:44.598141 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="init-config-reloader" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.598148 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="init-config-reloader" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.598336 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="prometheus" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.598360 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="config-reloader" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.598373 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" containerName="thanos-sidecar" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.602113 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.606034 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.606475 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.606576 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.607143 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fgwfl" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.607724 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.607760 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.608566 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.608660 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.613722 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.619702 4771 scope.go:117] "RemoveContainer" containerID="d57dbb0af100fefa6dc0c38ff5846dfbf4167f7540a78ed74aba2a66451084f5" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.642529 4771 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.672221 4771 scope.go:117] "RemoveContainer" containerID="04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688504 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688550 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/70d28726-108a-4624-9dfb-8300d74cf9e0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688576 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688605 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hzx\" (UniqueName: \"kubernetes.io/projected/70d28726-108a-4624-9dfb-8300d74cf9e0-kube-api-access-v4hzx\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688671 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688739 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688798 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688824 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/70d28726-108a-4624-9dfb-8300d74cf9e0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688842 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.688862 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-config\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 
23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.715997 4771 scope.go:117] "RemoveContainer" containerID="04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a" Feb 19 23:14:44 crc kubenswrapper[4771]: E0219 23:14:44.718745 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a\": container with ID starting with 04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a not found: ID does not exist" containerID="04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.718785 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a"} err="failed to get container status \"04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a\": rpc error: code = NotFound desc = could not find container \"04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a\": container with ID starting with 04c43e037df30cb0700588414ac162911fa940a5a0164539dcbe90e661615d0a not found: ID does not exist" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.790791 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.790841 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/70d28726-108a-4624-9dfb-8300d74cf9e0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.790864 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.790890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hzx\" (UniqueName: \"kubernetes.io/projected/70d28726-108a-4624-9dfb-8300d74cf9e0-kube-api-access-v4hzx\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.790922 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.790950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.790972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.790987 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.791031 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.791083 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.791106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/70d28726-108a-4624-9dfb-8300d74cf9e0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.791125 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.791147 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-config\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.792072 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.792799 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.793294 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/70d28726-108a-4624-9dfb-8300d74cf9e0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 
crc kubenswrapper[4771]: I0219 23:14:44.797647 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/70d28726-108a-4624-9dfb-8300d74cf9e0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.798080 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.798405 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.798430 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/70d28726-108a-4624-9dfb-8300d74cf9e0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.798624 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.798855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.800306 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.800764 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.800864 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/49abca73e33cf3563ba99cec27df5d404cbd23e5b70983fd7923117bd3eb244c/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.801378 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/70d28726-108a-4624-9dfb-8300d74cf9e0-config\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.812930 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hzx\" (UniqueName: \"kubernetes.io/projected/70d28726-108a-4624-9dfb-8300d74cf9e0-kube-api-access-v4hzx\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.846257 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9fb925b7-7184-456f-9972-08997e1df2ac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb925b7-7184-456f-9972-08997e1df2ac\") pod \"prometheus-metric-storage-0\" (UID: \"70d28726-108a-4624-9dfb-8300d74cf9e0\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:44 crc kubenswrapper[4771]: I0219 23:14:44.924797 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.159306 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-z49xb"] Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.161200 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.200209 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-z49xb"] Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.201481 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abecce5-baef-4eca-b5de-b2152e68114a-operator-scripts\") pod \"aodh-db-create-z49xb\" (UID: \"0abecce5-baef-4eca-b5de-b2152e68114a\") " pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.201649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqkjx\" (UniqueName: \"kubernetes.io/projected/0abecce5-baef-4eca-b5de-b2152e68114a-kube-api-access-sqkjx\") pod \"aodh-db-create-z49xb\" (UID: \"0abecce5-baef-4eca-b5de-b2152e68114a\") " pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.303538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqkjx\" (UniqueName: \"kubernetes.io/projected/0abecce5-baef-4eca-b5de-b2152e68114a-kube-api-access-sqkjx\") pod \"aodh-db-create-z49xb\" (UID: \"0abecce5-baef-4eca-b5de-b2152e68114a\") " pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.303609 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0abecce5-baef-4eca-b5de-b2152e68114a-operator-scripts\") pod \"aodh-db-create-z49xb\" (UID: \"0abecce5-baef-4eca-b5de-b2152e68114a\") " pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.304306 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abecce5-baef-4eca-b5de-b2152e68114a-operator-scripts\") pod \"aodh-db-create-z49xb\" (UID: \"0abecce5-baef-4eca-b5de-b2152e68114a\") " pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.323444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqkjx\" (UniqueName: \"kubernetes.io/projected/0abecce5-baef-4eca-b5de-b2152e68114a-kube-api-access-sqkjx\") pod \"aodh-db-create-z49xb\" (UID: \"0abecce5-baef-4eca-b5de-b2152e68114a\") " pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.368233 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-dd81-account-create-update-wvzrn"] Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.369596 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.371326 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.397070 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-dd81-account-create-update-wvzrn"] Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.405861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd353262-ccb6-422b-8f66-19091f7f34dd-operator-scripts\") pod \"aodh-dd81-account-create-update-wvzrn\" (UID: \"bd353262-ccb6-422b-8f66-19091f7f34dd\") " pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.406077 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lft5f\" (UniqueName: \"kubernetes.io/projected/bd353262-ccb6-422b-8f66-19091f7f34dd-kube-api-access-lft5f\") pod \"aodh-dd81-account-create-update-wvzrn\" (UID: \"bd353262-ccb6-422b-8f66-19091f7f34dd\") " pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.479971 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.487468 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.507842 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lft5f\" (UniqueName: \"kubernetes.io/projected/bd353262-ccb6-422b-8f66-19091f7f34dd-kube-api-access-lft5f\") pod \"aodh-dd81-account-create-update-wvzrn\" (UID: \"bd353262-ccb6-422b-8f66-19091f7f34dd\") " pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.507965 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd353262-ccb6-422b-8f66-19091f7f34dd-operator-scripts\") pod \"aodh-dd81-account-create-update-wvzrn\" (UID: \"bd353262-ccb6-422b-8f66-19091f7f34dd\") " pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.508622 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd353262-ccb6-422b-8f66-19091f7f34dd-operator-scripts\") pod \"aodh-dd81-account-create-update-wvzrn\" (UID: \"bd353262-ccb6-422b-8f66-19091f7f34dd\") " pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:45 crc kubenswrapper[4771]: W0219 23:14:45.530861 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70d28726_108a_4624_9dfb_8300d74cf9e0.slice/crio-8d684a2ddb052c901968a2021fdb5912d5993d81681d974c87a8902bacf4afab WatchSource:0}: Error finding container 8d684a2ddb052c901968a2021fdb5912d5993d81681d974c87a8902bacf4afab: Status 404 returned error can't find the container with id 8d684a2ddb052c901968a2021fdb5912d5993d81681d974c87a8902bacf4afab Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.535427 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lft5f\" 
(UniqueName: \"kubernetes.io/projected/bd353262-ccb6-422b-8f66-19091f7f34dd-kube-api-access-lft5f\") pod \"aodh-dd81-account-create-update-wvzrn\" (UID: \"bd353262-ccb6-422b-8f66-19091f7f34dd\") " pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.597255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"70d28726-108a-4624-9dfb-8300d74cf9e0","Type":"ContainerStarted","Data":"8d684a2ddb052c901968a2021fdb5912d5993d81681d974c87a8902bacf4afab"} Feb 19 23:14:45 crc kubenswrapper[4771]: I0219 23:14:45.698648 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.024877 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.025204 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.045391 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-z49xb"] Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.096225 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.209575 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-dd81-account-create-update-wvzrn"] Feb 19 23:14:46 crc kubenswrapper[4771]: W0219 23:14:46.219435 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd353262_ccb6_422b_8f66_19091f7f34dd.slice/crio-940ed6c2963281caa76f1986c34e135a39c923867ee205f441fe26e054d60145 WatchSource:0}: Error finding 
container 940ed6c2963281caa76f1986c34e135a39c923867ee205f441fe26e054d60145: Status 404 returned error can't find the container with id 940ed6c2963281caa76f1986c34e135a39c923867ee205f441fe26e054d60145 Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.468596 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e348612-2ed2-4bfe-a089-d529aa53fb2e" path="/var/lib/kubelet/pods/1e348612-2ed2-4bfe-a089-d529aa53fb2e/volumes" Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.608518 4771 generic.go:334] "Generic (PLEG): container finished" podID="0abecce5-baef-4eca-b5de-b2152e68114a" containerID="4b384c0ea0f5f9cb5bfbda905e1adedd54736efd09b3d97c0217bb263d79af9b" exitCode=0 Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.608576 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z49xb" event={"ID":"0abecce5-baef-4eca-b5de-b2152e68114a","Type":"ContainerDied","Data":"4b384c0ea0f5f9cb5bfbda905e1adedd54736efd09b3d97c0217bb263d79af9b"} Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.608632 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z49xb" event={"ID":"0abecce5-baef-4eca-b5de-b2152e68114a","Type":"ContainerStarted","Data":"0d8413121424636bacd223061f14740a9f5c8479129983276b91588c45d601d7"} Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.609730 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dd81-account-create-update-wvzrn" event={"ID":"bd353262-ccb6-422b-8f66-19091f7f34dd","Type":"ContainerStarted","Data":"50aeb47c2615bfd1b17f343aae62951925270370c528d7d2b9e9ef4b2757564e"} Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.609768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dd81-account-create-update-wvzrn" event={"ID":"bd353262-ccb6-422b-8f66-19091f7f34dd","Type":"ContainerStarted","Data":"940ed6c2963281caa76f1986c34e135a39c923867ee205f441fe26e054d60145"} Feb 19 23:14:46 crc 
kubenswrapper[4771]: I0219 23:14:46.643322 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-dd81-account-create-update-wvzrn" podStartSLOduration=1.643306631 podStartE2EDuration="1.643306631s" podCreationTimestamp="2026-02-19 23:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:14:46.636695285 +0000 UTC m=+6386.908137755" watchObservedRunningTime="2026-02-19 23:14:46.643306631 +0000 UTC m=+6386.914749101" Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.671304 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:46 crc kubenswrapper[4771]: I0219 23:14:46.716444 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnk4q"] Feb 19 23:14:47 crc kubenswrapper[4771]: I0219 23:14:47.625123 4771 generic.go:334] "Generic (PLEG): container finished" podID="bd353262-ccb6-422b-8f66-19091f7f34dd" containerID="50aeb47c2615bfd1b17f343aae62951925270370c528d7d2b9e9ef4b2757564e" exitCode=0 Feb 19 23:14:47 crc kubenswrapper[4771]: I0219 23:14:47.625182 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dd81-account-create-update-wvzrn" event={"ID":"bd353262-ccb6-422b-8f66-19091f7f34dd","Type":"ContainerDied","Data":"50aeb47c2615bfd1b17f343aae62951925270370c528d7d2b9e9ef4b2757564e"} Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.033687 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.165801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abecce5-baef-4eca-b5de-b2152e68114a-operator-scripts\") pod \"0abecce5-baef-4eca-b5de-b2152e68114a\" (UID: \"0abecce5-baef-4eca-b5de-b2152e68114a\") " Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.166043 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqkjx\" (UniqueName: \"kubernetes.io/projected/0abecce5-baef-4eca-b5de-b2152e68114a-kube-api-access-sqkjx\") pod \"0abecce5-baef-4eca-b5de-b2152e68114a\" (UID: \"0abecce5-baef-4eca-b5de-b2152e68114a\") " Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.166558 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0abecce5-baef-4eca-b5de-b2152e68114a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0abecce5-baef-4eca-b5de-b2152e68114a" (UID: "0abecce5-baef-4eca-b5de-b2152e68114a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.166971 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0abecce5-baef-4eca-b5de-b2152e68114a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.462622 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0abecce5-baef-4eca-b5de-b2152e68114a-kube-api-access-sqkjx" (OuterVolumeSpecName: "kube-api-access-sqkjx") pod "0abecce5-baef-4eca-b5de-b2152e68114a" (UID: "0abecce5-baef-4eca-b5de-b2152e68114a"). InnerVolumeSpecName "kube-api-access-sqkjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.475093 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqkjx\" (UniqueName: \"kubernetes.io/projected/0abecce5-baef-4eca-b5de-b2152e68114a-kube-api-access-sqkjx\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.640175 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-z49xb" event={"ID":"0abecce5-baef-4eca-b5de-b2152e68114a","Type":"ContainerDied","Data":"0d8413121424636bacd223061f14740a9f5c8479129983276b91588c45d601d7"} Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.640536 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d8413121424636bacd223061f14740a9f5c8479129983276b91588c45d601d7" Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.640474 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-z49xb" Feb 19 23:14:48 crc kubenswrapper[4771]: I0219 23:14:48.640423 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hnk4q" podUID="375555d2-d8a8-4968-aaff-979d372e4862" containerName="registry-server" containerID="cri-o://fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3" gracePeriod=2 Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.097070 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.227288 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.290176 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lft5f\" (UniqueName: \"kubernetes.io/projected/bd353262-ccb6-422b-8f66-19091f7f34dd-kube-api-access-lft5f\") pod \"bd353262-ccb6-422b-8f66-19091f7f34dd\" (UID: \"bd353262-ccb6-422b-8f66-19091f7f34dd\") " Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.290691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd353262-ccb6-422b-8f66-19091f7f34dd-operator-scripts\") pod \"bd353262-ccb6-422b-8f66-19091f7f34dd\" (UID: \"bd353262-ccb6-422b-8f66-19091f7f34dd\") " Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.291427 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd353262-ccb6-422b-8f66-19091f7f34dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd353262-ccb6-422b-8f66-19091f7f34dd" (UID: "bd353262-ccb6-422b-8f66-19091f7f34dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.291780 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd353262-ccb6-422b-8f66-19091f7f34dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.295570 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd353262-ccb6-422b-8f66-19091f7f34dd-kube-api-access-lft5f" (OuterVolumeSpecName: "kube-api-access-lft5f") pod "bd353262-ccb6-422b-8f66-19091f7f34dd" (UID: "bd353262-ccb6-422b-8f66-19091f7f34dd"). InnerVolumeSpecName "kube-api-access-lft5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.392945 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-utilities\") pod \"375555d2-d8a8-4968-aaff-979d372e4862\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.393259 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqjvm\" (UniqueName: \"kubernetes.io/projected/375555d2-d8a8-4968-aaff-979d372e4862-kube-api-access-qqjvm\") pod \"375555d2-d8a8-4968-aaff-979d372e4862\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.393291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-catalog-content\") pod \"375555d2-d8a8-4968-aaff-979d372e4862\" (UID: \"375555d2-d8a8-4968-aaff-979d372e4862\") " Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.394097 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-utilities" (OuterVolumeSpecName: "utilities") pod "375555d2-d8a8-4968-aaff-979d372e4862" (UID: "375555d2-d8a8-4968-aaff-979d372e4862"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.397612 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375555d2-d8a8-4968-aaff-979d372e4862-kube-api-access-qqjvm" (OuterVolumeSpecName: "kube-api-access-qqjvm") pod "375555d2-d8a8-4968-aaff-979d372e4862" (UID: "375555d2-d8a8-4968-aaff-979d372e4862"). InnerVolumeSpecName "kube-api-access-qqjvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.399537 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqjvm\" (UniqueName: \"kubernetes.io/projected/375555d2-d8a8-4968-aaff-979d372e4862-kube-api-access-qqjvm\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.399609 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lft5f\" (UniqueName: \"kubernetes.io/projected/bd353262-ccb6-422b-8f66-19091f7f34dd-kube-api-access-lft5f\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.399624 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.414757 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "375555d2-d8a8-4968-aaff-979d372e4862" (UID: "375555d2-d8a8-4968-aaff-979d372e4862"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.505266 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/375555d2-d8a8-4968-aaff-979d372e4862-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.654814 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"70d28726-108a-4624-9dfb-8300d74cf9e0","Type":"ContainerStarted","Data":"d4e4c3777389b4c7a80fd23374400860bce5883478a3afb3a47820145d5949b4"} Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.659438 4771 generic.go:334] "Generic (PLEG): container finished" podID="375555d2-d8a8-4968-aaff-979d372e4862" containerID="fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3" exitCode=0 Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.659531 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnk4q" event={"ID":"375555d2-d8a8-4968-aaff-979d372e4862","Type":"ContainerDied","Data":"fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3"} Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.659572 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnk4q" event={"ID":"375555d2-d8a8-4968-aaff-979d372e4862","Type":"ContainerDied","Data":"0fd913b1ddc6e0852ba3cafd0a22f124c880c374580dffc77b6d901acf7db26b"} Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.659582 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnk4q" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.659600 4771 scope.go:117] "RemoveContainer" containerID="fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.663102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-dd81-account-create-update-wvzrn" event={"ID":"bd353262-ccb6-422b-8f66-19091f7f34dd","Type":"ContainerDied","Data":"940ed6c2963281caa76f1986c34e135a39c923867ee205f441fe26e054d60145"} Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.663135 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="940ed6c2963281caa76f1986c34e135a39c923867ee205f441fe26e054d60145" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.663200 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-dd81-account-create-update-wvzrn" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.751195 4771 scope.go:117] "RemoveContainer" containerID="d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.946928 4771 scope.go:117] "RemoveContainer" containerID="d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.952686 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnk4q"] Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.979006 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnk4q"] Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.979186 4771 scope.go:117] "RemoveContainer" containerID="fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3" Feb 19 23:14:49 crc kubenswrapper[4771]: E0219 23:14:49.979652 4771 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3\": container with ID starting with fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3 not found: ID does not exist" containerID="fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.979679 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3"} err="failed to get container status \"fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3\": rpc error: code = NotFound desc = could not find container \"fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3\": container with ID starting with fbf53cae25114676850bdaf06679fe66eea61ad190131ac764690f5649de56c3 not found: ID does not exist" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.979699 4771 scope.go:117] "RemoveContainer" containerID="d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613" Feb 19 23:14:49 crc kubenswrapper[4771]: E0219 23:14:49.980071 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613\": container with ID starting with d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613 not found: ID does not exist" containerID="d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.980117 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613"} err="failed to get container status \"d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613\": rpc error: code = NotFound desc = could not find container 
\"d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613\": container with ID starting with d50f20166cee5a99abf854e0968ad720f7b7f4fed123b086837290f48897d613 not found: ID does not exist" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.980151 4771 scope.go:117] "RemoveContainer" containerID="d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b" Feb 19 23:14:49 crc kubenswrapper[4771]: E0219 23:14:49.980512 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b\": container with ID starting with d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b not found: ID does not exist" containerID="d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b" Feb 19 23:14:49 crc kubenswrapper[4771]: I0219 23:14:49.980550 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b"} err="failed to get container status \"d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b\": rpc error: code = NotFound desc = could not find container \"d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b\": container with ID starting with d29cb9f5bae67955f555daea493742a6b170c5263462aa88061d13da8473c31b not found: ID does not exist" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.457797 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375555d2-d8a8-4968-aaff-979d372e4862" path="/var/lib/kubelet/pods/375555d2-d8a8-4968-aaff-979d372e4862/volumes" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.724330 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-bkmmk"] Feb 19 23:14:50 crc kubenswrapper[4771]: E0219 23:14:50.724781 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="375555d2-d8a8-4968-aaff-979d372e4862" containerName="registry-server" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.724796 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="375555d2-d8a8-4968-aaff-979d372e4862" containerName="registry-server" Feb 19 23:14:50 crc kubenswrapper[4771]: E0219 23:14:50.724829 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375555d2-d8a8-4968-aaff-979d372e4862" containerName="extract-utilities" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.724838 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="375555d2-d8a8-4968-aaff-979d372e4862" containerName="extract-utilities" Feb 19 23:14:50 crc kubenswrapper[4771]: E0219 23:14:50.724854 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375555d2-d8a8-4968-aaff-979d372e4862" containerName="extract-content" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.724864 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="375555d2-d8a8-4968-aaff-979d372e4862" containerName="extract-content" Feb 19 23:14:50 crc kubenswrapper[4771]: E0219 23:14:50.724884 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abecce5-baef-4eca-b5de-b2152e68114a" containerName="mariadb-database-create" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.724893 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abecce5-baef-4eca-b5de-b2152e68114a" containerName="mariadb-database-create" Feb 19 23:14:50 crc kubenswrapper[4771]: E0219 23:14:50.724907 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd353262-ccb6-422b-8f66-19091f7f34dd" containerName="mariadb-account-create-update" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.724915 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd353262-ccb6-422b-8f66-19091f7f34dd" containerName="mariadb-account-create-update" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.725169 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0abecce5-baef-4eca-b5de-b2152e68114a" containerName="mariadb-database-create" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.725197 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd353262-ccb6-422b-8f66-19091f7f34dd" containerName="mariadb-account-create-update" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.725219 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="375555d2-d8a8-4968-aaff-979d372e4862" containerName="registry-server" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.726079 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.730118 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.731774 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.732177 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.732470 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-l9xvc" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.744188 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bkmmk"] Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.838249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clxhw\" (UniqueName: \"kubernetes.io/projected/582bce55-de44-45ad-a382-bf64e80dffb9-kube-api-access-clxhw\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.838899 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-scripts\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.838989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-config-data\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.839083 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-combined-ca-bundle\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.940776 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clxhw\" (UniqueName: \"kubernetes.io/projected/582bce55-de44-45ad-a382-bf64e80dffb9-kube-api-access-clxhw\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.940903 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-scripts\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.940931 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-config-data\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.940959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-combined-ca-bundle\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.946237 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-combined-ca-bundle\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.946383 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-scripts\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.947523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-config-data\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:50 crc kubenswrapper[4771]: I0219 23:14:50.955224 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clxhw\" (UniqueName: \"kubernetes.io/projected/582bce55-de44-45ad-a382-bf64e80dffb9-kube-api-access-clxhw\") pod \"aodh-db-sync-bkmmk\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " pod="openstack/aodh-db-sync-bkmmk" 
Feb 19 23:14:51 crc kubenswrapper[4771]: I0219 23:14:51.067437 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:14:51 crc kubenswrapper[4771]: I0219 23:14:51.570304 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bkmmk"] Feb 19 23:14:51 crc kubenswrapper[4771]: I0219 23:14:51.685661 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bkmmk" event={"ID":"582bce55-de44-45ad-a382-bf64e80dffb9","Type":"ContainerStarted","Data":"6ac4443b657bc30837d4b260410b7474246eb7b2226b9407c45f54d3e829c8e4"} Feb 19 23:14:55 crc kubenswrapper[4771]: I0219 23:14:55.734378 4771 generic.go:334] "Generic (PLEG): container finished" podID="70d28726-108a-4624-9dfb-8300d74cf9e0" containerID="d4e4c3777389b4c7a80fd23374400860bce5883478a3afb3a47820145d5949b4" exitCode=0 Feb 19 23:14:55 crc kubenswrapper[4771]: I0219 23:14:55.734647 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"70d28726-108a-4624-9dfb-8300d74cf9e0","Type":"ContainerDied","Data":"d4e4c3777389b4c7a80fd23374400860bce5883478a3afb3a47820145d5949b4"} Feb 19 23:14:56 crc kubenswrapper[4771]: I0219 23:14:56.747508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bkmmk" event={"ID":"582bce55-de44-45ad-a382-bf64e80dffb9","Type":"ContainerStarted","Data":"134a0efcc84cbb7d21b6c4ca38cc3d96ff8995dfe7fad26c49354611ebaac7c5"} Feb 19 23:14:56 crc kubenswrapper[4771]: I0219 23:14:56.764780 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"70d28726-108a-4624-9dfb-8300d74cf9e0","Type":"ContainerStarted","Data":"8d6fbd8ae6bd931ae2be6da6cedc3e764ffe4d247bb152076ab56e8452ef0ac1"} Feb 19 23:14:56 crc kubenswrapper[4771]: I0219 23:14:56.776907 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-bkmmk" 
podStartSLOduration=2.655427898 podStartE2EDuration="6.776880241s" podCreationTimestamp="2026-02-19 23:14:50 +0000 UTC" firstStartedPulling="2026-02-19 23:14:51.568842101 +0000 UTC m=+6391.840284571" lastFinishedPulling="2026-02-19 23:14:55.690294444 +0000 UTC m=+6395.961736914" observedRunningTime="2026-02-19 23:14:56.767466031 +0000 UTC m=+6397.038908521" watchObservedRunningTime="2026-02-19 23:14:56.776880241 +0000 UTC m=+6397.048322711" Feb 19 23:14:58 crc kubenswrapper[4771]: I0219 23:14:58.800544 4771 generic.go:334] "Generic (PLEG): container finished" podID="582bce55-de44-45ad-a382-bf64e80dffb9" containerID="134a0efcc84cbb7d21b6c4ca38cc3d96ff8995dfe7fad26c49354611ebaac7c5" exitCode=0 Feb 19 23:14:58 crc kubenswrapper[4771]: I0219 23:14:58.800587 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bkmmk" event={"ID":"582bce55-de44-45ad-a382-bf64e80dffb9","Type":"ContainerDied","Data":"134a0efcc84cbb7d21b6c4ca38cc3d96ff8995dfe7fad26c49354611ebaac7c5"} Feb 19 23:14:59 crc kubenswrapper[4771]: I0219 23:14:59.072967 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-cjzcm"] Feb 19 23:14:59 crc kubenswrapper[4771]: I0219 23:14:59.086977 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-cjzcm"] Feb 19 23:14:59 crc kubenswrapper[4771]: I0219 23:14:59.097996 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-52b9-account-create-update-622pv"] Feb 19 23:14:59 crc kubenswrapper[4771]: I0219 23:14:59.108277 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-52b9-account-create-update-622pv"] Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.154899 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq"] Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.156763 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.161647 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.161873 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.164641 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq"] Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.237151 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.256322 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcslf\" (UniqueName: \"kubernetes.io/projected/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-kube-api-access-vcslf\") pod \"collect-profiles-29525715-rc6cq\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.256795 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-config-volume\") pod \"collect-profiles-29525715-rc6cq\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.257552 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-secret-volume\") pod \"collect-profiles-29525715-rc6cq\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.359182 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clxhw\" (UniqueName: \"kubernetes.io/projected/582bce55-de44-45ad-a382-bf64e80dffb9-kube-api-access-clxhw\") pod \"582bce55-de44-45ad-a382-bf64e80dffb9\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.359284 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-config-data\") pod \"582bce55-de44-45ad-a382-bf64e80dffb9\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.359403 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-scripts\") pod \"582bce55-de44-45ad-a382-bf64e80dffb9\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.359428 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-combined-ca-bundle\") pod \"582bce55-de44-45ad-a382-bf64e80dffb9\" (UID: \"582bce55-de44-45ad-a382-bf64e80dffb9\") " Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.359768 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcslf\" (UniqueName: \"kubernetes.io/projected/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-kube-api-access-vcslf\") pod \"collect-profiles-29525715-rc6cq\" (UID: 
\"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.359799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-config-volume\") pod \"collect-profiles-29525715-rc6cq\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.360826 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-config-volume\") pod \"collect-profiles-29525715-rc6cq\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.360939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-secret-volume\") pod \"collect-profiles-29525715-rc6cq\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.365864 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-scripts" (OuterVolumeSpecName: "scripts") pod "582bce55-de44-45ad-a382-bf64e80dffb9" (UID: "582bce55-de44-45ad-a382-bf64e80dffb9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.366994 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-secret-volume\") pod \"collect-profiles-29525715-rc6cq\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.370867 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582bce55-de44-45ad-a382-bf64e80dffb9-kube-api-access-clxhw" (OuterVolumeSpecName: "kube-api-access-clxhw") pod "582bce55-de44-45ad-a382-bf64e80dffb9" (UID: "582bce55-de44-45ad-a382-bf64e80dffb9"). InnerVolumeSpecName "kube-api-access-clxhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.380826 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcslf\" (UniqueName: \"kubernetes.io/projected/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-kube-api-access-vcslf\") pod \"collect-profiles-29525715-rc6cq\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.400339 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-config-data" (OuterVolumeSpecName: "config-data") pod "582bce55-de44-45ad-a382-bf64e80dffb9" (UID: "582bce55-de44-45ad-a382-bf64e80dffb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.402995 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "582bce55-de44-45ad-a382-bf64e80dffb9" (UID: "582bce55-de44-45ad-a382-bf64e80dffb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.453430 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f7ed81-dd1c-497b-b080-6cc1bef19fca" path="/var/lib/kubelet/pods/01f7ed81-dd1c-497b-b080-6cc1bef19fca/volumes" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.454477 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd72920f-34be-4cde-becc-7d1b0e5a7daf" path="/var/lib/kubelet/pods/dd72920f-34be-4cde-becc-7d1b0e5a7daf/volumes" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.463870 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clxhw\" (UniqueName: \"kubernetes.io/projected/582bce55-de44-45ad-a382-bf64e80dffb9-kube-api-access-clxhw\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.463934 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.463962 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.463986 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/582bce55-de44-45ad-a382-bf64e80dffb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.556963 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.819779 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-bkmmk" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.819809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bkmmk" event={"ID":"582bce55-de44-45ad-a382-bf64e80dffb9","Type":"ContainerDied","Data":"6ac4443b657bc30837d4b260410b7474246eb7b2226b9407c45f54d3e829c8e4"} Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.820212 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ac4443b657bc30837d4b260410b7474246eb7b2226b9407c45f54d3e829c8e4" Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.824910 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"70d28726-108a-4624-9dfb-8300d74cf9e0","Type":"ContainerStarted","Data":"6ca3697db3eeea2d3bd1c60ee002ab6c04525c112b078421e280635288dc74fb"} Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.824941 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"70d28726-108a-4624-9dfb-8300d74cf9e0","Type":"ContainerStarted","Data":"b27bea48b03ff2752a4ba39de3b9b40126e32e75572a84307886ed47896307c3"} Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.869212 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.869193048 podStartE2EDuration="16.869193048s" podCreationTimestamp="2026-02-19 23:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:15:00.858349048 +0000 UTC m=+6401.129791558" watchObservedRunningTime="2026-02-19 23:15:00.869193048 +0000 UTC m=+6401.140635518" Feb 19 23:15:00 crc kubenswrapper[4771]: W0219 23:15:00.886954 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a4fd33_3aaf_4ae3_9fe1_8a020bfef7ba.slice/crio-88bc1a73ee768ef260f3d23cee6292b993bf6781b694bce859544ac888b02a48 WatchSource:0}: Error finding container 88bc1a73ee768ef260f3d23cee6292b993bf6781b694bce859544ac888b02a48: Status 404 returned error can't find the container with id 88bc1a73ee768ef260f3d23cee6292b993bf6781b694bce859544ac888b02a48 Feb 19 23:15:00 crc kubenswrapper[4771]: I0219 23:15:00.889488 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq"] Feb 19 23:15:01 crc kubenswrapper[4771]: I0219 23:15:01.837775 4771 generic.go:334] "Generic (PLEG): container finished" podID="00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba" containerID="abaca83184754c69711f5832c292163c8d6f2070dd9563df084f01ba1189de58" exitCode=0 Feb 19 23:15:01 crc kubenswrapper[4771]: I0219 23:15:01.840290 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" event={"ID":"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba","Type":"ContainerDied","Data":"abaca83184754c69711f5832c292163c8d6f2070dd9563df084f01ba1189de58"} Feb 19 23:15:01 crc kubenswrapper[4771]: I0219 23:15:01.840328 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" event={"ID":"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba","Type":"ContainerStarted","Data":"88bc1a73ee768ef260f3d23cee6292b993bf6781b694bce859544ac888b02a48"} Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.306626 4771 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.455065 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-config-volume\") pod \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.455297 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcslf\" (UniqueName: \"kubernetes.io/projected/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-kube-api-access-vcslf\") pod \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.455338 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-secret-volume\") pod \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\" (UID: \"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba\") " Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.456350 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba" (UID: "00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.461577 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba" (UID: "00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.462508 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-kube-api-access-vcslf" (OuterVolumeSpecName: "kube-api-access-vcslf") pod "00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba" (UID: "00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba"). InnerVolumeSpecName "kube-api-access-vcslf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.558348 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.558391 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcslf\" (UniqueName: \"kubernetes.io/projected/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-kube-api-access-vcslf\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.558411 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.846879 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.865540 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" event={"ID":"00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba","Type":"ContainerDied","Data":"88bc1a73ee768ef260f3d23cee6292b993bf6781b694bce859544ac888b02a48"} Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.865587 4771 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="88bc1a73ee768ef260f3d23cee6292b993bf6781b694bce859544ac888b02a48" Feb 19 23:15:03 crc kubenswrapper[4771]: I0219 23:15:03.865652 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq" Feb 19 23:15:04 crc kubenswrapper[4771]: I0219 23:15:04.412145 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"] Feb 19 23:15:04 crc kubenswrapper[4771]: I0219 23:15:04.422282 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-628zc"] Feb 19 23:15:04 crc kubenswrapper[4771]: I0219 23:15:04.458190 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feca65d4-e1dc-4ff9-9f72-870368c444da" path="/var/lib/kubelet/pods/feca65d4-e1dc-4ff9-9f72-870368c444da/volumes" Feb 19 23:15:04 crc kubenswrapper[4771]: I0219 23:15:04.925369 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.904213 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 19 23:15:05 crc kubenswrapper[4771]: E0219 23:15:05.904982 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582bce55-de44-45ad-a382-bf64e80dffb9" containerName="aodh-db-sync" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.904999 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="582bce55-de44-45ad-a382-bf64e80dffb9" containerName="aodh-db-sync" Feb 19 23:15:05 crc kubenswrapper[4771]: E0219 23:15:05.916571 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba" containerName="collect-profiles" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.916601 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba" containerName="collect-profiles" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.916927 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="582bce55-de44-45ad-a382-bf64e80dffb9" containerName="aodh-db-sync" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.916940 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba" containerName="collect-profiles" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.919937 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.920033 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.925244 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.925430 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-l9xvc" Feb 19 23:15:05 crc kubenswrapper[4771]: I0219 23:15:05.925590 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.020149 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-scripts\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.020339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-config-data\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 
23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.020449 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbjst\" (UniqueName: \"kubernetes.io/projected/99e011a6-6f92-4d4e-85de-a7e6252e3e30-kube-api-access-lbjst\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.020643 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-combined-ca-bundle\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.037692 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-s2xmq"] Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.051625 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-s2xmq"] Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.122748 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-scripts\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.122829 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-config-data\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.122871 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbjst\" (UniqueName: \"kubernetes.io/projected/99e011a6-6f92-4d4e-85de-a7e6252e3e30-kube-api-access-lbjst\") pod 
\"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.122919 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-combined-ca-bundle\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.128257 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-scripts\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.131224 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-config-data\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.134558 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-combined-ca-bundle\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.153588 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbjst\" (UniqueName: \"kubernetes.io/projected/99e011a6-6f92-4d4e-85de-a7e6252e3e30-kube-api-access-lbjst\") pod \"aodh-0\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.266215 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.449491 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86cab8a-89d1-4124-8601-450fff70eeaf" path="/var/lib/kubelet/pods/f86cab8a-89d1-4124-8601-450fff70eeaf/volumes" Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.728193 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 19 23:15:06 crc kubenswrapper[4771]: I0219 23:15:06.898517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerStarted","Data":"a8733d18bfeb594ebabf48cd9ef5ff42fb5758b4a70f4f895bda85ff9463ba47"} Feb 19 23:15:07 crc kubenswrapper[4771]: I0219 23:15:07.908186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerStarted","Data":"cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce"} Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.313407 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.314100 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="ceilometer-central-agent" containerID="cri-o://db16fdf7b031bfe8805ef313014646a977198661c66f25c462d287a5d5fbf064" gracePeriod=30 Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.314305 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="proxy-httpd" containerID="cri-o://6f375f55917a4c6010a4a01b88605c49c88300e815878b404b21e8b3cce1030e" gracePeriod=30 Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.314373 4771 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ceilometer-0" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="ceilometer-notification-agent" containerID="cri-o://b3613a957f67a39240971d786a3be85f30d14c5c3db7b7133d5d98c2c7de3549" gracePeriod=30 Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.314444 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="sg-core" containerID="cri-o://111ab8bb94ccbbb4862754acfe668de288d22e0308afa73e3ff6719cc9941de8" gracePeriod=30 Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.924822 4771 generic.go:334] "Generic (PLEG): container finished" podID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerID="6f375f55917a4c6010a4a01b88605c49c88300e815878b404b21e8b3cce1030e" exitCode=0 Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.925136 4771 generic.go:334] "Generic (PLEG): container finished" podID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerID="111ab8bb94ccbbb4862754acfe668de288d22e0308afa73e3ff6719cc9941de8" exitCode=2 Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.925148 4771 generic.go:334] "Generic (PLEG): container finished" podID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerID="db16fdf7b031bfe8805ef313014646a977198661c66f25c462d287a5d5fbf064" exitCode=0 Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.924876 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerDied","Data":"6f375f55917a4c6010a4a01b88605c49c88300e815878b404b21e8b3cce1030e"} Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.925187 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerDied","Data":"111ab8bb94ccbbb4862754acfe668de288d22e0308afa73e3ff6719cc9941de8"} Feb 19 23:15:08 crc kubenswrapper[4771]: I0219 23:15:08.925206 
4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerDied","Data":"db16fdf7b031bfe8805ef313014646a977198661c66f25c462d287a5d5fbf064"} Feb 19 23:15:09 crc kubenswrapper[4771]: I0219 23:15:09.946628 4771 generic.go:334] "Generic (PLEG): container finished" podID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerID="b3613a957f67a39240971d786a3be85f30d14c5c3db7b7133d5d98c2c7de3549" exitCode=0 Feb 19 23:15:09 crc kubenswrapper[4771]: I0219 23:15:09.947299 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerDied","Data":"b3613a957f67a39240971d786a3be85f30d14c5c3db7b7133d5d98c2c7de3549"} Feb 19 23:15:09 crc kubenswrapper[4771]: I0219 23:15:09.949475 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerStarted","Data":"b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d"} Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.123244 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.206331 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhxpt\" (UniqueName: \"kubernetes.io/projected/cd66242b-d76e-4210-868b-97a78a3c93a3-kube-api-access-qhxpt\") pod \"cd66242b-d76e-4210-868b-97a78a3c93a3\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.206404 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-log-httpd\") pod \"cd66242b-d76e-4210-868b-97a78a3c93a3\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.206456 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-scripts\") pod \"cd66242b-d76e-4210-868b-97a78a3c93a3\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.206514 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-sg-core-conf-yaml\") pod \"cd66242b-d76e-4210-868b-97a78a3c93a3\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.206603 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-combined-ca-bundle\") pod \"cd66242b-d76e-4210-868b-97a78a3c93a3\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.206650 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-config-data\") pod \"cd66242b-d76e-4210-868b-97a78a3c93a3\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.206736 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-run-httpd\") pod \"cd66242b-d76e-4210-868b-97a78a3c93a3\" (UID: \"cd66242b-d76e-4210-868b-97a78a3c93a3\") " Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.209325 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cd66242b-d76e-4210-868b-97a78a3c93a3" (UID: "cd66242b-d76e-4210-868b-97a78a3c93a3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.209511 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cd66242b-d76e-4210-868b-97a78a3c93a3" (UID: "cd66242b-d76e-4210-868b-97a78a3c93a3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.216820 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-scripts" (OuterVolumeSpecName: "scripts") pod "cd66242b-d76e-4210-868b-97a78a3c93a3" (UID: "cd66242b-d76e-4210-868b-97a78a3c93a3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.230531 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd66242b-d76e-4210-868b-97a78a3c93a3-kube-api-access-qhxpt" (OuterVolumeSpecName: "kube-api-access-qhxpt") pod "cd66242b-d76e-4210-868b-97a78a3c93a3" (UID: "cd66242b-d76e-4210-868b-97a78a3c93a3"). InnerVolumeSpecName "kube-api-access-qhxpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.259198 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cd66242b-d76e-4210-868b-97a78a3c93a3" (UID: "cd66242b-d76e-4210-868b-97a78a3c93a3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.310861 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.310980 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.310994 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.311007 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd66242b-d76e-4210-868b-97a78a3c93a3-run-httpd\") on node \"crc\" DevicePath \"\"" 
Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.311033 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhxpt\" (UniqueName: \"kubernetes.io/projected/cd66242b-d76e-4210-868b-97a78a3c93a3-kube-api-access-qhxpt\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.366176 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-config-data" (OuterVolumeSpecName: "config-data") pod "cd66242b-d76e-4210-868b-97a78a3c93a3" (UID: "cd66242b-d76e-4210-868b-97a78a3c93a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.372779 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd66242b-d76e-4210-868b-97a78a3c93a3" (UID: "cd66242b-d76e-4210-868b-97a78a3c93a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.412905 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.412938 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd66242b-d76e-4210-868b-97a78a3c93a3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.963669 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd66242b-d76e-4210-868b-97a78a3c93a3","Type":"ContainerDied","Data":"01dc72a56d477dbdccbda97f0887a6c37c3b2db6691fb4346d8298702a2b2d68"} Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.963946 4771 scope.go:117] "RemoveContainer" containerID="6f375f55917a4c6010a4a01b88605c49c88300e815878b404b21e8b3cce1030e" Feb 19 23:15:10 crc kubenswrapper[4771]: I0219 23:15:10.964097 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.000492 4771 scope.go:117] "RemoveContainer" containerID="111ab8bb94ccbbb4862754acfe668de288d22e0308afa73e3ff6719cc9941de8" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.014082 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.024250 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.038905 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:15:11 crc kubenswrapper[4771]: E0219 23:15:11.039360 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="ceilometer-central-agent" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.039371 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="ceilometer-central-agent" Feb 19 23:15:11 crc kubenswrapper[4771]: E0219 23:15:11.039413 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="ceilometer-notification-agent" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.039419 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="ceilometer-notification-agent" Feb 19 23:15:11 crc kubenswrapper[4771]: E0219 23:15:11.039436 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="sg-core" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.039441 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="sg-core" Feb 19 23:15:11 crc kubenswrapper[4771]: E0219 23:15:11.039451 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="proxy-httpd" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.039456 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="proxy-httpd" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.039643 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="proxy-httpd" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.039657 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="ceilometer-central-agent" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.039668 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="sg-core" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.039678 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" containerName="ceilometer-notification-agent" Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.044647 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.047805 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.048106 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.074209 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.081830 4771 scope.go:117] "RemoveContainer" containerID="b3613a957f67a39240971d786a3be85f30d14c5c3db7b7133d5d98c2c7de3549"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.105873 4771 scope.go:117] "RemoveContainer" containerID="db16fdf7b031bfe8805ef313014646a977198661c66f25c462d287a5d5fbf064"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.129946 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-config-data\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.129993 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.130080 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-log-httpd\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.130117 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq4cp\" (UniqueName: \"kubernetes.io/projected/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-kube-api-access-bq4cp\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.130158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.130215 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-run-httpd\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.130252 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-scripts\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.232424 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.232516 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-log-httpd\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.232556 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq4cp\" (UniqueName: \"kubernetes.io/projected/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-kube-api-access-bq4cp\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.232600 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.232663 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-run-httpd\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.232708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-scripts\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.232742 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-config-data\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.237298 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-run-httpd\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.238766 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.241596 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-scripts\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.242013 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-config-data\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.258428 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-log-httpd\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.264571 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.274929 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq4cp\" (UniqueName: \"kubernetes.io/projected/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-kube-api-access-bq4cp\") pod \"ceilometer-0\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") " pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.364766 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.565933 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Feb 19 23:15:11 crc kubenswrapper[4771]: I0219 23:15:11.977500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerStarted","Data":"38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67"}
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.101090 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:12 crc kubenswrapper[4771]: W0219 23:15:12.103470 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d7efbf_5a29_42b3_9a8b_882d83ba2ccd.slice/crio-f43ae2f2fe5cd9d3482df04a9b215a4b5764932d7b77854767f5d07ec7f1620a WatchSource:0}: Error finding container f43ae2f2fe5cd9d3482df04a9b215a4b5764932d7b77854767f5d07ec7f1620a: Status 404 returned error can't find the container with id f43ae2f2fe5cd9d3482df04a9b215a4b5764932d7b77854767f5d07ec7f1620a
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.449818 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd66242b-d76e-4210-868b-97a78a3c93a3" path="/var/lib/kubelet/pods/cd66242b-d76e-4210-868b-97a78a3c93a3/volumes"
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.992344 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerStarted","Data":"5314fe2ac7b7e5c6755edac5750a9be4a9ba30f60a7403e8297509575944fcfd"}
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.992604 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerStarted","Data":"f43ae2f2fe5cd9d3482df04a9b215a4b5764932d7b77854767f5d07ec7f1620a"}
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.994840 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.995315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerStarted","Data":"3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed"}
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.995200 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-notifier" containerID="cri-o://38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67" gracePeriod=30
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.995237 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-evaluator" containerID="cri-o://b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d" gracePeriod=30
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.995217 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-listener" containerID="cri-o://3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed" gracePeriod=30
Feb 19 23:15:12 crc kubenswrapper[4771]: I0219 23:15:12.995107 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-api" containerID="cri-o://cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce" gracePeriod=30
Feb 19 23:15:13 crc kubenswrapper[4771]: I0219 23:15:13.026738 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.20786319 podStartE2EDuration="8.026723394s" podCreationTimestamp="2026-02-19 23:15:05 +0000 UTC" firstStartedPulling="2026-02-19 23:15:06.732804955 +0000 UTC m=+6407.004247415" lastFinishedPulling="2026-02-19 23:15:12.551665149 +0000 UTC m=+6412.823107619" observedRunningTime="2026-02-19 23:15:13.021209707 +0000 UTC m=+6413.292652187" watchObservedRunningTime="2026-02-19 23:15:13.026723394 +0000 UTC m=+6413.298165864"
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.034540 4771 generic.go:334] "Generic (PLEG): container finished" podID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerID="38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67" exitCode=0
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.035169 4771 generic.go:334] "Generic (PLEG): container finished" podID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerID="b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d" exitCode=0
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.035184 4771 generic.go:334] "Generic (PLEG): container finished" podID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerID="cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce" exitCode=0
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.034768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerDied","Data":"38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67"}
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.035259 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerDied","Data":"b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d"}
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.035272 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerDied","Data":"cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce"}
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.038660 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerStarted","Data":"b01bfdecca3504bc78a00eee98e77bbc7c1b09243a11bd62aff5e7b6d1d3de86"}
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.925243 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Feb 19 23:15:14 crc kubenswrapper[4771]: I0219 23:15:14.933207 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Feb 19 23:15:15 crc kubenswrapper[4771]: I0219 23:15:15.050969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerStarted","Data":"aa5a294a7b6c81c982e9302c1dc32bb3f1443233650257d9361277130abd8210"}
Feb 19 23:15:15 crc kubenswrapper[4771]: I0219 23:15:15.056623 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Feb 19 23:15:16 crc kubenswrapper[4771]: I0219 23:15:16.062168 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerStarted","Data":"cb039d0aceec8b1cf017a49f3bcf443e44f7763f065f301d60b9e297ce4b458f"}
Feb 19 23:15:16 crc kubenswrapper[4771]: I0219 23:15:16.062725 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 23:15:16 crc kubenswrapper[4771]: I0219 23:15:16.062291 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="proxy-httpd" containerID="cri-o://cb039d0aceec8b1cf017a49f3bcf443e44f7763f065f301d60b9e297ce4b458f" gracePeriod=30
Feb 19 23:15:16 crc kubenswrapper[4771]: I0219 23:15:16.062258 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="ceilometer-central-agent" containerID="cri-o://5314fe2ac7b7e5c6755edac5750a9be4a9ba30f60a7403e8297509575944fcfd" gracePeriod=30
Feb 19 23:15:16 crc kubenswrapper[4771]: I0219 23:15:16.062370 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="sg-core" containerID="cri-o://aa5a294a7b6c81c982e9302c1dc32bb3f1443233650257d9361277130abd8210" gracePeriod=30
Feb 19 23:15:16 crc kubenswrapper[4771]: I0219 23:15:16.062370 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="ceilometer-notification-agent" containerID="cri-o://b01bfdecca3504bc78a00eee98e77bbc7c1b09243a11bd62aff5e7b6d1d3de86" gracePeriod=30
Feb 19 23:15:16 crc kubenswrapper[4771]: I0219 23:15:16.093334 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.033735901 podStartE2EDuration="6.093313935s" podCreationTimestamp="2026-02-19 23:15:10 +0000 UTC" firstStartedPulling="2026-02-19 23:15:12.109305697 +0000 UTC m=+6412.380748167" lastFinishedPulling="2026-02-19 23:15:15.168883731 +0000 UTC m=+6415.440326201" observedRunningTime="2026-02-19 23:15:16.08448313 +0000 UTC m=+6416.355925610" watchObservedRunningTime="2026-02-19 23:15:16.093313935 +0000 UTC m=+6416.364756405"
Feb 19 23:15:17 crc kubenswrapper[4771]: I0219 23:15:17.072926 4771 generic.go:334] "Generic (PLEG): container finished" podID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerID="cb039d0aceec8b1cf017a49f3bcf443e44f7763f065f301d60b9e297ce4b458f" exitCode=0
Feb 19 23:15:17 crc kubenswrapper[4771]: I0219 23:15:17.073284 4771 generic.go:334] "Generic (PLEG): container finished" podID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerID="aa5a294a7b6c81c982e9302c1dc32bb3f1443233650257d9361277130abd8210" exitCode=2
Feb 19 23:15:17 crc kubenswrapper[4771]: I0219 23:15:17.073294 4771 generic.go:334] "Generic (PLEG): container finished" podID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerID="b01bfdecca3504bc78a00eee98e77bbc7c1b09243a11bd62aff5e7b6d1d3de86" exitCode=0
Feb 19 23:15:17 crc kubenswrapper[4771]: I0219 23:15:17.072978 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerDied","Data":"cb039d0aceec8b1cf017a49f3bcf443e44f7763f065f301d60b9e297ce4b458f"}
Feb 19 23:15:17 crc kubenswrapper[4771]: I0219 23:15:17.073336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerDied","Data":"aa5a294a7b6c81c982e9302c1dc32bb3f1443233650257d9361277130abd8210"}
Feb 19 23:15:17 crc kubenswrapper[4771]: I0219 23:15:17.073352 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerDied","Data":"b01bfdecca3504bc78a00eee98e77bbc7c1b09243a11bd62aff5e7b6d1d3de86"}
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.127575 4771 generic.go:334] "Generic (PLEG): container finished" podID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerID="5314fe2ac7b7e5c6755edac5750a9be4a9ba30f60a7403e8297509575944fcfd" exitCode=0
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.127785 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerDied","Data":"5314fe2ac7b7e5c6755edac5750a9be4a9ba30f60a7403e8297509575944fcfd"}
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.128092 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd","Type":"ContainerDied","Data":"f43ae2f2fe5cd9d3482df04a9b215a4b5764932d7b77854767f5d07ec7f1620a"}
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.128108 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f43ae2f2fe5cd9d3482df04a9b215a4b5764932d7b77854767f5d07ec7f1620a"
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.179054 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262072 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-config-data\") pod \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") "
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262152 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-run-httpd\") pod \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") "
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262193 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-log-httpd\") pod \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") "
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262258 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-sg-core-conf-yaml\") pod \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") "
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262287 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-combined-ca-bundle\") pod \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") "
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262306 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-scripts\") pod \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") "
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262355 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq4cp\" (UniqueName: \"kubernetes.io/projected/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-kube-api-access-bq4cp\") pod \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\" (UID: \"51d7efbf-5a29-42b3-9a8b-882d83ba2ccd\") "
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262590 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" (UID: "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.262823 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.263109 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" (UID: "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.268303 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-kube-api-access-bq4cp" (OuterVolumeSpecName: "kube-api-access-bq4cp") pod "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" (UID: "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd"). InnerVolumeSpecName "kube-api-access-bq4cp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.271088 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-scripts" (OuterVolumeSpecName: "scripts") pod "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" (UID: "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.296971 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" (UID: "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.364737 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.364778 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.364793 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.364805 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq4cp\" (UniqueName: \"kubernetes.io/projected/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-kube-api-access-bq4cp\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.369481 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" (UID: "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.400904 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-config-data" (OuterVolumeSpecName: "config-data") pod "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" (UID: "51d7efbf-5a29-42b3-9a8b-882d83ba2ccd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.467550 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:21 crc kubenswrapper[4771]: I0219 23:15:21.467592 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.136844 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.176567 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.189709 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198262 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:22 crc kubenswrapper[4771]: E0219 23:15:22.198650 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="ceilometer-central-agent"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198667 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="ceilometer-central-agent"
Feb 19 23:15:22 crc kubenswrapper[4771]: E0219 23:15:22.198694 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="proxy-httpd"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198701 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="proxy-httpd"
Feb 19 23:15:22 crc kubenswrapper[4771]: E0219 23:15:22.198729 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="sg-core"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198735 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="sg-core"
Feb 19 23:15:22 crc kubenswrapper[4771]: E0219 23:15:22.198751 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="ceilometer-notification-agent"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198757 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="ceilometer-notification-agent"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198943 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="sg-core"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198956 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="ceilometer-central-agent"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198971 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="ceilometer-notification-agent"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.198986 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" containerName="proxy-httpd"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.200746 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.204687 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.204802 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.234410 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.286089 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-log-httpd\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.286176 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.286527 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-scripts\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.286667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnpfg\" (UniqueName: \"kubernetes.io/projected/753edc49-4dfc-458f-bf7c-f14cd632f21d-kube-api-access-cnpfg\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.286794 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.286832 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-config-data\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.286864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-run-httpd\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.388239 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.388292 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-config-data\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.388326 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-run-httpd\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.388390 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-log-httpd\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.388441 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.388540 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-scripts\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.388591 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnpfg\" (UniqueName: \"kubernetes.io/projected/753edc49-4dfc-458f-bf7c-f14cd632f21d-kube-api-access-cnpfg\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.397732 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-run-httpd\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.397781 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.397729 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-log-httpd\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.401705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-scripts\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.404233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-config-data\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.411655 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.419310 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnpfg\" (UniqueName: \"kubernetes.io/projected/753edc49-4dfc-458f-bf7c-f14cd632f21d-kube-api-access-cnpfg\") pod \"ceilometer-0\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " pod="openstack/ceilometer-0"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.455340 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d7efbf-5a29-42b3-9a8b-882d83ba2ccd" path="/var/lib/kubelet/pods/51d7efbf-5a29-42b3-9a8b-882d83ba2ccd/volumes"
Feb 19 23:15:22 crc kubenswrapper[4771]: I0219 23:15:22.518894 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 23:15:23 crc kubenswrapper[4771]: I0219 23:15:23.024063 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:23 crc kubenswrapper[4771]: I0219 23:15:23.150859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerStarted","Data":"9fb4ccc1f9306bb5a710e6ad2038d4b91f370e936089bff2f84d4e1ab5e72198"}
Feb 19 23:15:24 crc kubenswrapper[4771]: I0219 23:15:24.173396 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerStarted","Data":"74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0"}
Feb 19 23:15:25 crc kubenswrapper[4771]: I0219 23:15:25.183406 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerStarted","Data":"8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858"}
Feb 19 23:15:25 crc kubenswrapper[4771]: I0219 23:15:25.183930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerStarted","Data":"e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29"}
Feb 19 23:15:27 crc kubenswrapper[4771]: I0219 23:15:27.207247 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerStarted","Data":"e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a"} Feb 19 23:15:27 crc kubenswrapper[4771]: I0219 23:15:27.207929 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 23:15:27 crc kubenswrapper[4771]: I0219 23:15:27.226075 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.682868494 podStartE2EDuration="5.226058122s" podCreationTimestamp="2026-02-19 23:15:22 +0000 UTC" firstStartedPulling="2026-02-19 23:15:23.038883527 +0000 UTC m=+6423.310325997" lastFinishedPulling="2026-02-19 23:15:26.582073115 +0000 UTC m=+6426.853515625" observedRunningTime="2026-02-19 23:15:27.224574122 +0000 UTC m=+6427.496016632" watchObservedRunningTime="2026-02-19 23:15:27.226058122 +0000 UTC m=+6427.497500612" Feb 19 23:15:42 crc kubenswrapper[4771]: I0219 23:15:42.619600 4771 scope.go:117] "RemoveContainer" containerID="1bb8904b17a681aafb840c1f8d9a661bf01d4be0966053afd12bf437f1edda6f" Feb 19 23:15:42 crc kubenswrapper[4771]: I0219 23:15:42.678282 4771 scope.go:117] "RemoveContainer" containerID="fe40feaf58a49738ad31ae71ed2fcd99a805500e09586e56da67a6be38f8ce1b" Feb 19 23:15:42 crc kubenswrapper[4771]: I0219 23:15:42.748411 4771 scope.go:117] "RemoveContainer" containerID="848ca39b23f9ea56088e48be7a326aaa56da328d26e10d79cf68a933382f1150" Feb 19 23:15:42 crc kubenswrapper[4771]: I0219 23:15:42.793754 4771 scope.go:117] "RemoveContainer" containerID="520046f48a74454766559a14f452ac54dba46f8c86207b4f39f6045aa65712f4" Feb 19 23:15:42 crc kubenswrapper[4771]: I0219 23:15:42.848657 4771 scope.go:117] "RemoveContainer" containerID="d3940958b65503525cf815eba047883920fa14b29c7289535d08c48df941c092" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.401495 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.418287 4771 generic.go:334] "Generic (PLEG): container finished" podID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerID="3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed" exitCode=137 Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.418396 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerDied","Data":"3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed"} Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.418428 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.418451 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"99e011a6-6f92-4d4e-85de-a7e6252e3e30","Type":"ContainerDied","Data":"a8733d18bfeb594ebabf48cd9ef5ff42fb5758b4a70f4f895bda85ff9463ba47"} Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.418507 4771 scope.go:117] "RemoveContainer" containerID="3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.452496 4771 scope.go:117] "RemoveContainer" containerID="38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.476003 4771 scope.go:117] "RemoveContainer" containerID="b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.523104 4771 scope.go:117] "RemoveContainer" containerID="cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.557371 4771 scope.go:117] "RemoveContainer" containerID="3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed" Feb 19 23:15:43 crc kubenswrapper[4771]: E0219 
23:15:43.557796 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed\": container with ID starting with 3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed not found: ID does not exist" containerID="3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.557848 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed"} err="failed to get container status \"3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed\": rpc error: code = NotFound desc = could not find container \"3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed\": container with ID starting with 3fd5524a36cbf9260eda2078880daa9236631a3aa5636f7e92cb1eff2eeca8ed not found: ID does not exist" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.557881 4771 scope.go:117] "RemoveContainer" containerID="38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67" Feb 19 23:15:43 crc kubenswrapper[4771]: E0219 23:15:43.561485 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67\": container with ID starting with 38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67 not found: ID does not exist" containerID="38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.561519 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67"} err="failed to get container status \"38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67\": rpc 
error: code = NotFound desc = could not find container \"38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67\": container with ID starting with 38168b72b7abf879431ab754fb698744e34c1f55a1c0541be990d50e5a6f4e67 not found: ID does not exist" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.561540 4771 scope.go:117] "RemoveContainer" containerID="b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d" Feb 19 23:15:43 crc kubenswrapper[4771]: E0219 23:15:43.561779 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d\": container with ID starting with b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d not found: ID does not exist" containerID="b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.561814 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d"} err="failed to get container status \"b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d\": rpc error: code = NotFound desc = could not find container \"b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d\": container with ID starting with b9b72e65e6ef97baeaf6b42ade519567376f2ebb8863aed7774eecd0a5bbbc2d not found: ID does not exist" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.561856 4771 scope.go:117] "RemoveContainer" containerID="cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce" Feb 19 23:15:43 crc kubenswrapper[4771]: E0219 23:15:43.562080 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce\": container with ID starting with 
cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce not found: ID does not exist" containerID="cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.562096 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce"} err="failed to get container status \"cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce\": rpc error: code = NotFound desc = could not find container \"cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce\": container with ID starting with cdaaa72025645d4589bcb9abb0a2e5e55298da224ce265e59793e4bf0e749bce not found: ID does not exist" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.591043 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbjst\" (UniqueName: \"kubernetes.io/projected/99e011a6-6f92-4d4e-85de-a7e6252e3e30-kube-api-access-lbjst\") pod \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.591148 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-combined-ca-bundle\") pod \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.591311 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-config-data\") pod \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.591343 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-scripts\") pod \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\" (UID: \"99e011a6-6f92-4d4e-85de-a7e6252e3e30\") " Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.597291 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e011a6-6f92-4d4e-85de-a7e6252e3e30-kube-api-access-lbjst" (OuterVolumeSpecName: "kube-api-access-lbjst") pod "99e011a6-6f92-4d4e-85de-a7e6252e3e30" (UID: "99e011a6-6f92-4d4e-85de-a7e6252e3e30"). InnerVolumeSpecName "kube-api-access-lbjst". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.598396 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-scripts" (OuterVolumeSpecName: "scripts") pod "99e011a6-6f92-4d4e-85de-a7e6252e3e30" (UID: "99e011a6-6f92-4d4e-85de-a7e6252e3e30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.695033 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.695068 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbjst\" (UniqueName: \"kubernetes.io/projected/99e011a6-6f92-4d4e-85de-a7e6252e3e30-kube-api-access-lbjst\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.696825 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-config-data" (OuterVolumeSpecName: "config-data") pod "99e011a6-6f92-4d4e-85de-a7e6252e3e30" (UID: "99e011a6-6f92-4d4e-85de-a7e6252e3e30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.721503 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99e011a6-6f92-4d4e-85de-a7e6252e3e30" (UID: "99e011a6-6f92-4d4e-85de-a7e6252e3e30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.796791 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:43 crc kubenswrapper[4771]: I0219 23:15:43.796820 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99e011a6-6f92-4d4e-85de-a7e6252e3e30-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.070627 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.085933 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144030 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 19 23:15:44 crc kubenswrapper[4771]: E0219 23:15:44.144459 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-evaluator" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144477 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-evaluator" Feb 19 23:15:44 crc kubenswrapper[4771]: E0219 23:15:44.144499 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-notifier" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144505 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-notifier" Feb 19 23:15:44 crc kubenswrapper[4771]: E0219 23:15:44.144530 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-api" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144536 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-api" Feb 19 23:15:44 crc kubenswrapper[4771]: E0219 23:15:44.144548 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-listener" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144554 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-listener" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144735 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-listener" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144757 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-api" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144769 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-evaluator" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.144777 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" containerName="aodh-notifier" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.146710 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.148931 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.149097 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.149199 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.149414 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.149530 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-l9xvc" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.158041 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.310235 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-public-tls-certs\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.310284 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-config-data\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.310317 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-scripts\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.310752 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-internal-tls-certs\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.310837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.310922 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv4ls\" (UniqueName: \"kubernetes.io/projected/f65694fc-cb68-4dfd-9984-7fa66df7ba37-kube-api-access-tv4ls\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.413279 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-public-tls-certs\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.413327 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-config-data\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: 
I0219 23:15:44.413358 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-scripts\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.413484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-internal-tls-certs\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.413518 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.413557 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv4ls\" (UniqueName: \"kubernetes.io/projected/f65694fc-cb68-4dfd-9984-7fa66df7ba37-kube-api-access-tv4ls\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.429216 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.430130 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-internal-tls-certs\") pod \"aodh-0\" (UID: 
\"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.430447 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-public-tls-certs\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.431205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-config-data\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.431971 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f65694fc-cb68-4dfd-9984-7fa66df7ba37-scripts\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.437523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv4ls\" (UniqueName: \"kubernetes.io/projected/f65694fc-cb68-4dfd-9984-7fa66df7ba37-kube-api-access-tv4ls\") pod \"aodh-0\" (UID: \"f65694fc-cb68-4dfd-9984-7fa66df7ba37\") " pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.453377 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e011a6-6f92-4d4e-85de-a7e6252e3e30" path="/var/lib/kubelet/pods/99e011a6-6f92-4d4e-85de-a7e6252e3e30/volumes" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.494963 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 19 23:15:44 crc kubenswrapper[4771]: I0219 23:15:44.987014 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 19 23:15:45 crc kubenswrapper[4771]: I0219 23:15:45.454860 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f65694fc-cb68-4dfd-9984-7fa66df7ba37","Type":"ContainerStarted","Data":"616b3beb857cb5f87471df265917c222b5e3139df61022b7490b2dacafbbe9da"} Feb 19 23:15:46 crc kubenswrapper[4771]: I0219 23:15:46.465884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f65694fc-cb68-4dfd-9984-7fa66df7ba37","Type":"ContainerStarted","Data":"9a0245ffd32b6fa9fd3fe3664c71467cf575c08d86bf143f1a5747bae9907c03"} Feb 19 23:15:46 crc kubenswrapper[4771]: I0219 23:15:46.466243 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f65694fc-cb68-4dfd-9984-7fa66df7ba37","Type":"ContainerStarted","Data":"dd4074d541f3cdd06d9d30eb87c96702b2f435049a69fd4e977f7d5f6d226fc0"} Feb 19 23:15:47 crc kubenswrapper[4771]: I0219 23:15:47.481840 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f65694fc-cb68-4dfd-9984-7fa66df7ba37","Type":"ContainerStarted","Data":"fcb40887fca60cbae6feaef842369bf1b604dcf622d838c060d4812445d6cea0"} Feb 19 23:15:47 crc kubenswrapper[4771]: I0219 23:15:47.995901 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbc456f59-z9cr5"] Feb 19 23:15:47 crc kubenswrapper[4771]: I0219 23:15:47.997402 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.000753 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.016751 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbc456f59-z9cr5"] Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.112161 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-dns-svc\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.112227 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-config\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.112273 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-sb\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.112330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-openstack-cell1\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" Feb 19 
23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.112407 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2krqv\" (UniqueName: \"kubernetes.io/projected/70205606-64b4-4e9e-a3c5-c8e94cda8d20-kube-api-access-2krqv\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.112445 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-nb\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.214241 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2krqv\" (UniqueName: \"kubernetes.io/projected/70205606-64b4-4e9e-a3c5-c8e94cda8d20-kube-api-access-2krqv\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.214509 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-nb\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.214545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-dns-svc\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.214572 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-config\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.214607 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-sb\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.214655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-openstack-cell1\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.215413 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-openstack-cell1\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.216215 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-nb\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.216783 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-dns-svc\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.217345 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-config\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.217839 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-sb\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.239396 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2krqv\" (UniqueName: \"kubernetes.io/projected/70205606-64b4-4e9e-a3c5-c8e94cda8d20-kube-api-access-2krqv\") pod \"dnsmasq-dns-dbc456f59-z9cr5\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.316168 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.508397 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f65694fc-cb68-4dfd-9984-7fa66df7ba37","Type":"ContainerStarted","Data":"278518e71cd4b3e53e8c13e0d59ea553fc9619b9a11f3e93a3924c15fdbd0efa"}
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.567218 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.137458978 podStartE2EDuration="4.567200282s" podCreationTimestamp="2026-02-19 23:15:44 +0000 UTC" firstStartedPulling="2026-02-19 23:15:44.988846007 +0000 UTC m=+6445.260288477" lastFinishedPulling="2026-02-19 23:15:47.418587301 +0000 UTC m=+6447.690029781" observedRunningTime="2026-02-19 23:15:48.541225279 +0000 UTC m=+6448.812667749" watchObservedRunningTime="2026-02-19 23:15:48.567200282 +0000 UTC m=+6448.838642752"
Feb 19 23:15:48 crc kubenswrapper[4771]: I0219 23:15:48.902864 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbc456f59-z9cr5"]
Feb 19 23:15:48 crc kubenswrapper[4771]: W0219 23:15:48.915756 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70205606_64b4_4e9e_a3c5_c8e94cda8d20.slice/crio-9532479d7397d17fb02e00b5eaaf06fd1a9b760eebf37763f9400d60d5c69278 WatchSource:0}: Error finding container 9532479d7397d17fb02e00b5eaaf06fd1a9b760eebf37763f9400d60d5c69278: Status 404 returned error can't find the container with id 9532479d7397d17fb02e00b5eaaf06fd1a9b760eebf37763f9400d60d5c69278
Feb 19 23:15:49 crc kubenswrapper[4771]: I0219 23:15:49.521794 4771 generic.go:334] "Generic (PLEG): container finished" podID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" containerID="34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26" exitCode=0
Feb 19 23:15:49 crc kubenswrapper[4771]: I0219 23:15:49.521922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" event={"ID":"70205606-64b4-4e9e-a3c5-c8e94cda8d20","Type":"ContainerDied","Data":"34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26"}
Feb 19 23:15:49 crc kubenswrapper[4771]: I0219 23:15:49.522197 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" event={"ID":"70205606-64b4-4e9e-a3c5-c8e94cda8d20","Type":"ContainerStarted","Data":"9532479d7397d17fb02e00b5eaaf06fd1a9b760eebf37763f9400d60d5c69278"}
Feb 19 23:15:50 crc kubenswrapper[4771]: I0219 23:15:50.536614 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" event={"ID":"70205606-64b4-4e9e-a3c5-c8e94cda8d20","Type":"ContainerStarted","Data":"9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c"}
Feb 19 23:15:50 crc kubenswrapper[4771]: I0219 23:15:50.536891 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:50 crc kubenswrapper[4771]: I0219 23:15:50.563108 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" podStartSLOduration=3.56308886 podStartE2EDuration="3.56308886s" podCreationTimestamp="2026-02-19 23:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:15:50.561120478 +0000 UTC m=+6450.832562948" watchObservedRunningTime="2026-02-19 23:15:50.56308886 +0000 UTC m=+6450.834531330"
Feb 19 23:15:52 crc kubenswrapper[4771]: I0219 23:15:52.532543 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 19 23:15:56 crc kubenswrapper[4771]: I0219 23:15:56.584138 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:15:56 crc kubenswrapper[4771]: I0219 23:15:56.585169 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b936b940-5d5a-4b62-8cd3-acff548506db" containerName="kube-state-metrics" containerID="cri-o://5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9" gracePeriod=30
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.158311 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.244364 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbszl\" (UniqueName: \"kubernetes.io/projected/b936b940-5d5a-4b62-8cd3-acff548506db-kube-api-access-bbszl\") pod \"b936b940-5d5a-4b62-8cd3-acff548506db\" (UID: \"b936b940-5d5a-4b62-8cd3-acff548506db\") "
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.255383 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b936b940-5d5a-4b62-8cd3-acff548506db-kube-api-access-bbszl" (OuterVolumeSpecName: "kube-api-access-bbszl") pod "b936b940-5d5a-4b62-8cd3-acff548506db" (UID: "b936b940-5d5a-4b62-8cd3-acff548506db"). InnerVolumeSpecName "kube-api-access-bbszl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.346658 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbszl\" (UniqueName: \"kubernetes.io/projected/b936b940-5d5a-4b62-8cd3-acff548506db-kube-api-access-bbszl\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.621583 4771 generic.go:334] "Generic (PLEG): container finished" podID="b936b940-5d5a-4b62-8cd3-acff548506db" containerID="5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9" exitCode=2
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.621708 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.621741 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b936b940-5d5a-4b62-8cd3-acff548506db","Type":"ContainerDied","Data":"5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9"}
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.628077 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b936b940-5d5a-4b62-8cd3-acff548506db","Type":"ContainerDied","Data":"005483e24a77f8b134f2ffbfe719af5ab88671a649e4f9c8ac9c66aaa679ada8"}
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.628103 4771 scope.go:117] "RemoveContainer" containerID="5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.672090 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.681764 4771 scope.go:117] "RemoveContainer" containerID="5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.692807 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:15:57 crc kubenswrapper[4771]: E0219 23:15:57.693569 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9\": container with ID starting with 5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9 not found: ID does not exist" containerID="5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.693649 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9"} err="failed to get container status \"5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9\": rpc error: code = NotFound desc = could not find container \"5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9\": container with ID starting with 5c124e2a8f2ea44da79003b6f0776c73d28510f5a3ce55ddbf51a3eaae02fac9 not found: ID does not exist"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.701673 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:15:57 crc kubenswrapper[4771]: E0219 23:15:57.702417 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b936b940-5d5a-4b62-8cd3-acff548506db" containerName="kube-state-metrics"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.702449 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b936b940-5d5a-4b62-8cd3-acff548506db" containerName="kube-state-metrics"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.702855 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b936b940-5d5a-4b62-8cd3-acff548506db" containerName="kube-state-metrics"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.704214 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.707533 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.707755 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.712051 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.855790 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.855871 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.855899 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.855998 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfpcg\" (UniqueName: \"kubernetes.io/projected/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-api-access-xfpcg\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.957539 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.957695 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.957780 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.957909 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfpcg\" (UniqueName: \"kubernetes.io/projected/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-api-access-xfpcg\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.963054 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.963212 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.963304 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:57 crc kubenswrapper[4771]: I0219 23:15:57.989638 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfpcg\" (UniqueName: \"kubernetes.io/projected/5ed8b8ea-3ead-422c-88dd-f7e8421f52b6-kube-api-access-xfpcg\") pod \"kube-state-metrics-0\" (UID: \"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.023694 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.318191 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.388950 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59f59d5469-nmcdd"]
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.389252 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" podUID="4deb3c04-8000-4b51-b811-5fed89da11f9" containerName="dnsmasq-dns" containerID="cri-o://a125b5df89feb4839f77a4bfae4c008ef9e001055736db446ca43ca7673360ce" gracePeriod=10
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.448676 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b936b940-5d5a-4b62-8cd3-acff548506db" path="/var/lib/kubelet/pods/b936b940-5d5a-4b62-8cd3-acff548506db/volumes"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.547127 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-564f74d4f7-gtwrv"]
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.552870 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.580425 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-564f74d4f7-gtwrv"]
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.619907 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:15:58 crc kubenswrapper[4771]: W0219 23:15:58.623552 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ed8b8ea_3ead_422c_88dd_f7e8421f52b6.slice/crio-06c9a5f3cea669ff15c564d9492aef5a8299294c7a5f86f4cc63bde08fd1590c WatchSource:0}: Error finding container 06c9a5f3cea669ff15c564d9492aef5a8299294c7a5f86f4cc63bde08fd1590c: Status 404 returned error can't find the container with id 06c9a5f3cea669ff15c564d9492aef5a8299294c7a5f86f4cc63bde08fd1590c
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.639165 4771 generic.go:334] "Generic (PLEG): container finished" podID="4deb3c04-8000-4b51-b811-5fed89da11f9" containerID="a125b5df89feb4839f77a4bfae4c008ef9e001055736db446ca43ca7673360ce" exitCode=0
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.639205 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" event={"ID":"4deb3c04-8000-4b51-b811-5fed89da11f9","Type":"ContainerDied","Data":"a125b5df89feb4839f77a4bfae4c008ef9e001055736db446ca43ca7673360ce"}
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.673768 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-ovsdbserver-nb\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.673858 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-config\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.673880 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-ovsdbserver-sb\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.673918 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-openstack-cell1\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.674070 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ww9l\" (UniqueName: \"kubernetes.io/projected/6182ac80-a485-411b-883b-56b2b74eb9c9-kube-api-access-2ww9l\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.674110 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-dns-svc\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.776386 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-ovsdbserver-nb\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.776764 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-config\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.776812 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-ovsdbserver-sb\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.776850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-openstack-cell1\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.776986 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ww9l\" (UniqueName: \"kubernetes.io/projected/6182ac80-a485-411b-883b-56b2b74eb9c9-kube-api-access-2ww9l\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.777044 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-dns-svc\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.778467 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-ovsdbserver-nb\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.778780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-dns-svc\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.778836 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-openstack-cell1\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.778893 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-ovsdbserver-sb\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.780162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6182ac80-a485-411b-883b-56b2b74eb9c9-config\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.796978 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ww9l\" (UniqueName: \"kubernetes.io/projected/6182ac80-a485-411b-883b-56b2b74eb9c9-kube-api-access-2ww9l\") pod \"dnsmasq-dns-564f74d4f7-gtwrv\" (UID: \"6182ac80-a485-411b-883b-56b2b74eb9c9\") " pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.831972 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.877999 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv"
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.980083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-nb\") pod \"4deb3c04-8000-4b51-b811-5fed89da11f9\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") "
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.980477 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-sb\") pod \"4deb3c04-8000-4b51-b811-5fed89da11f9\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") "
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.980519 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-config\") pod \"4deb3c04-8000-4b51-b811-5fed89da11f9\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") "
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.980543 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-dns-svc\") pod \"4deb3c04-8000-4b51-b811-5fed89da11f9\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") "
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.980799 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns278\" (UniqueName: \"kubernetes.io/projected/4deb3c04-8000-4b51-b811-5fed89da11f9-kube-api-access-ns278\") pod \"4deb3c04-8000-4b51-b811-5fed89da11f9\" (UID: \"4deb3c04-8000-4b51-b811-5fed89da11f9\") "
Feb 19 23:15:58 crc kubenswrapper[4771]: I0219 23:15:58.991380 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4deb3c04-8000-4b51-b811-5fed89da11f9-kube-api-access-ns278" (OuterVolumeSpecName: "kube-api-access-ns278") pod "4deb3c04-8000-4b51-b811-5fed89da11f9" (UID: "4deb3c04-8000-4b51-b811-5fed89da11f9"). InnerVolumeSpecName "kube-api-access-ns278". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.052631 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4deb3c04-8000-4b51-b811-5fed89da11f9" (UID: "4deb3c04-8000-4b51-b811-5fed89da11f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.062981 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4deb3c04-8000-4b51-b811-5fed89da11f9" (UID: "4deb3c04-8000-4b51-b811-5fed89da11f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.080848 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.081476 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="ceilometer-central-agent" containerID="cri-o://74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0" gracePeriod=30
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.081815 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="proxy-httpd" containerID="cri-o://e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a" gracePeriod=30
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.081974 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="sg-core" containerID="cri-o://8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858" gracePeriod=30
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.082060 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="ceilometer-notification-agent" containerID="cri-o://e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29" gracePeriod=30
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.113678 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns278\" (UniqueName: \"kubernetes.io/projected/4deb3c04-8000-4b51-b811-5fed89da11f9-kube-api-access-ns278\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.115132 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.115343 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.134466 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-config" (OuterVolumeSpecName: "config") pod "4deb3c04-8000-4b51-b811-5fed89da11f9" (UID: "4deb3c04-8000-4b51-b811-5fed89da11f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.150240 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4deb3c04-8000-4b51-b811-5fed89da11f9" (UID: "4deb3c04-8000-4b51-b811-5fed89da11f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.221266 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.221307 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4deb3c04-8000-4b51-b811-5fed89da11f9-config\") on node \"crc\" DevicePath \"\""
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.408546 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-564f74d4f7-gtwrv"]
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.650151 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv" event={"ID":"6182ac80-a485-411b-883b-56b2b74eb9c9","Type":"ContainerStarted","Data":"b297bd6951262bc694a462fcc4450e7a4ec7bb00b9ab111483e9b8610cce0c48"}
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.650208 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv" event={"ID":"6182ac80-a485-411b-883b-56b2b74eb9c9","Type":"ContainerStarted","Data":"434cb2dc23f3d2dbb7d5dc1d65231897296fd5a985933578c6bf29c24636458a"}
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.654106 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6","Type":"ContainerStarted","Data":"c60a1ea2400af56eaf179be019391dc7619a6197917a0ae5f13e1e144e409dd9"}
Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.654156 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ed8b8ea-3ead-422c-88dd-f7e8421f52b6","Type":"ContainerStarted","Data":"06c9a5f3cea669ff15c564d9492aef5a8299294c7a5f86f4cc63bde08fd1590c"}
Feb 19 23:15:59
crc kubenswrapper[4771]: I0219 23:15:59.654194 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.667664 4771 generic.go:334] "Generic (PLEG): container finished" podID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerID="e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a" exitCode=0 Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.667705 4771 generic.go:334] "Generic (PLEG): container finished" podID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerID="8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858" exitCode=2 Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.667712 4771 generic.go:334] "Generic (PLEG): container finished" podID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerID="74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0" exitCode=0 Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.667768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerDied","Data":"e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a"} Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.667854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerDied","Data":"8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858"} Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.667867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerDied","Data":"74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0"} Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.674088 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" 
event={"ID":"4deb3c04-8000-4b51-b811-5fed89da11f9","Type":"ContainerDied","Data":"489d8b9e7a5d3b5027f595993b83f403fdd83ad356dae895e14c6c9271139364"} Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.674140 4771 scope.go:117] "RemoveContainer" containerID="a125b5df89feb4839f77a4bfae4c008ef9e001055736db446ca43ca7673360ce" Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.674154 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59f59d5469-nmcdd" Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.700095 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.243403168 podStartE2EDuration="2.700080782s" podCreationTimestamp="2026-02-19 23:15:57 +0000 UTC" firstStartedPulling="2026-02-19 23:15:58.625245488 +0000 UTC m=+6458.896687958" lastFinishedPulling="2026-02-19 23:15:59.081923102 +0000 UTC m=+6459.353365572" observedRunningTime="2026-02-19 23:15:59.687174348 +0000 UTC m=+6459.958616848" watchObservedRunningTime="2026-02-19 23:15:59.700080782 +0000 UTC m=+6459.971523262" Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.724441 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59f59d5469-nmcdd"] Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.727996 4771 scope.go:117] "RemoveContainer" containerID="ee259689c0dd1c5f2195500575a146ea48e5b1ee33a49c370c0f3a578c9777c4" Feb 19 23:15:59 crc kubenswrapper[4771]: I0219 23:15:59.734791 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59f59d5469-nmcdd"] Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.391101 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.446821 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-sg-core-conf-yaml\") pod \"753edc49-4dfc-458f-bf7c-f14cd632f21d\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.446910 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-config-data\") pod \"753edc49-4dfc-458f-bf7c-f14cd632f21d\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.446972 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-scripts\") pod \"753edc49-4dfc-458f-bf7c-f14cd632f21d\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.447013 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-log-httpd\") pod \"753edc49-4dfc-458f-bf7c-f14cd632f21d\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.447075 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-run-httpd\") pod \"753edc49-4dfc-458f-bf7c-f14cd632f21d\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.447106 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-combined-ca-bundle\") pod \"753edc49-4dfc-458f-bf7c-f14cd632f21d\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.447167 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnpfg\" (UniqueName: \"kubernetes.io/projected/753edc49-4dfc-458f-bf7c-f14cd632f21d-kube-api-access-cnpfg\") pod \"753edc49-4dfc-458f-bf7c-f14cd632f21d\" (UID: \"753edc49-4dfc-458f-bf7c-f14cd632f21d\") " Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.447770 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "753edc49-4dfc-458f-bf7c-f14cd632f21d" (UID: "753edc49-4dfc-458f-bf7c-f14cd632f21d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.448070 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "753edc49-4dfc-458f-bf7c-f14cd632f21d" (UID: "753edc49-4dfc-458f-bf7c-f14cd632f21d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.468304 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4deb3c04-8000-4b51-b811-5fed89da11f9" path="/var/lib/kubelet/pods/4deb3c04-8000-4b51-b811-5fed89da11f9/volumes" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.471204 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-scripts" (OuterVolumeSpecName: "scripts") pod "753edc49-4dfc-458f-bf7c-f14cd632f21d" (UID: "753edc49-4dfc-458f-bf7c-f14cd632f21d"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.484195 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753edc49-4dfc-458f-bf7c-f14cd632f21d-kube-api-access-cnpfg" (OuterVolumeSpecName: "kube-api-access-cnpfg") pod "753edc49-4dfc-458f-bf7c-f14cd632f21d" (UID: "753edc49-4dfc-458f-bf7c-f14cd632f21d"). InnerVolumeSpecName "kube-api-access-cnpfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.507354 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "753edc49-4dfc-458f-bf7c-f14cd632f21d" (UID: "753edc49-4dfc-458f-bf7c-f14cd632f21d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.550098 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.550133 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.550145 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/753edc49-4dfc-458f-bf7c-f14cd632f21d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.550160 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnpfg\" (UniqueName: 
\"kubernetes.io/projected/753edc49-4dfc-458f-bf7c-f14cd632f21d-kube-api-access-cnpfg\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.550174 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.581116 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "753edc49-4dfc-458f-bf7c-f14cd632f21d" (UID: "753edc49-4dfc-458f-bf7c-f14cd632f21d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.598871 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-config-data" (OuterVolumeSpecName: "config-data") pod "753edc49-4dfc-458f-bf7c-f14cd632f21d" (UID: "753edc49-4dfc-458f-bf7c-f14cd632f21d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.652051 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.652088 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753edc49-4dfc-458f-bf7c-f14cd632f21d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.683528 4771 generic.go:334] "Generic (PLEG): container finished" podID="6182ac80-a485-411b-883b-56b2b74eb9c9" containerID="b297bd6951262bc694a462fcc4450e7a4ec7bb00b9ab111483e9b8610cce0c48" exitCode=0 Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.683600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv" event={"ID":"6182ac80-a485-411b-883b-56b2b74eb9c9","Type":"ContainerDied","Data":"b297bd6951262bc694a462fcc4450e7a4ec7bb00b9ab111483e9b8610cce0c48"} Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.686424 4771 generic.go:334] "Generic (PLEG): container finished" podID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerID="e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29" exitCode=0 Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.686615 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.687100 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerDied","Data":"e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29"} Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.687219 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"753edc49-4dfc-458f-bf7c-f14cd632f21d","Type":"ContainerDied","Data":"9fb4ccc1f9306bb5a710e6ad2038d4b91f370e936089bff2f84d4e1ab5e72198"} Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.687256 4771 scope.go:117] "RemoveContainer" containerID="e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.712120 4771 scope.go:117] "RemoveContainer" containerID="8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.736181 4771 scope.go:117] "RemoveContainer" containerID="e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.744540 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.764586 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.778565 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.779032 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4deb3c04-8000-4b51-b811-5fed89da11f9" containerName="init" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779048 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4deb3c04-8000-4b51-b811-5fed89da11f9" containerName="init" 
Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.779059 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4deb3c04-8000-4b51-b811-5fed89da11f9" containerName="dnsmasq-dns" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779066 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4deb3c04-8000-4b51-b811-5fed89da11f9" containerName="dnsmasq-dns" Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.779085 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="ceilometer-notification-agent" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779091 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="ceilometer-notification-agent" Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.779109 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="sg-core" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779115 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="sg-core" Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.779128 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="ceilometer-central-agent" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779133 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="ceilometer-central-agent" Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.779149 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="proxy-httpd" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779155 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="proxy-httpd" 
Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779340 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="proxy-httpd" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779360 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4deb3c04-8000-4b51-b811-5fed89da11f9" containerName="dnsmasq-dns" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779372 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="ceilometer-central-agent" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779384 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="sg-core" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.779395 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" containerName="ceilometer-notification-agent" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.784738 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.787204 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.787408 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.787453 4771 scope.go:117] "RemoveContainer" containerID="74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.789763 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.798763 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.828461 4771 scope.go:117] "RemoveContainer" containerID="e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a" Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.829370 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a\": container with ID starting with e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a not found: ID does not exist" containerID="e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.829397 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a"} err="failed to get container status \"e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a\": rpc error: code = NotFound desc = could not find container \"e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a\": 
container with ID starting with e236013cd22a8b8d3a5a0c54c510ffc5105454d4e8bb694e221561453b12ec2a not found: ID does not exist" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.829415 4771 scope.go:117] "RemoveContainer" containerID="8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858" Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.829683 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858\": container with ID starting with 8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858 not found: ID does not exist" containerID="8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.829703 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858"} err="failed to get container status \"8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858\": rpc error: code = NotFound desc = could not find container \"8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858\": container with ID starting with 8ada4ee3b096d908fe687e01495cf226326a08b0df41db071fe684d0f84a1858 not found: ID does not exist" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.829715 4771 scope.go:117] "RemoveContainer" containerID="e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29" Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.829964 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29\": container with ID starting with e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29 not found: ID does not exist" 
containerID="e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.829982 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29"} err="failed to get container status \"e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29\": rpc error: code = NotFound desc = could not find container \"e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29\": container with ID starting with e51d42b0bc3b178bdfcb0c9ba7a7a45217b28f6ead448320b11879fd9c6d3c29 not found: ID does not exist" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.829993 4771 scope.go:117] "RemoveContainer" containerID="74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0" Feb 19 23:16:00 crc kubenswrapper[4771]: E0219 23:16:00.830335 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0\": container with ID starting with 74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0 not found: ID does not exist" containerID="74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.830364 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0"} err="failed to get container status \"74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0\": rpc error: code = NotFound desc = could not find container \"74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0\": container with ID starting with 74534e9c5095f09cb447e733038dee5bfa72e588572d7190fc48162f187bcae0 not found: ID does not exist" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.964447 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-run-httpd\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.964506 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nz6q\" (UniqueName: \"kubernetes.io/projected/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-kube-api-access-7nz6q\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.964525 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.964557 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-config-data\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.964594 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.964633 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.964660 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-log-httpd\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:00 crc kubenswrapper[4771]: I0219 23:16:00.964707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-scripts\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.066776 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-run-httpd\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.066862 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nz6q\" (UniqueName: \"kubernetes.io/projected/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-kube-api-access-7nz6q\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.066893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " 
pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.066926 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-config-data\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.066971 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.067042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.067086 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-log-httpd\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.067296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-run-httpd\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.067534 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-log-httpd\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.067596 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-scripts\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.072865 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.073062 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-scripts\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.073234 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-config-data\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.075495 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.078959 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.084407 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nz6q\" (UniqueName: \"kubernetes.io/projected/88c6e800-2233-40b4-aab1-b4a7ac0fbb13-kube-api-access-7nz6q\") pod \"ceilometer-0\" (UID: \"88c6e800-2233-40b4-aab1-b4a7ac0fbb13\") " pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.131144 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.637728 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:16:01 crc kubenswrapper[4771]: W0219 23:16:01.652261 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88c6e800_2233_40b4_aab1_b4a7ac0fbb13.slice/crio-7b2a9461b6fec56a796cdf348e2b2fbc0b4d23da08f0ec615c9838d337c0c5a4 WatchSource:0}: Error finding container 7b2a9461b6fec56a796cdf348e2b2fbc0b4d23da08f0ec615c9838d337c0c5a4: Status 404 returned error can't find the container with id 7b2a9461b6fec56a796cdf348e2b2fbc0b4d23da08f0ec615c9838d337c0c5a4 Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.706294 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv" event={"ID":"6182ac80-a485-411b-883b-56b2b74eb9c9","Type":"ContainerStarted","Data":"7d2fb4c308c435df24b32e64fe33135aa528243750797df4834e89a4a32ed10d"} Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.706405 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv" Feb 19 23:16:01 crc 
kubenswrapper[4771]: I0219 23:16:01.707844 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88c6e800-2233-40b4-aab1-b4a7ac0fbb13","Type":"ContainerStarted","Data":"7b2a9461b6fec56a796cdf348e2b2fbc0b4d23da08f0ec615c9838d337c0c5a4"} Feb 19 23:16:01 crc kubenswrapper[4771]: I0219 23:16:01.739824 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv" podStartSLOduration=3.739798229 podStartE2EDuration="3.739798229s" podCreationTimestamp="2026-02-19 23:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:16:01.730240644 +0000 UTC m=+6462.001683134" watchObservedRunningTime="2026-02-19 23:16:01.739798229 +0000 UTC m=+6462.011240729" Feb 19 23:16:02 crc kubenswrapper[4771]: I0219 23:16:02.455796 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753edc49-4dfc-458f-bf7c-f14cd632f21d" path="/var/lib/kubelet/pods/753edc49-4dfc-458f-bf7c-f14cd632f21d/volumes" Feb 19 23:16:02 crc kubenswrapper[4771]: I0219 23:16:02.739045 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88c6e800-2233-40b4-aab1-b4a7ac0fbb13","Type":"ContainerStarted","Data":"79a1777560da1332564b4b7579762350d46812ab3ead629a1ea3cb83abed0427"} Feb 19 23:16:03 crc kubenswrapper[4771]: I0219 23:16:03.750262 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88c6e800-2233-40b4-aab1-b4a7ac0fbb13","Type":"ContainerStarted","Data":"e011bc7901c5a3092f1a05c229a62b26dbfe2b70496ead3416bce3b25b247ba7"} Feb 19 23:16:04 crc kubenswrapper[4771]: I0219 23:16:04.776684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88c6e800-2233-40b4-aab1-b4a7ac0fbb13","Type":"ContainerStarted","Data":"235fc7cf0991f9d9e8afcbeb5161197e2adb99f55bd25a59deb6808a0411a421"} Feb 
19 23:16:05 crc kubenswrapper[4771]: I0219 23:16:05.786901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88c6e800-2233-40b4-aab1-b4a7ac0fbb13","Type":"ContainerStarted","Data":"c10eafa0421de1fba2d05931b9f85adcbe4cf5f61f35060e81029c67f0cad0b5"} Feb 19 23:16:05 crc kubenswrapper[4771]: I0219 23:16:05.788859 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 23:16:05 crc kubenswrapper[4771]: I0219 23:16:05.821807 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.332265452 podStartE2EDuration="5.82178795s" podCreationTimestamp="2026-02-19 23:16:00 +0000 UTC" firstStartedPulling="2026-02-19 23:16:01.663916185 +0000 UTC m=+6461.935358695" lastFinishedPulling="2026-02-19 23:16:05.153438733 +0000 UTC m=+6465.424881193" observedRunningTime="2026-02-19 23:16:05.818596576 +0000 UTC m=+6466.090039076" watchObservedRunningTime="2026-02-19 23:16:05.82178795 +0000 UTC m=+6466.093230420" Feb 19 23:16:06 crc kubenswrapper[4771]: I0219 23:16:06.057155 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vwf8n"] Feb 19 23:16:06 crc kubenswrapper[4771]: I0219 23:16:06.079302 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-ccv4j"] Feb 19 23:16:06 crc kubenswrapper[4771]: I0219 23:16:06.090561 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-ccv4j"] Feb 19 23:16:06 crc kubenswrapper[4771]: I0219 23:16:06.100146 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vwf8n"] Feb 19 23:16:06 crc kubenswrapper[4771]: I0219 23:16:06.452877 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7009bd32-7008-4634-8fb5-650244264fbf" path="/var/lib/kubelet/pods/7009bd32-7008-4634-8fb5-650244264fbf/volumes" Feb 19 23:16:06 crc kubenswrapper[4771]: 
I0219 23:16:06.455240 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9711ee8d-1ba8-43b1-ac70-c07eda069724" path="/var/lib/kubelet/pods/9711ee8d-1ba8-43b1-ac70-c07eda069724/volumes" Feb 19 23:16:07 crc kubenswrapper[4771]: I0219 23:16:07.042412 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ae92-account-create-update-f86k2"] Feb 19 23:16:07 crc kubenswrapper[4771]: I0219 23:16:07.054646 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2d41-account-create-update-flb7v"] Feb 19 23:16:07 crc kubenswrapper[4771]: I0219 23:16:07.064329 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2d41-account-create-update-flb7v"] Feb 19 23:16:07 crc kubenswrapper[4771]: I0219 23:16:07.075300 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-bm85p"] Feb 19 23:16:07 crc kubenswrapper[4771]: I0219 23:16:07.086784 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ae92-account-create-update-f86k2"] Feb 19 23:16:07 crc kubenswrapper[4771]: I0219 23:16:07.094750 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-03a8-account-create-update-6fqt6"] Feb 19 23:16:07 crc kubenswrapper[4771]: I0219 23:16:07.102981 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bm85p"] Feb 19 23:16:07 crc kubenswrapper[4771]: I0219 23:16:07.110602 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-03a8-account-create-update-6fqt6"] Feb 19 23:16:08 crc kubenswrapper[4771]: I0219 23:16:08.042971 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 23:16:08 crc kubenswrapper[4771]: I0219 23:16:08.475609 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59024c00-6963-4ade-bff9-6a0b81a98369" 
path="/var/lib/kubelet/pods/59024c00-6963-4ade-bff9-6a0b81a98369/volumes" Feb 19 23:16:08 crc kubenswrapper[4771]: I0219 23:16:08.478911 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65078338-c90f-4ad7-b419-c72272042cde" path="/var/lib/kubelet/pods/65078338-c90f-4ad7-b419-c72272042cde/volumes" Feb 19 23:16:08 crc kubenswrapper[4771]: I0219 23:16:08.481697 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70523008-772b-455a-9931-8ed6ea6dc6b0" path="/var/lib/kubelet/pods/70523008-772b-455a-9931-8ed6ea6dc6b0/volumes" Feb 19 23:16:08 crc kubenswrapper[4771]: I0219 23:16:08.483654 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ca9a86-ec44-44d2-b792-c8e14f43b0cc" path="/var/lib/kubelet/pods/c7ca9a86-ec44-44d2-b792-c8e14f43b0cc/volumes" Feb 19 23:16:08 crc kubenswrapper[4771]: I0219 23:16:08.880577 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-564f74d4f7-gtwrv" Feb 19 23:16:08 crc kubenswrapper[4771]: I0219 23:16:08.970797 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbc456f59-z9cr5"] Feb 19 23:16:08 crc kubenswrapper[4771]: I0219 23:16:08.971080 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" podUID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" containerName="dnsmasq-dns" containerID="cri-o://9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c" gracePeriod=10 Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.451058 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.562309 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-nb\") pod \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.562655 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-dns-svc\") pod \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.562689 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2krqv\" (UniqueName: \"kubernetes.io/projected/70205606-64b4-4e9e-a3c5-c8e94cda8d20-kube-api-access-2krqv\") pod \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.562743 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-config\") pod \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.562814 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-openstack-cell1\") pod \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.562864 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-sb\") pod \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\" (UID: \"70205606-64b4-4e9e-a3c5-c8e94cda8d20\") " Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.569300 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70205606-64b4-4e9e-a3c5-c8e94cda8d20-kube-api-access-2krqv" (OuterVolumeSpecName: "kube-api-access-2krqv") pod "70205606-64b4-4e9e-a3c5-c8e94cda8d20" (UID: "70205606-64b4-4e9e-a3c5-c8e94cda8d20"). InnerVolumeSpecName "kube-api-access-2krqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.617621 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "70205606-64b4-4e9e-a3c5-c8e94cda8d20" (UID: "70205606-64b4-4e9e-a3c5-c8e94cda8d20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.623272 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "70205606-64b4-4e9e-a3c5-c8e94cda8d20" (UID: "70205606-64b4-4e9e-a3c5-c8e94cda8d20"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.631935 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-config" (OuterVolumeSpecName: "config") pod "70205606-64b4-4e9e-a3c5-c8e94cda8d20" (UID: "70205606-64b4-4e9e-a3c5-c8e94cda8d20"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.638684 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70205606-64b4-4e9e-a3c5-c8e94cda8d20" (UID: "70205606-64b4-4e9e-a3c5-c8e94cda8d20"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.640102 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "70205606-64b4-4e9e-a3c5-c8e94cda8d20" (UID: "70205606-64b4-4e9e-a3c5-c8e94cda8d20"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.666181 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.666239 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2krqv\" (UniqueName: \"kubernetes.io/projected/70205606-64b4-4e9e-a3c5-c8e94cda8d20-kube-api-access-2krqv\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.666261 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.666279 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:09 crc 
kubenswrapper[4771]: I0219 23:16:09.666296 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.666315 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70205606-64b4-4e9e-a3c5-c8e94cda8d20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.834211 4771 generic.go:334] "Generic (PLEG): container finished" podID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" containerID="9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c" exitCode=0 Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.834256 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" event={"ID":"70205606-64b4-4e9e-a3c5-c8e94cda8d20","Type":"ContainerDied","Data":"9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c"} Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.834289 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" event={"ID":"70205606-64b4-4e9e-a3c5-c8e94cda8d20","Type":"ContainerDied","Data":"9532479d7397d17fb02e00b5eaaf06fd1a9b760eebf37763f9400d60d5c69278"} Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.834287 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbc456f59-z9cr5" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.834308 4771 scope.go:117] "RemoveContainer" containerID="9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.854285 4771 scope.go:117] "RemoveContainer" containerID="34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.876786 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbc456f59-z9cr5"] Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.886204 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbc456f59-z9cr5"] Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.889081 4771 scope.go:117] "RemoveContainer" containerID="9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c" Feb 19 23:16:09 crc kubenswrapper[4771]: E0219 23:16:09.889503 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c\": container with ID starting with 9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c not found: ID does not exist" containerID="9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.889533 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c"} err="failed to get container status \"9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c\": rpc error: code = NotFound desc = could not find container \"9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c\": container with ID starting with 9a92c6eee4e69c0f5f1c36cbd0cd7663a136055dcb026e4d2b8664f8eece750c not found: ID does not exist" Feb 19 
23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.889555 4771 scope.go:117] "RemoveContainer" containerID="34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26" Feb 19 23:16:09 crc kubenswrapper[4771]: E0219 23:16:09.889879 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26\": container with ID starting with 34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26 not found: ID does not exist" containerID="34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26" Feb 19 23:16:09 crc kubenswrapper[4771]: I0219 23:16:09.889916 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26"} err="failed to get container status \"34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26\": rpc error: code = NotFound desc = could not find container \"34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26\": container with ID starting with 34ceb630cc5cbb169547c5d19a7d41f4b11bdb8822a7a822394ed64f527b8d26 not found: ID does not exist" Feb 19 23:16:10 crc kubenswrapper[4771]: I0219 23:16:10.457376 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" path="/var/lib/kubelet/pods/70205606-64b4-4e9e-a3c5-c8e94cda8d20/volumes" Feb 19 23:16:12 crc kubenswrapper[4771]: I0219 23:16:12.957734 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:16:12 crc kubenswrapper[4771]: I0219 23:16:12.958552 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:16:16 crc kubenswrapper[4771]: I0219 23:16:16.050944 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fxqj4"] Feb 19 23:16:16 crc kubenswrapper[4771]: I0219 23:16:16.066038 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fxqj4"] Feb 19 23:16:16 crc kubenswrapper[4771]: I0219 23:16:16.455755 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d642f3-ad84-4cc5-b273-e790472f40fe" path="/var/lib/kubelet/pods/71d642f3-ad84-4cc5-b273-e790472f40fe/volumes" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.429831 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp"] Feb 19 23:16:19 crc kubenswrapper[4771]: E0219 23:16:19.431062 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" containerName="dnsmasq-dns" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.431083 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" containerName="dnsmasq-dns" Feb 19 23:16:19 crc kubenswrapper[4771]: E0219 23:16:19.431098 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" containerName="init" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.431106 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" containerName="init" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.431389 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="70205606-64b4-4e9e-a3c5-c8e94cda8d20" 
containerName="dnsmasq-dns" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.432322 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.434167 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.443151 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.443348 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.444818 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.447122 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp"] Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.553014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.553125 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj9nn\" (UniqueName: \"kubernetes.io/projected/93edb2e0-5438-4b84-80f0-835768c89a64-kube-api-access-fj9nn\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: 
\"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.555751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.556108 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.658400 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.658534 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj9nn\" (UniqueName: \"kubernetes.io/projected/93edb2e0-5438-4b84-80f0-835768c89a64-kube-api-access-fj9nn\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.658620 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.658656 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.666920 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.667657 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.668298 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.684776 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj9nn\" (UniqueName: \"kubernetes.io/projected/93edb2e0-5438-4b84-80f0-835768c89a64-kube-api-access-fj9nn\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:19 crc kubenswrapper[4771]: I0219 23:16:19.768565 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:20 crc kubenswrapper[4771]: I0219 23:16:20.682165 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp"] Feb 19 23:16:20 crc kubenswrapper[4771]: W0219 23:16:20.689732 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93edb2e0_5438_4b84_80f0_835768c89a64.slice/crio-a4d0b61346f8f0991ea11f88e822c7d598c17aded8210a074440523ce4570510 WatchSource:0}: Error finding container a4d0b61346f8f0991ea11f88e822c7d598c17aded8210a074440523ce4570510: Status 404 returned error can't find the container with id a4d0b61346f8f0991ea11f88e822c7d598c17aded8210a074440523ce4570510 Feb 19 23:16:20 crc kubenswrapper[4771]: I0219 23:16:20.983717 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" event={"ID":"93edb2e0-5438-4b84-80f0-835768c89a64","Type":"ContainerStarted","Data":"a4d0b61346f8f0991ea11f88e822c7d598c17aded8210a074440523ce4570510"} Feb 19 23:16:31 crc kubenswrapper[4771]: I0219 23:16:31.314579 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 23:16:32 crc kubenswrapper[4771]: I0219 23:16:32.499262 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:16:33 crc kubenswrapper[4771]: I0219 23:16:33.133094 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" event={"ID":"93edb2e0-5438-4b84-80f0-835768c89a64","Type":"ContainerStarted","Data":"0257c8877e949cf9f53618364db71c3af0482cd0f98e1fae571fe6b29b9cd941"} Feb 19 23:16:33 crc kubenswrapper[4771]: I0219 23:16:33.174185 4771 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" podStartSLOduration=2.371531698 podStartE2EDuration="14.174165343s" podCreationTimestamp="2026-02-19 23:16:19 +0000 UTC" firstStartedPulling="2026-02-19 23:16:20.693255596 +0000 UTC m=+6480.964698076" lastFinishedPulling="2026-02-19 23:16:32.495889251 +0000 UTC m=+6492.767331721" observedRunningTime="2026-02-19 23:16:33.160867749 +0000 UTC m=+6493.432310229" watchObservedRunningTime="2026-02-19 23:16:33.174165343 +0000 UTC m=+6493.445607823" Feb 19 23:16:34 crc kubenswrapper[4771]: I0219 23:16:34.069079 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggftl"] Feb 19 23:16:34 crc kubenswrapper[4771]: I0219 23:16:34.085318 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggftl"] Feb 19 23:16:34 crc kubenswrapper[4771]: I0219 23:16:34.468462 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93356608-bac2-4fee-bcf1-e4ec814ddfe4" path="/var/lib/kubelet/pods/93356608-bac2-4fee-bcf1-e4ec814ddfe4/volumes" Feb 19 23:16:35 crc kubenswrapper[4771]: I0219 23:16:35.037346 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvzsn"] Feb 19 23:16:35 crc kubenswrapper[4771]: I0219 23:16:35.047648 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvzsn"] Feb 19 23:16:36 crc kubenswrapper[4771]: I0219 23:16:36.451239 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303" path="/var/lib/kubelet/pods/0aa1e3e9-d4c6-4ba6-90ea-1a76af18e303/volumes" Feb 19 23:16:42 crc kubenswrapper[4771]: I0219 23:16:42.957148 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:16:42 crc kubenswrapper[4771]: I0219 23:16:42.957786 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.211810 4771 scope.go:117] "RemoveContainer" containerID="7264dd981f72e749a7b9fd57314d94e5de584b191b6b1bb796ddc61772f0220e" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.270348 4771 scope.go:117] "RemoveContainer" containerID="5776aa983426e70ecaeb5a0ee1155a2d0bb3c9d678b8871b4bb991d51a156be1" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.355802 4771 scope.go:117] "RemoveContainer" containerID="ce210e944b894fb5354050822e3053e5657f70859f0203fdc6e4af88ec19a242" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.397980 4771 scope.go:117] "RemoveContainer" containerID="a7b4fc49ac89cca91c577f95232c0a2ec49f882f49fc8d7230b7beddf06d5778" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.448397 4771 scope.go:117] "RemoveContainer" containerID="ff3cac6c7cdb99823f9f21144e816d2f08185e42098aa21aadb58acc71bd559b" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.500578 4771 scope.go:117] "RemoveContainer" containerID="9200c4d3453b5ec6a86e456695e4c48f952e38fc11fc04bf30bc0f07bb9b77fa" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.560373 4771 scope.go:117] "RemoveContainer" containerID="6768f57661f1581c1f917fdd40b1309b3e1434312a68016421a19bf86837b3c3" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.596486 4771 scope.go:117] "RemoveContainer" containerID="33af11130d84c5f08c7b8badbd17fcea2fcf547807a7391f0de991f5658f71b1" Feb 19 23:16:43 crc kubenswrapper[4771]: I0219 23:16:43.637281 4771 
scope.go:117] "RemoveContainer" containerID="13aac0fa1273c849d61dda255befe006e51a51e3cb45e5a3ce32d2f1fd1345e4" Feb 19 23:16:46 crc kubenswrapper[4771]: I0219 23:16:46.333091 4771 generic.go:334] "Generic (PLEG): container finished" podID="93edb2e0-5438-4b84-80f0-835768c89a64" containerID="0257c8877e949cf9f53618364db71c3af0482cd0f98e1fae571fe6b29b9cd941" exitCode=0 Feb 19 23:16:46 crc kubenswrapper[4771]: I0219 23:16:46.333161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" event={"ID":"93edb2e0-5438-4b84-80f0-835768c89a64","Type":"ContainerDied","Data":"0257c8877e949cf9f53618364db71c3af0482cd0f98e1fae571fe6b29b9cd941"} Feb 19 23:16:47 crc kubenswrapper[4771]: I0219 23:16:47.894830 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:47 crc kubenswrapper[4771]: I0219 23:16:47.965488 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj9nn\" (UniqueName: \"kubernetes.io/projected/93edb2e0-5438-4b84-80f0-835768c89a64-kube-api-access-fj9nn\") pod \"93edb2e0-5438-4b84-80f0-835768c89a64\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " Feb 19 23:16:47 crc kubenswrapper[4771]: I0219 23:16:47.965568 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-pre-adoption-validation-combined-ca-bundle\") pod \"93edb2e0-5438-4b84-80f0-835768c89a64\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " Feb 19 23:16:47 crc kubenswrapper[4771]: I0219 23:16:47.965726 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-inventory\") pod 
\"93edb2e0-5438-4b84-80f0-835768c89a64\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " Feb 19 23:16:47 crc kubenswrapper[4771]: I0219 23:16:47.965799 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-ssh-key-openstack-cell1\") pod \"93edb2e0-5438-4b84-80f0-835768c89a64\" (UID: \"93edb2e0-5438-4b84-80f0-835768c89a64\") " Feb 19 23:16:47 crc kubenswrapper[4771]: I0219 23:16:47.971758 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "93edb2e0-5438-4b84-80f0-835768c89a64" (UID: "93edb2e0-5438-4b84-80f0-835768c89a64"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:16:47 crc kubenswrapper[4771]: I0219 23:16:47.972539 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93edb2e0-5438-4b84-80f0-835768c89a64-kube-api-access-fj9nn" (OuterVolumeSpecName: "kube-api-access-fj9nn") pod "93edb2e0-5438-4b84-80f0-835768c89a64" (UID: "93edb2e0-5438-4b84-80f0-835768c89a64"). InnerVolumeSpecName "kube-api-access-fj9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.005392 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-inventory" (OuterVolumeSpecName: "inventory") pod "93edb2e0-5438-4b84-80f0-835768c89a64" (UID: "93edb2e0-5438-4b84-80f0-835768c89a64"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.008314 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "93edb2e0-5438-4b84-80f0-835768c89a64" (UID: "93edb2e0-5438-4b84-80f0-835768c89a64"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.089916 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj9nn\" (UniqueName: \"kubernetes.io/projected/93edb2e0-5438-4b84-80f0-835768c89a64-kube-api-access-fj9nn\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.089982 4771 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.090086 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.090117 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/93edb2e0-5438-4b84-80f0-835768c89a64-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.368718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" 
event={"ID":"93edb2e0-5438-4b84-80f0-835768c89a64","Type":"ContainerDied","Data":"a4d0b61346f8f0991ea11f88e822c7d598c17aded8210a074440523ce4570510"} Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.368770 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4d0b61346f8f0991ea11f88e822c7d598c17aded8210a074440523ce4570510" Feb 19 23:16:48 crc kubenswrapper[4771]: I0219 23:16:48.368832 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.054300 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-knpb5"] Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.066411 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-knpb5"] Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.455525 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de417c6b-799d-41a8-8c43-795e1868759e" path="/var/lib/kubelet/pods/de417c6b-799d-41a8-8c43-795e1868759e/volumes" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.941450 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8"] Feb 19 23:16:52 crc kubenswrapper[4771]: E0219 23:16:52.942135 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93edb2e0-5438-4b84-80f0-835768c89a64" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.942154 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="93edb2e0-5438-4b84-80f0-835768c89a64" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.942393 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="93edb2e0-5438-4b84-80f0-835768c89a64" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.943105 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.951253 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.951326 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.953538 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.953626 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:16:52 crc kubenswrapper[4771]: I0219 23:16:52.981549 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8"] Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.040403 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtcmt\" (UniqueName: \"kubernetes.io/projected/a1aa1c24-5f3f-4177-a132-a34da7c18d33-kube-api-access-gtcmt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.040575 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-inventory\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.040724 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.040912 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.143070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.143131 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.143200 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.143312 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtcmt\" (UniqueName: \"kubernetes.io/projected/a1aa1c24-5f3f-4177-a132-a34da7c18d33-kube-api-access-gtcmt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.150342 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.151367 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.152737 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.168619 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtcmt\" (UniqueName: \"kubernetes.io/projected/a1aa1c24-5f3f-4177-a132-a34da7c18d33-kube-api-access-gtcmt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.267013 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" Feb 19 23:16:53 crc kubenswrapper[4771]: I0219 23:16:53.843771 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8"] Feb 19 23:16:54 crc kubenswrapper[4771]: I0219 23:16:54.455893 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" event={"ID":"a1aa1c24-5f3f-4177-a132-a34da7c18d33","Type":"ContainerStarted","Data":"0b9536630487b5837b65f2008aea091f1ad32ea068a7bcab7b739d6933a3207a"} Feb 19 23:16:55 crc kubenswrapper[4771]: I0219 23:16:55.458693 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" event={"ID":"a1aa1c24-5f3f-4177-a132-a34da7c18d33","Type":"ContainerStarted","Data":"3871bfd4108f8ecd85fca5560989ca6ca8941ce3ed25403c87c3d1adac629e68"} Feb 19 23:16:55 crc kubenswrapper[4771]: I0219 23:16:55.488113 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" podStartSLOduration=3.074433668 podStartE2EDuration="3.488092636s" podCreationTimestamp="2026-02-19 23:16:52 +0000 UTC" firstStartedPulling="2026-02-19 23:16:53.835322475 +0000 UTC m=+6514.106764945" lastFinishedPulling="2026-02-19 23:16:54.248981433 +0000 UTC m=+6514.520423913" observedRunningTime="2026-02-19 23:16:55.482264351 +0000 UTC m=+6515.753706821" watchObservedRunningTime="2026-02-19 23:16:55.488092636 +0000 UTC m=+6515.759535096" Feb 19 23:17:12 crc kubenswrapper[4771]: I0219 23:17:12.957145 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:17:12 crc kubenswrapper[4771]: I0219 23:17:12.957846 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:17:12 crc kubenswrapper[4771]: I0219 23:17:12.957910 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 23:17:12 crc kubenswrapper[4771]: I0219 23:17:12.959054 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"686e92da4ca4cf49dcd362c98b7bf5eea623f83136fe222a54561367030f55e3"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:17:12 crc kubenswrapper[4771]: I0219 23:17:12.959151 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://686e92da4ca4cf49dcd362c98b7bf5eea623f83136fe222a54561367030f55e3" gracePeriod=600 Feb 19 23:17:13 crc kubenswrapper[4771]: I0219 23:17:13.757136 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="686e92da4ca4cf49dcd362c98b7bf5eea623f83136fe222a54561367030f55e3" exitCode=0 Feb 19 23:17:13 crc kubenswrapper[4771]: I0219 23:17:13.757226 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"686e92da4ca4cf49dcd362c98b7bf5eea623f83136fe222a54561367030f55e3"} Feb 19 23:17:13 crc kubenswrapper[4771]: I0219 23:17:13.757534 4771 scope.go:117] "RemoveContainer" containerID="320cd2db3ec270b47fb112aeb0ceb8c5f265cd948fb76dfe2eca3b1f04969277" Feb 19 23:17:14 crc kubenswrapper[4771]: I0219 23:17:14.776350 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb"} Feb 19 23:17:44 crc kubenswrapper[4771]: I0219 23:17:44.078438 4771 scope.go:117] "RemoveContainer" containerID="bbc5b958f999b0cf68b3753ba45fcd8a7baaa22f7b0c53b91938808f3b7330be" Feb 19 23:17:44 crc kubenswrapper[4771]: I0219 23:17:44.389360 4771 scope.go:117] "RemoveContainer" containerID="a37c6fd0a20e5c0d77c5b847b38f566c682d7b51929be901c12c4f7a2f3910ae" Feb 19 23:17:44 crc kubenswrapper[4771]: I0219 23:17:44.441770 4771 scope.go:117] "RemoveContainer" containerID="dd48c29a601dba4deff2490fc19867ca5e31ed1cac63daf877fdf70dae8de0e3" Feb 19 23:18:35 crc 
kubenswrapper[4771]: I0219 23:18:35.057439 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-nj457"] Feb 19 23:18:35 crc kubenswrapper[4771]: I0219 23:18:35.076078 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-nj457"] Feb 19 23:18:36 crc kubenswrapper[4771]: I0219 23:18:36.054440 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-98eb-account-create-update-jvwgt"] Feb 19 23:18:36 crc kubenswrapper[4771]: I0219 23:18:36.070604 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-98eb-account-create-update-jvwgt"] Feb 19 23:18:36 crc kubenswrapper[4771]: I0219 23:18:36.451153 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5147a669-e152-45ec-80cb-a7318eac23a1" path="/var/lib/kubelet/pods/5147a669-e152-45ec-80cb-a7318eac23a1/volumes" Feb 19 23:18:36 crc kubenswrapper[4771]: I0219 23:18:36.452655 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575155d0-ff3a-4e31-9f9d-e7005ec5b5cb" path="/var/lib/kubelet/pods/575155d0-ff3a-4e31-9f9d-e7005ec5b5cb/volumes" Feb 19 23:18:41 crc kubenswrapper[4771]: I0219 23:18:41.047974 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-trz9l"] Feb 19 23:18:41 crc kubenswrapper[4771]: I0219 23:18:41.064099 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-trz9l"] Feb 19 23:18:42 crc kubenswrapper[4771]: I0219 23:18:42.038947 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-69c1-account-create-update-zxdsx"] Feb 19 23:18:42 crc kubenswrapper[4771]: I0219 23:18:42.050429 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-69c1-account-create-update-zxdsx"] Feb 19 23:18:42 crc kubenswrapper[4771]: I0219 23:18:42.457827 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0fd8c186-0f03-411e-ae54-85ad1e70b87b" path="/var/lib/kubelet/pods/0fd8c186-0f03-411e-ae54-85ad1e70b87b/volumes" Feb 19 23:18:42 crc kubenswrapper[4771]: I0219 23:18:42.459530 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a63473d-4647-4d2d-bdd3-cab040945fa0" path="/var/lib/kubelet/pods/9a63473d-4647-4d2d-bdd3-cab040945fa0/volumes" Feb 19 23:18:44 crc kubenswrapper[4771]: I0219 23:18:44.562295 4771 scope.go:117] "RemoveContainer" containerID="78d2c2dffe1289bb8cf51c4b139304cc0ad91d4703a84dd185bc79e61f7451b7" Feb 19 23:18:44 crc kubenswrapper[4771]: I0219 23:18:44.613305 4771 scope.go:117] "RemoveContainer" containerID="5762d750b90d48a30a4ebf135b72e2a9a608059df1041b0bcc28566cdf3efd5a" Feb 19 23:18:44 crc kubenswrapper[4771]: I0219 23:18:44.684099 4771 scope.go:117] "RemoveContainer" containerID="ede17bb660b830051f01cb0f083b460b8e0931d3bc0c12d80b09579d351f11dc" Feb 19 23:18:44 crc kubenswrapper[4771]: I0219 23:18:44.737787 4771 scope.go:117] "RemoveContainer" containerID="7a110e52caf3d8ab491cb6f9085ce7bdf86c6ca494e48c82b2a9e741ca866d36" Feb 19 23:18:44 crc kubenswrapper[4771]: I0219 23:18:44.789761 4771 scope.go:117] "RemoveContainer" containerID="b0109a6144bca47fe54906e9a4d587fd9d07ec0378da3b7348d390ed9909d985" Feb 19 23:19:30 crc kubenswrapper[4771]: I0219 23:19:30.069589 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-z862b"] Feb 19 23:19:30 crc kubenswrapper[4771]: I0219 23:19:30.088053 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-z862b"] Feb 19 23:19:30 crc kubenswrapper[4771]: I0219 23:19:30.451616 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb667377-e02a-4f75-824e-8ad5f3a5b5a6" path="/var/lib/kubelet/pods/cb667377-e02a-4f75-824e-8ad5f3a5b5a6/volumes" Feb 19 23:19:42 crc kubenswrapper[4771]: I0219 23:19:42.956807 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:19:42 crc kubenswrapper[4771]: I0219 23:19:42.957290 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:19:44 crc kubenswrapper[4771]: I0219 23:19:44.945226 4771 scope.go:117] "RemoveContainer" containerID="3043db322a09d2d9a58d83ed5be81dfe810ae73cd9a2726f5d9f5b3c38c41962" Feb 19 23:19:44 crc kubenswrapper[4771]: I0219 23:19:44.982730 4771 scope.go:117] "RemoveContainer" containerID="2029aebb18e7e0e6b0cc6bc15121892b709ed89f6c824173dd9d5d5ca6d83e48" Feb 19 23:20:12 crc kubenswrapper[4771]: I0219 23:20:12.956470 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:20:12 crc kubenswrapper[4771]: I0219 23:20:12.957147 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:20:42 crc kubenswrapper[4771]: I0219 23:20:42.956680 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 19 23:20:42 crc kubenswrapper[4771]: I0219 23:20:42.957346 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:20:42 crc kubenswrapper[4771]: I0219 23:20:42.957400 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 23:20:42 crc kubenswrapper[4771]: I0219 23:20:42.958320 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:20:42 crc kubenswrapper[4771]: I0219 23:20:42.958614 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" gracePeriod=600 Feb 19 23:20:43 crc kubenswrapper[4771]: E0219 23:20:43.103123 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:20:43 crc 
kubenswrapper[4771]: I0219 23:20:43.651298 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" exitCode=0 Feb 19 23:20:43 crc kubenswrapper[4771]: I0219 23:20:43.651365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb"} Feb 19 23:20:43 crc kubenswrapper[4771]: I0219 23:20:43.651404 4771 scope.go:117] "RemoveContainer" containerID="686e92da4ca4cf49dcd362c98b7bf5eea623f83136fe222a54561367030f55e3" Feb 19 23:20:43 crc kubenswrapper[4771]: I0219 23:20:43.652511 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:20:43 crc kubenswrapper[4771]: E0219 23:20:43.653181 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:20:55 crc kubenswrapper[4771]: I0219 23:20:55.439255 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:20:55 crc kubenswrapper[4771]: E0219 23:20:55.440979 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:21:07 crc kubenswrapper[4771]: I0219 23:21:07.437915 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:21:07 crc kubenswrapper[4771]: E0219 23:21:07.438902 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:21:18 crc kubenswrapper[4771]: I0219 23:21:18.438438 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:21:18 crc kubenswrapper[4771]: E0219 23:21:18.439658 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:21:31 crc kubenswrapper[4771]: I0219 23:21:31.437889 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:21:31 crc kubenswrapper[4771]: E0219 23:21:31.439054 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:21:45 crc kubenswrapper[4771]: I0219 23:21:45.118347 4771 scope.go:117] "RemoveContainer" containerID="5314fe2ac7b7e5c6755edac5750a9be4a9ba30f60a7403e8297509575944fcfd" Feb 19 23:21:45 crc kubenswrapper[4771]: I0219 23:21:45.157241 4771 scope.go:117] "RemoveContainer" containerID="cb039d0aceec8b1cf017a49f3bcf443e44f7763f065f301d60b9e297ce4b458f" Feb 19 23:21:45 crc kubenswrapper[4771]: I0219 23:21:45.190204 4771 scope.go:117] "RemoveContainer" containerID="b01bfdecca3504bc78a00eee98e77bbc7c1b09243a11bd62aff5e7b6d1d3de86" Feb 19 23:21:45 crc kubenswrapper[4771]: I0219 23:21:45.223201 4771 scope.go:117] "RemoveContainer" containerID="aa5a294a7b6c81c982e9302c1dc32bb3f1443233650257d9361277130abd8210" Feb 19 23:21:46 crc kubenswrapper[4771]: I0219 23:21:46.437311 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:21:46 crc kubenswrapper[4771]: E0219 23:21:46.437826 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:21:57 crc kubenswrapper[4771]: I0219 23:21:57.437513 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:21:57 crc kubenswrapper[4771]: E0219 23:21:57.438407 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.376213 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9z5dx"] Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.379782 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.510092 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9z5dx"] Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.522410 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-utilities\") pod \"community-operators-9z5dx\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.522704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh4wc\" (UniqueName: \"kubernetes.io/projected/e8bff659-6a38-4161-a533-349e6e2c7383-kube-api-access-nh4wc\") pod \"community-operators-9z5dx\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.522815 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-catalog-content\") pod \"community-operators-9z5dx\" (UID: 
\"e8bff659-6a38-4161-a533-349e6e2c7383\") " pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.624971 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-catalog-content\") pod \"community-operators-9z5dx\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.625212 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-utilities\") pod \"community-operators-9z5dx\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.625353 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh4wc\" (UniqueName: \"kubernetes.io/projected/e8bff659-6a38-4161-a533-349e6e2c7383-kube-api-access-nh4wc\") pod \"community-operators-9z5dx\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.625719 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-utilities\") pod \"community-operators-9z5dx\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.625987 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-catalog-content\") pod \"community-operators-9z5dx\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") 
" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.654793 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh4wc\" (UniqueName: \"kubernetes.io/projected/e8bff659-6a38-4161-a533-349e6e2c7383-kube-api-access-nh4wc\") pod \"community-operators-9z5dx\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:00 crc kubenswrapper[4771]: I0219 23:22:00.796784 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.057373 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-hb27f"] Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.075102 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-5d53-account-create-update-kwgwp"] Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.083217 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-hb27f"] Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.099287 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-5d53-account-create-update-kwgwp"] Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.434661 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9z5dx"] Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.717164 4771 generic.go:334] "Generic (PLEG): container finished" podID="e8bff659-6a38-4161-a533-349e6e2c7383" containerID="e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002" exitCode=0 Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.717214 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z5dx" 
event={"ID":"e8bff659-6a38-4161-a533-349e6e2c7383","Type":"ContainerDied","Data":"e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002"} Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.717244 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z5dx" event={"ID":"e8bff659-6a38-4161-a533-349e6e2c7383","Type":"ContainerStarted","Data":"5bdd509fbc43e4e69f5658e26c96e1475e59defbc2413985c9155502a44567c6"} Feb 19 23:22:01 crc kubenswrapper[4771]: I0219 23:22:01.719558 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:22:02 crc kubenswrapper[4771]: I0219 23:22:02.459977 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de27f6b-5621-4e18-81a6-85dbfdd89711" path="/var/lib/kubelet/pods/2de27f6b-5621-4e18-81a6-85dbfdd89711/volumes" Feb 19 23:22:02 crc kubenswrapper[4771]: I0219 23:22:02.461858 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d75dbbd-a8bf-4250-a891-e8de80593ef2" path="/var/lib/kubelet/pods/7d75dbbd-a8bf-4250-a891-e8de80593ef2/volumes" Feb 19 23:22:03 crc kubenswrapper[4771]: I0219 23:22:03.741734 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z5dx" event={"ID":"e8bff659-6a38-4161-a533-349e6e2c7383","Type":"ContainerStarted","Data":"dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535"} Feb 19 23:22:04 crc kubenswrapper[4771]: I0219 23:22:04.757143 4771 generic.go:334] "Generic (PLEG): container finished" podID="e8bff659-6a38-4161-a533-349e6e2c7383" containerID="dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535" exitCode=0 Feb 19 23:22:04 crc kubenswrapper[4771]: I0219 23:22:04.757236 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z5dx" 
event={"ID":"e8bff659-6a38-4161-a533-349e6e2c7383","Type":"ContainerDied","Data":"dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535"} Feb 19 23:22:05 crc kubenswrapper[4771]: I0219 23:22:05.770867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z5dx" event={"ID":"e8bff659-6a38-4161-a533-349e6e2c7383","Type":"ContainerStarted","Data":"10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595"} Feb 19 23:22:05 crc kubenswrapper[4771]: I0219 23:22:05.808104 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9z5dx" podStartSLOduration=2.287730228 podStartE2EDuration="5.808077255s" podCreationTimestamp="2026-02-19 23:22:00 +0000 UTC" firstStartedPulling="2026-02-19 23:22:01.719308516 +0000 UTC m=+6821.990750986" lastFinishedPulling="2026-02-19 23:22:05.239655543 +0000 UTC m=+6825.511098013" observedRunningTime="2026-02-19 23:22:05.79771827 +0000 UTC m=+6826.069160790" watchObservedRunningTime="2026-02-19 23:22:05.808077255 +0000 UTC m=+6826.079519765" Feb 19 23:22:09 crc kubenswrapper[4771]: I0219 23:22:09.437595 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:22:09 crc kubenswrapper[4771]: E0219 23:22:09.438495 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:22:10 crc kubenswrapper[4771]: I0219 23:22:10.797818 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:10 crc 
kubenswrapper[4771]: I0219 23:22:10.798244 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:10 crc kubenswrapper[4771]: I0219 23:22:10.862128 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:10 crc kubenswrapper[4771]: I0219 23:22:10.938859 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:11 crc kubenswrapper[4771]: I0219 23:22:11.122097 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9z5dx"] Feb 19 23:22:12 crc kubenswrapper[4771]: I0219 23:22:12.861624 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9z5dx" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" containerName="registry-server" containerID="cri-o://10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595" gracePeriod=2 Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.535634 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.564646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-catalog-content\") pod \"e8bff659-6a38-4161-a533-349e6e2c7383\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.564754 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh4wc\" (UniqueName: \"kubernetes.io/projected/e8bff659-6a38-4161-a533-349e6e2c7383-kube-api-access-nh4wc\") pod \"e8bff659-6a38-4161-a533-349e6e2c7383\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.565157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-utilities\") pod \"e8bff659-6a38-4161-a533-349e6e2c7383\" (UID: \"e8bff659-6a38-4161-a533-349e6e2c7383\") " Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.566542 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-utilities" (OuterVolumeSpecName: "utilities") pod "e8bff659-6a38-4161-a533-349e6e2c7383" (UID: "e8bff659-6a38-4161-a533-349e6e2c7383"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.569720 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.573091 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8bff659-6a38-4161-a533-349e6e2c7383-kube-api-access-nh4wc" (OuterVolumeSpecName: "kube-api-access-nh4wc") pod "e8bff659-6a38-4161-a533-349e6e2c7383" (UID: "e8bff659-6a38-4161-a533-349e6e2c7383"). InnerVolumeSpecName "kube-api-access-nh4wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.612007 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8bff659-6a38-4161-a533-349e6e2c7383" (UID: "e8bff659-6a38-4161-a533-349e6e2c7383"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.671984 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8bff659-6a38-4161-a533-349e6e2c7383-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.672011 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh4wc\" (UniqueName: \"kubernetes.io/projected/e8bff659-6a38-4161-a533-349e6e2c7383-kube-api-access-nh4wc\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.882777 4771 generic.go:334] "Generic (PLEG): container finished" podID="e8bff659-6a38-4161-a533-349e6e2c7383" containerID="10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595" exitCode=0 Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.882919 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z5dx" event={"ID":"e8bff659-6a38-4161-a533-349e6e2c7383","Type":"ContainerDied","Data":"10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595"} Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.883385 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9z5dx" event={"ID":"e8bff659-6a38-4161-a533-349e6e2c7383","Type":"ContainerDied","Data":"5bdd509fbc43e4e69f5658e26c96e1475e59defbc2413985c9155502a44567c6"} Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.882970 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9z5dx" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.883477 4771 scope.go:117] "RemoveContainer" containerID="10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.926390 4771 scope.go:117] "RemoveContainer" containerID="dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535" Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.957412 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9z5dx"] Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.970416 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9z5dx"] Feb 19 23:22:13 crc kubenswrapper[4771]: I0219 23:22:13.986563 4771 scope.go:117] "RemoveContainer" containerID="e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002" Feb 19 23:22:14 crc kubenswrapper[4771]: I0219 23:22:14.049013 4771 scope.go:117] "RemoveContainer" containerID="10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595" Feb 19 23:22:14 crc kubenswrapper[4771]: E0219 23:22:14.049643 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595\": container with ID starting with 10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595 not found: ID does not exist" containerID="10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595" Feb 19 23:22:14 crc kubenswrapper[4771]: I0219 23:22:14.049693 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595"} err="failed to get container status \"10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595\": rpc error: code = NotFound desc = could not find 
container \"10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595\": container with ID starting with 10f776648d0513c6f04920ee1a9a1c2cd4646c1c188c3bf2490fe65286868595 not found: ID does not exist" Feb 19 23:22:14 crc kubenswrapper[4771]: I0219 23:22:14.049726 4771 scope.go:117] "RemoveContainer" containerID="dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535" Feb 19 23:22:14 crc kubenswrapper[4771]: E0219 23:22:14.050187 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535\": container with ID starting with dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535 not found: ID does not exist" containerID="dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535" Feb 19 23:22:14 crc kubenswrapper[4771]: I0219 23:22:14.050215 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535"} err="failed to get container status \"dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535\": rpc error: code = NotFound desc = could not find container \"dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535\": container with ID starting with dabaac0ebc28cf08f4c8411cefb2335906f0f73b0acb1baed23a3f8923280535 not found: ID does not exist" Feb 19 23:22:14 crc kubenswrapper[4771]: I0219 23:22:14.050237 4771 scope.go:117] "RemoveContainer" containerID="e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002" Feb 19 23:22:14 crc kubenswrapper[4771]: E0219 23:22:14.050686 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002\": container with ID starting with e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002 not found: ID does 
not exist" containerID="e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002" Feb 19 23:22:14 crc kubenswrapper[4771]: I0219 23:22:14.050727 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002"} err="failed to get container status \"e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002\": rpc error: code = NotFound desc = could not find container \"e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002\": container with ID starting with e892d68c083f768895aeafa2806ee68f85b4ab1ede0f07ddf8036f1c54c62002 not found: ID does not exist" Feb 19 23:22:14 crc kubenswrapper[4771]: I0219 23:22:14.464209 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" path="/var/lib/kubelet/pods/e8bff659-6a38-4161-a533-349e6e2c7383/volumes" Feb 19 23:22:16 crc kubenswrapper[4771]: I0219 23:22:16.046421 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-zshpp"] Feb 19 23:22:16 crc kubenswrapper[4771]: I0219 23:22:16.056269 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-zshpp"] Feb 19 23:22:16 crc kubenswrapper[4771]: I0219 23:22:16.451955 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7b1685-c41d-48b5-94d8-5b4b87420bf2" path="/var/lib/kubelet/pods/0a7b1685-c41d-48b5-94d8-5b4b87420bf2/volumes" Feb 19 23:22:22 crc kubenswrapper[4771]: I0219 23:22:22.437142 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:22:22 crc kubenswrapper[4771]: E0219 23:22:22.438226 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:22:37 crc kubenswrapper[4771]: I0219 23:22:37.438174 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:22:37 crc kubenswrapper[4771]: E0219 23:22:37.439233 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:22:45 crc kubenswrapper[4771]: I0219 23:22:45.302545 4771 scope.go:117] "RemoveContainer" containerID="13bf8aa751b83f481f26eb4700a682da97ae702a76baedff00b9214f321ef45b" Feb 19 23:22:45 crc kubenswrapper[4771]: I0219 23:22:45.342724 4771 scope.go:117] "RemoveContainer" containerID="859442df92b24a8dcf3f85864479d8775c41021804d905e2f731a7f22f9d2716" Feb 19 23:22:45 crc kubenswrapper[4771]: I0219 23:22:45.387502 4771 scope.go:117] "RemoveContainer" containerID="d8accff843352756205b6562f6a48140332f0f78f391ce21454b3ddf52ce4b4f" Feb 19 23:22:52 crc kubenswrapper[4771]: I0219 23:22:52.437622 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:22:52 crc kubenswrapper[4771]: E0219 23:22:52.438681 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:23:05 crc kubenswrapper[4771]: I0219 23:23:05.438562 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:23:05 crc kubenswrapper[4771]: E0219 23:23:05.439492 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:23:16 crc kubenswrapper[4771]: I0219 23:23:16.437182 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:23:16 crc kubenswrapper[4771]: E0219 23:23:16.437895 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:23:27 crc kubenswrapper[4771]: I0219 23:23:27.439258 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:23:27 crc kubenswrapper[4771]: E0219 23:23:27.440309 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:23:42 crc kubenswrapper[4771]: I0219 23:23:42.438148 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:23:42 crc kubenswrapper[4771]: E0219 23:23:42.439293 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:23:53 crc kubenswrapper[4771]: I0219 23:23:53.437963 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:23:53 crc kubenswrapper[4771]: E0219 23:23:53.438853 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:24:05 crc kubenswrapper[4771]: I0219 23:24:05.438161 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:24:05 crc kubenswrapper[4771]: E0219 23:24:05.439504 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:24:18 crc kubenswrapper[4771]: I0219 23:24:18.448746 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:24:18 crc kubenswrapper[4771]: E0219 23:24:18.450196 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:24:33 crc kubenswrapper[4771]: I0219 23:24:33.437574 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:24:33 crc kubenswrapper[4771]: E0219 23:24:33.438424 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:24:45 crc kubenswrapper[4771]: I0219 23:24:45.437980 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:24:45 crc kubenswrapper[4771]: E0219 23:24:45.438858 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.827484 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-txrch"] Feb 19 23:24:48 crc kubenswrapper[4771]: E0219 23:24:48.828867 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" containerName="extract-utilities" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.828885 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" containerName="extract-utilities" Feb 19 23:24:48 crc kubenswrapper[4771]: E0219 23:24:48.828921 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" containerName="extract-content" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.828929 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" containerName="extract-content" Feb 19 23:24:48 crc kubenswrapper[4771]: E0219 23:24:48.828935 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" containerName="registry-server" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.828942 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" containerName="registry-server" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.829363 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bff659-6a38-4161-a533-349e6e2c7383" containerName="registry-server" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.832184 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.861115 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-txrch"] Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.960853 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czj5c\" (UniqueName: \"kubernetes.io/projected/fc89cbd0-62fb-40c5-b870-84f349e8f500-kube-api-access-czj5c\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.960936 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-utilities\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:48 crc kubenswrapper[4771]: I0219 23:24:48.960983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-catalog-content\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.046322 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-z49xb"] Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.054462 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-dd81-account-create-update-wvzrn"] Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.063337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czj5c\" 
(UniqueName: \"kubernetes.io/projected/fc89cbd0-62fb-40c5-b870-84f349e8f500-kube-api-access-czj5c\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.063431 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-utilities\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.063504 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-catalog-content\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.063958 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-utilities\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.064141 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-catalog-content\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.067262 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-z49xb"] Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 
23:24:49.074565 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-dd81-account-create-update-wvzrn"] Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.094902 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czj5c\" (UniqueName: \"kubernetes.io/projected/fc89cbd0-62fb-40c5-b870-84f349e8f500-kube-api-access-czj5c\") pod \"certified-operators-txrch\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.164340 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.704515 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-txrch"] Feb 19 23:24:49 crc kubenswrapper[4771]: I0219 23:24:49.758366 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txrch" event={"ID":"fc89cbd0-62fb-40c5-b870-84f349e8f500","Type":"ContainerStarted","Data":"27a4b7128fe1df45117ae3a77aad24ac56d42937348098ae676e9b801fd8d686"} Feb 19 23:24:50 crc kubenswrapper[4771]: I0219 23:24:50.450907 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0abecce5-baef-4eca-b5de-b2152e68114a" path="/var/lib/kubelet/pods/0abecce5-baef-4eca-b5de-b2152e68114a/volumes" Feb 19 23:24:50 crc kubenswrapper[4771]: I0219 23:24:50.452675 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd353262-ccb6-422b-8f66-19091f7f34dd" path="/var/lib/kubelet/pods/bd353262-ccb6-422b-8f66-19091f7f34dd/volumes" Feb 19 23:24:50 crc kubenswrapper[4771]: I0219 23:24:50.772436 4771 generic.go:334] "Generic (PLEG): container finished" podID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerID="374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf" exitCode=0 Feb 19 23:24:50 crc 
kubenswrapper[4771]: I0219 23:24:50.772494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txrch" event={"ID":"fc89cbd0-62fb-40c5-b870-84f349e8f500","Type":"ContainerDied","Data":"374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf"} Feb 19 23:24:51 crc kubenswrapper[4771]: I0219 23:24:51.786429 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txrch" event={"ID":"fc89cbd0-62fb-40c5-b870-84f349e8f500","Type":"ContainerStarted","Data":"469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65"} Feb 19 23:24:53 crc kubenswrapper[4771]: I0219 23:24:53.850634 4771 generic.go:334] "Generic (PLEG): container finished" podID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerID="469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65" exitCode=0 Feb 19 23:24:53 crc kubenswrapper[4771]: I0219 23:24:53.850851 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txrch" event={"ID":"fc89cbd0-62fb-40c5-b870-84f349e8f500","Type":"ContainerDied","Data":"469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65"} Feb 19 23:24:54 crc kubenswrapper[4771]: I0219 23:24:54.865086 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txrch" event={"ID":"fc89cbd0-62fb-40c5-b870-84f349e8f500","Type":"ContainerStarted","Data":"1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72"} Feb 19 23:24:54 crc kubenswrapper[4771]: I0219 23:24:54.887508 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-txrch" podStartSLOduration=3.302940134 podStartE2EDuration="6.887486069s" podCreationTimestamp="2026-02-19 23:24:48 +0000 UTC" firstStartedPulling="2026-02-19 23:24:50.776445801 +0000 UTC m=+6991.047888281" lastFinishedPulling="2026-02-19 23:24:54.360991706 +0000 UTC m=+6994.632434216" 
observedRunningTime="2026-02-19 23:24:54.881335016 +0000 UTC m=+6995.152777516" watchObservedRunningTime="2026-02-19 23:24:54.887486069 +0000 UTC m=+6995.158928569" Feb 19 23:24:57 crc kubenswrapper[4771]: I0219 23:24:57.437415 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:24:57 crc kubenswrapper[4771]: E0219 23:24:57.437822 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.165304 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.165631 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.228975 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.655016 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s7xwl"] Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.658499 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.689119 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7xwl"] Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.729577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-catalog-content\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.729627 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-utilities\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.729982 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lsg5\" (UniqueName: \"kubernetes.io/projected/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-kube-api-access-9lsg5\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.832545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-catalog-content\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.832916 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-utilities\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.833153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lsg5\" (UniqueName: \"kubernetes.io/projected/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-kube-api-access-9lsg5\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.834306 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-catalog-content\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.834726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-utilities\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.855434 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lsg5\" (UniqueName: \"kubernetes.io/projected/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-kube-api-access-9lsg5\") pod \"redhat-marketplace-s7xwl\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.970734 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:24:59 crc kubenswrapper[4771]: I0219 23:24:59.984141 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:25:00 crc kubenswrapper[4771]: I0219 23:25:00.035898 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-bkmmk"] Feb 19 23:25:00 crc kubenswrapper[4771]: I0219 23:25:00.051984 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-bkmmk"] Feb 19 23:25:00 crc kubenswrapper[4771]: I0219 23:25:00.449576 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582bce55-de44-45ad-a382-bf64e80dffb9" path="/var/lib/kubelet/pods/582bce55-de44-45ad-a382-bf64e80dffb9/volumes" Feb 19 23:25:01 crc kubenswrapper[4771]: I0219 23:25:01.511831 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7xwl"] Feb 19 23:25:01 crc kubenswrapper[4771]: I0219 23:25:01.940542 4771 generic.go:334] "Generic (PLEG): container finished" podID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerID="414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705" exitCode=0 Feb 19 23:25:01 crc kubenswrapper[4771]: I0219 23:25:01.940669 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7xwl" event={"ID":"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d","Type":"ContainerDied","Data":"414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705"} Feb 19 23:25:01 crc kubenswrapper[4771]: I0219 23:25:01.940953 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7xwl" event={"ID":"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d","Type":"ContainerStarted","Data":"fe252c88e93a3ff47fec3ab0e5c296a25dd92478ec97c2dc947e9561fac87247"} Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.277418 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-txrch"] Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.277731 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-txrch" podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerName="registry-server" containerID="cri-o://1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72" gracePeriod=2 Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.882015 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.951710 4771 generic.go:334] "Generic (PLEG): container finished" podID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerID="1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72" exitCode=0 Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.951749 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txrch" Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.951788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txrch" event={"ID":"fc89cbd0-62fb-40c5-b870-84f349e8f500","Type":"ContainerDied","Data":"1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72"} Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.951857 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txrch" event={"ID":"fc89cbd0-62fb-40c5-b870-84f349e8f500","Type":"ContainerDied","Data":"27a4b7128fe1df45117ae3a77aad24ac56d42937348098ae676e9b801fd8d686"} Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.951883 4771 scope.go:117] "RemoveContainer" containerID="1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72" Feb 19 23:25:02 crc kubenswrapper[4771]: I0219 23:25:02.979761 4771 scope.go:117] "RemoveContainer" containerID="469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.001748 4771 scope.go:117] "RemoveContainer" containerID="374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.052651 4771 scope.go:117] "RemoveContainer" containerID="1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72" Feb 19 23:25:03 crc kubenswrapper[4771]: E0219 23:25:03.053517 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72\": container with ID starting with 1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72 not found: ID does not exist" containerID="1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.053558 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-utilities\") pod \"fc89cbd0-62fb-40c5-b870-84f349e8f500\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.053566 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72"} err="failed to get container status \"1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72\": rpc error: code = NotFound desc = could not find container \"1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72\": container with ID starting with 1da81d1014fef21c2455f73c153150c7e80f7128cb4689a5eecafdc2fb55cb72 not found: ID does not exist" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.053632 4771 scope.go:117] "RemoveContainer" containerID="469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.053773 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-catalog-content\") pod \"fc89cbd0-62fb-40c5-b870-84f349e8f500\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.053901 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czj5c\" (UniqueName: \"kubernetes.io/projected/fc89cbd0-62fb-40c5-b870-84f349e8f500-kube-api-access-czj5c\") pod \"fc89cbd0-62fb-40c5-b870-84f349e8f500\" (UID: \"fc89cbd0-62fb-40c5-b870-84f349e8f500\") " Feb 19 23:25:03 crc kubenswrapper[4771]: E0219 23:25:03.053956 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65\": container with ID starting with 469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65 not found: ID does not exist" containerID="469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.053988 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65"} err="failed to get container status \"469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65\": rpc error: code = NotFound desc = could not find container \"469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65\": container with ID starting with 469f494648d8412e1f7cf3c9c9ddcb79b200f6fdf9cfb14b78d2c9df34bc3c65 not found: ID does not exist" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.054011 4771 scope.go:117] "RemoveContainer" containerID="374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf" Feb 19 23:25:03 crc kubenswrapper[4771]: E0219 23:25:03.054409 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf\": container with ID starting with 374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf not found: ID does not exist" containerID="374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.054451 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf"} err="failed to get container status \"374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf\": rpc error: code = NotFound desc = could not find container \"374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf\": container with ID 
starting with 374e444deabd35733d5969646ed1fc676fd3daca5e7c567d90c47f3a0920f8cf not found: ID does not exist" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.054594 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-utilities" (OuterVolumeSpecName: "utilities") pod "fc89cbd0-62fb-40c5-b870-84f349e8f500" (UID: "fc89cbd0-62fb-40c5-b870-84f349e8f500"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.073496 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc89cbd0-62fb-40c5-b870-84f349e8f500-kube-api-access-czj5c" (OuterVolumeSpecName: "kube-api-access-czj5c") pod "fc89cbd0-62fb-40c5-b870-84f349e8f500" (UID: "fc89cbd0-62fb-40c5-b870-84f349e8f500"). InnerVolumeSpecName "kube-api-access-czj5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.107930 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc89cbd0-62fb-40c5-b870-84f349e8f500" (UID: "fc89cbd0-62fb-40c5-b870-84f349e8f500"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.157397 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.157695 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89cbd0-62fb-40c5-b870-84f349e8f500-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.157822 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czj5c\" (UniqueName: \"kubernetes.io/projected/fc89cbd0-62fb-40c5-b870-84f349e8f500-kube-api-access-czj5c\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.290322 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-txrch"] Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.303792 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-txrch"] Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.963082 4771 generic.go:334] "Generic (PLEG): container finished" podID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerID="4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd" exitCode=0 Feb 19 23:25:03 crc kubenswrapper[4771]: I0219 23:25:03.963129 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7xwl" event={"ID":"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d","Type":"ContainerDied","Data":"4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd"} Feb 19 23:25:04 crc kubenswrapper[4771]: I0219 23:25:04.455099 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" 
path="/var/lib/kubelet/pods/fc89cbd0-62fb-40c5-b870-84f349e8f500/volumes" Feb 19 23:25:04 crc kubenswrapper[4771]: I0219 23:25:04.976052 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7xwl" event={"ID":"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d","Type":"ContainerStarted","Data":"6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3"} Feb 19 23:25:04 crc kubenswrapper[4771]: I0219 23:25:04.999662 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s7xwl" podStartSLOduration=3.612713997 podStartE2EDuration="5.99962753s" podCreationTimestamp="2026-02-19 23:24:59 +0000 UTC" firstStartedPulling="2026-02-19 23:25:01.944825603 +0000 UTC m=+7002.216268113" lastFinishedPulling="2026-02-19 23:25:04.331739166 +0000 UTC m=+7004.603181646" observedRunningTime="2026-02-19 23:25:04.993923658 +0000 UTC m=+7005.265366148" watchObservedRunningTime="2026-02-19 23:25:04.99962753 +0000 UTC m=+7005.271070010" Feb 19 23:25:09 crc kubenswrapper[4771]: I0219 23:25:09.984503 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:25:09 crc kubenswrapper[4771]: I0219 23:25:09.984799 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:25:10 crc kubenswrapper[4771]: I0219 23:25:10.050617 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:25:10 crc kubenswrapper[4771]: I0219 23:25:10.109867 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:25:10 crc kubenswrapper[4771]: I0219 23:25:10.302137 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7xwl"] Feb 19 23:25:10 crc kubenswrapper[4771]: 
I0219 23:25:10.452168 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:25:10 crc kubenswrapper[4771]: E0219 23:25:10.452665 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.039535 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s7xwl" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerName="registry-server" containerID="cri-o://6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3" gracePeriod=2 Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.633742 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.803339 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-utilities\") pod \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.803646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lsg5\" (UniqueName: \"kubernetes.io/projected/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-kube-api-access-9lsg5\") pod \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.803727 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-catalog-content\") pod \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\" (UID: \"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d\") " Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.805152 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-utilities" (OuterVolumeSpecName: "utilities") pod "3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" (UID: "3e19ec0b-4f5a-4ead-a5a7-223c17fc635d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.810402 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-kube-api-access-9lsg5" (OuterVolumeSpecName: "kube-api-access-9lsg5") pod "3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" (UID: "3e19ec0b-4f5a-4ead-a5a7-223c17fc635d"). InnerVolumeSpecName "kube-api-access-9lsg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.833648 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" (UID: "3e19ec0b-4f5a-4ead-a5a7-223c17fc635d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.906991 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.907322 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:12 crc kubenswrapper[4771]: I0219 23:25:12.907407 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lsg5\" (UniqueName: \"kubernetes.io/projected/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d-kube-api-access-9lsg5\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.053943 4771 generic.go:334] "Generic (PLEG): container finished" podID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerID="6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3" exitCode=0 Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.054064 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7xwl" event={"ID":"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d","Type":"ContainerDied","Data":"6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3"} Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.054112 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s7xwl" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.055552 4771 scope.go:117] "RemoveContainer" containerID="6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.055460 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s7xwl" event={"ID":"3e19ec0b-4f5a-4ead-a5a7-223c17fc635d","Type":"ContainerDied","Data":"fe252c88e93a3ff47fec3ab0e5c296a25dd92478ec97c2dc947e9561fac87247"} Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.097680 4771 scope.go:117] "RemoveContainer" containerID="4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.103244 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7xwl"] Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.127208 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s7xwl"] Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.147918 4771 scope.go:117] "RemoveContainer" containerID="414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.192147 4771 scope.go:117] "RemoveContainer" containerID="6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3" Feb 19 23:25:13 crc kubenswrapper[4771]: E0219 23:25:13.192890 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3\": container with ID starting with 6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3 not found: ID does not exist" containerID="6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.192958 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3"} err="failed to get container status \"6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3\": rpc error: code = NotFound desc = could not find container \"6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3\": container with ID starting with 6767bc0ca94c9b3e04ba606ba24491d17938c4073bc78bb2a07f3c4f5e4bbea3 not found: ID does not exist" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.192997 4771 scope.go:117] "RemoveContainer" containerID="4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd" Feb 19 23:25:13 crc kubenswrapper[4771]: E0219 23:25:13.193446 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd\": container with ID starting with 4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd not found: ID does not exist" containerID="4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.193526 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd"} err="failed to get container status \"4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd\": rpc error: code = NotFound desc = could not find container \"4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd\": container with ID starting with 4263deb70a4cfaedc43a93b28fa31b2060ba87888ef01866a7523c428b8d02dd not found: ID does not exist" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.193558 4771 scope.go:117] "RemoveContainer" containerID="414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705" Feb 19 23:25:13 crc kubenswrapper[4771]: E0219 
23:25:13.194343 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705\": container with ID starting with 414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705 not found: ID does not exist" containerID="414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705" Feb 19 23:25:13 crc kubenswrapper[4771]: I0219 23:25:13.194391 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705"} err="failed to get container status \"414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705\": rpc error: code = NotFound desc = could not find container \"414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705\": container with ID starting with 414fd05ea03a23165d946e2ae71fcfd78237d70269f3022e4aef38fdb8ab2705 not found: ID does not exist" Feb 19 23:25:14 crc kubenswrapper[4771]: I0219 23:25:14.455871 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" path="/var/lib/kubelet/pods/3e19ec0b-4f5a-4ead-a5a7-223c17fc635d/volumes" Feb 19 23:25:22 crc kubenswrapper[4771]: I0219 23:25:22.437457 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:25:22 crc kubenswrapper[4771]: E0219 23:25:22.438234 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:25:35 crc kubenswrapper[4771]: I0219 23:25:35.438354 
4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:25:35 crc kubenswrapper[4771]: E0219 23:25:35.439695 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:25:45 crc kubenswrapper[4771]: I0219 23:25:45.561800 4771 scope.go:117] "RemoveContainer" containerID="134a0efcc84cbb7d21b6c4ca38cc3d96ff8995dfe7fad26c49354611ebaac7c5" Feb 19 23:25:45 crc kubenswrapper[4771]: I0219 23:25:45.601542 4771 scope.go:117] "RemoveContainer" containerID="4b384c0ea0f5f9cb5bfbda905e1adedd54736efd09b3d97c0217bb263d79af9b" Feb 19 23:25:45 crc kubenswrapper[4771]: I0219 23:25:45.674977 4771 scope.go:117] "RemoveContainer" containerID="50aeb47c2615bfd1b17f343aae62951925270370c528d7d2b9e9ef4b2757564e" Feb 19 23:25:47 crc kubenswrapper[4771]: I0219 23:25:47.437652 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb" Feb 19 23:25:48 crc kubenswrapper[4771]: I0219 23:25:48.472336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"dd6ed8bfb60102bd792d2fddbecb1a88666d249dc5badf8af50e58d941abf30f"} Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.434206 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7knd9"] Feb 19 23:26:16 crc kubenswrapper[4771]: E0219 23:26:16.435569 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerName="extract-content" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.435589 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerName="extract-content" Feb 19 23:26:16 crc kubenswrapper[4771]: E0219 23:26:16.435615 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerName="extract-utilities" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.442375 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerName="extract-utilities" Feb 19 23:26:16 crc kubenswrapper[4771]: E0219 23:26:16.442443 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerName="registry-server" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.442460 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerName="registry-server" Feb 19 23:26:16 crc kubenswrapper[4771]: E0219 23:26:16.442513 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerName="extract-utilities" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.442531 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerName="extract-utilities" Feb 19 23:26:16 crc kubenswrapper[4771]: E0219 23:26:16.442574 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerName="extract-content" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.442586 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerName="extract-content" Feb 19 23:26:16 crc kubenswrapper[4771]: E0219 23:26:16.442653 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerName="registry-server" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.442672 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerName="registry-server" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.443342 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc89cbd0-62fb-40c5-b870-84f349e8f500" containerName="registry-server" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.443384 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e19ec0b-4f5a-4ead-a5a7-223c17fc635d" containerName="registry-server" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.447535 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.475372 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7knd9"] Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.582174 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-utilities\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.582242 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lpjd\" (UniqueName: \"kubernetes.io/projected/9391a9e6-30a3-4f6d-87f5-4e28003c7337-kube-api-access-2lpjd\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.582355 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-catalog-content\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.685252 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-utilities\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.685303 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lpjd\" (UniqueName: \"kubernetes.io/projected/9391a9e6-30a3-4f6d-87f5-4e28003c7337-kube-api-access-2lpjd\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.685336 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-catalog-content\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.685886 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-utilities\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.685914 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-catalog-content\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.714428 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lpjd\" (UniqueName: \"kubernetes.io/projected/9391a9e6-30a3-4f6d-87f5-4e28003c7337-kube-api-access-2lpjd\") pod \"redhat-operators-7knd9\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:16 crc kubenswrapper[4771]: I0219 23:26:16.783755 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:17 crc kubenswrapper[4771]: I0219 23:26:17.310514 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7knd9"] Feb 19 23:26:17 crc kubenswrapper[4771]: I0219 23:26:17.862784 4771 generic.go:334] "Generic (PLEG): container finished" podID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerID="f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519" exitCode=0 Feb 19 23:26:17 crc kubenswrapper[4771]: I0219 23:26:17.862829 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7knd9" event={"ID":"9391a9e6-30a3-4f6d-87f5-4e28003c7337","Type":"ContainerDied","Data":"f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519"} Feb 19 23:26:17 crc kubenswrapper[4771]: I0219 23:26:17.863049 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7knd9" event={"ID":"9391a9e6-30a3-4f6d-87f5-4e28003c7337","Type":"ContainerStarted","Data":"8ebbf820389550cef0ce3e292c378ecaf80c0abf87a89bf74519b89af8f2c529"} Feb 19 23:26:19 crc kubenswrapper[4771]: I0219 23:26:19.890779 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7knd9" event={"ID":"9391a9e6-30a3-4f6d-87f5-4e28003c7337","Type":"ContainerStarted","Data":"aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b"} Feb 19 23:26:24 crc kubenswrapper[4771]: I0219 23:26:24.954230 4771 generic.go:334] "Generic (PLEG): container finished" podID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerID="aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b" exitCode=0 Feb 19 23:26:24 crc kubenswrapper[4771]: I0219 23:26:24.954341 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7knd9" event={"ID":"9391a9e6-30a3-4f6d-87f5-4e28003c7337","Type":"ContainerDied","Data":"aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b"} Feb 19 23:26:25 crc kubenswrapper[4771]: I0219 23:26:25.973426 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7knd9" event={"ID":"9391a9e6-30a3-4f6d-87f5-4e28003c7337","Type":"ContainerStarted","Data":"80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b"} Feb 19 23:26:26 crc kubenswrapper[4771]: I0219 23:26:26.003774 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7knd9" podStartSLOduration=2.285650248 podStartE2EDuration="10.003745318s" podCreationTimestamp="2026-02-19 23:26:16 +0000 UTC" firstStartedPulling="2026-02-19 23:26:17.865852249 +0000 UTC m=+7078.137294719" lastFinishedPulling="2026-02-19 23:26:25.583947319 +0000 UTC m=+7085.855389789" observedRunningTime="2026-02-19 23:26:26.002237718 +0000 UTC m=+7086.273680278" watchObservedRunningTime="2026-02-19 23:26:26.003745318 +0000 UTC m=+7086.275187828" Feb 19 23:26:26 crc kubenswrapper[4771]: I0219 23:26:26.784123 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:26 crc kubenswrapper[4771]: I0219 23:26:26.785308 4771 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:27 crc kubenswrapper[4771]: I0219 23:26:27.857585 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7knd9" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="registry-server" probeResult="failure" output=< Feb 19 23:26:27 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:26:27 crc kubenswrapper[4771]: > Feb 19 23:26:37 crc kubenswrapper[4771]: I0219 23:26:37.864790 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7knd9" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="registry-server" probeResult="failure" output=< Feb 19 23:26:37 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:26:37 crc kubenswrapper[4771]: > Feb 19 23:26:46 crc kubenswrapper[4771]: I0219 23:26:46.848319 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:46 crc kubenswrapper[4771]: I0219 23:26:46.926966 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:47 crc kubenswrapper[4771]: I0219 23:26:47.637100 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7knd9"] Feb 19 23:26:48 crc kubenswrapper[4771]: I0219 23:26:48.256874 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7knd9" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="registry-server" containerID="cri-o://80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b" gracePeriod=2 Feb 19 23:26:48 crc kubenswrapper[4771]: I0219 23:26:48.817409 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.011621 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lpjd\" (UniqueName: \"kubernetes.io/projected/9391a9e6-30a3-4f6d-87f5-4e28003c7337-kube-api-access-2lpjd\") pod \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.011677 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-utilities\") pod \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.011805 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-catalog-content\") pod \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\" (UID: \"9391a9e6-30a3-4f6d-87f5-4e28003c7337\") " Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.013112 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-utilities" (OuterVolumeSpecName: "utilities") pod "9391a9e6-30a3-4f6d-87f5-4e28003c7337" (UID: "9391a9e6-30a3-4f6d-87f5-4e28003c7337"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.021310 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9391a9e6-30a3-4f6d-87f5-4e28003c7337-kube-api-access-2lpjd" (OuterVolumeSpecName: "kube-api-access-2lpjd") pod "9391a9e6-30a3-4f6d-87f5-4e28003c7337" (UID: "9391a9e6-30a3-4f6d-87f5-4e28003c7337"). InnerVolumeSpecName "kube-api-access-2lpjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.115970 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lpjd\" (UniqueName: \"kubernetes.io/projected/9391a9e6-30a3-4f6d-87f5-4e28003c7337-kube-api-access-2lpjd\") on node \"crc\" DevicePath \"\"" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.116043 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.157635 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9391a9e6-30a3-4f6d-87f5-4e28003c7337" (UID: "9391a9e6-30a3-4f6d-87f5-4e28003c7337"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.218545 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9391a9e6-30a3-4f6d-87f5-4e28003c7337-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.272087 4771 generic.go:334] "Generic (PLEG): container finished" podID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerID="80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b" exitCode=0 Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.272146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7knd9" event={"ID":"9391a9e6-30a3-4f6d-87f5-4e28003c7337","Type":"ContainerDied","Data":"80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b"} Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.272188 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-7knd9" event={"ID":"9391a9e6-30a3-4f6d-87f5-4e28003c7337","Type":"ContainerDied","Data":"8ebbf820389550cef0ce3e292c378ecaf80c0abf87a89bf74519b89af8f2c529"} Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.272215 4771 scope.go:117] "RemoveContainer" containerID="80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.272266 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7knd9" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.314083 4771 scope.go:117] "RemoveContainer" containerID="aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.328238 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7knd9"] Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.341052 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7knd9"] Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.350216 4771 scope.go:117] "RemoveContainer" containerID="f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.390410 4771 scope.go:117] "RemoveContainer" containerID="80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b" Feb 19 23:26:49 crc kubenswrapper[4771]: E0219 23:26:49.391368 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b\": container with ID starting with 80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b not found: ID does not exist" containerID="80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.391409 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b"} err="failed to get container status \"80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b\": rpc error: code = NotFound desc = could not find container \"80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b\": container with ID starting with 80f74267b51b20eef822d82396510c06cc3e1882af1574e81e226c73da89aa7b not found: ID does not exist" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.391436 4771 scope.go:117] "RemoveContainer" containerID="aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b" Feb 19 23:26:49 crc kubenswrapper[4771]: E0219 23:26:49.391827 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b\": container with ID starting with aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b not found: ID does not exist" containerID="aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.391852 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b"} err="failed to get container status \"aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b\": rpc error: code = NotFound desc = could not find container \"aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b\": container with ID starting with aa02da63f4b547528abf9deb3193ae3e648bdb4f7d0ed92ca1bbeac5d936365b not found: ID does not exist" Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.391869 4771 scope.go:117] "RemoveContainer" containerID="f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519" Feb 19 23:26:49 crc kubenswrapper[4771]: E0219 
23:26:49.392437 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519\": container with ID starting with f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519 not found: ID does not exist" containerID="f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519"
Feb 19 23:26:49 crc kubenswrapper[4771]: I0219 23:26:49.392464 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519"} err="failed to get container status \"f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519\": rpc error: code = NotFound desc = could not find container \"f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519\": container with ID starting with f47b9cd7a653be75d01085f63950b4587e590db1fdc973874fa9d1b5801c7519 not found: ID does not exist"
Feb 19 23:26:50 crc kubenswrapper[4771]: I0219 23:26:50.453294 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" path="/var/lib/kubelet/pods/9391a9e6-30a3-4f6d-87f5-4e28003c7337/volumes"
Feb 19 23:28:12 crc kubenswrapper[4771]: I0219 23:28:12.956803 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:28:12 crc kubenswrapper[4771]: I0219 23:28:12.957469 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:28:42 crc kubenswrapper[4771]: I0219 23:28:42.957191 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:28:42 crc kubenswrapper[4771]: I0219 23:28:42.957965 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:29:12 crc kubenswrapper[4771]: I0219 23:29:12.957481 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:29:12 crc kubenswrapper[4771]: I0219 23:29:12.958396 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:29:12 crc kubenswrapper[4771]: I0219 23:29:12.958502 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb"
Feb 19 23:29:12 crc kubenswrapper[4771]: I0219 23:29:12.960204 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd6ed8bfb60102bd792d2fddbecb1a88666d249dc5badf8af50e58d941abf30f"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 23:29:12 crc kubenswrapper[4771]: I0219 23:29:12.960336 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://dd6ed8bfb60102bd792d2fddbecb1a88666d249dc5badf8af50e58d941abf30f" gracePeriod=600
Feb 19 23:29:14 crc kubenswrapper[4771]: I0219 23:29:14.117712 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="dd6ed8bfb60102bd792d2fddbecb1a88666d249dc5badf8af50e58d941abf30f" exitCode=0
Feb 19 23:29:14 crc kubenswrapper[4771]: I0219 23:29:14.118294 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"dd6ed8bfb60102bd792d2fddbecb1a88666d249dc5badf8af50e58d941abf30f"}
Feb 19 23:29:14 crc kubenswrapper[4771]: I0219 23:29:14.118340 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2"}
Feb 19 23:29:14 crc kubenswrapper[4771]: I0219 23:29:14.118386 4771 scope.go:117] "RemoveContainer" containerID="9f8f25d6e5bc33b33675b803d38ff12210d0fff9afe1753208f9ad1e474825cb"
Feb 19 23:29:36 crc kubenswrapper[4771]: I0219 23:29:36.379850 4771 generic.go:334] "Generic (PLEG): container finished" podID="a1aa1c24-5f3f-4177-a132-a34da7c18d33" containerID="3871bfd4108f8ecd85fca5560989ca6ca8941ce3ed25403c87c3d1adac629e68" exitCode=0
Feb 19 23:29:36 crc kubenswrapper[4771]: I0219 23:29:36.379949 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" event={"ID":"a1aa1c24-5f3f-4177-a132-a34da7c18d33","Type":"ContainerDied","Data":"3871bfd4108f8ecd85fca5560989ca6ca8941ce3ed25403c87c3d1adac629e68"}
Feb 19 23:29:37 crc kubenswrapper[4771]: I0219 23:29:37.881373 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8"
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.011344 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtcmt\" (UniqueName: \"kubernetes.io/projected/a1aa1c24-5f3f-4177-a132-a34da7c18d33-kube-api-access-gtcmt\") pod \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") "
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.011456 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-tripleo-cleanup-combined-ca-bundle\") pod \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") "
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.011494 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-ssh-key-openstack-cell1\") pod \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") "
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.011546 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-inventory\") pod \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\" (UID: \"a1aa1c24-5f3f-4177-a132-a34da7c18d33\") "
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.027066 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "a1aa1c24-5f3f-4177-a132-a34da7c18d33" (UID: "a1aa1c24-5f3f-4177-a132-a34da7c18d33"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.030697 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1aa1c24-5f3f-4177-a132-a34da7c18d33-kube-api-access-gtcmt" (OuterVolumeSpecName: "kube-api-access-gtcmt") pod "a1aa1c24-5f3f-4177-a132-a34da7c18d33" (UID: "a1aa1c24-5f3f-4177-a132-a34da7c18d33"). InnerVolumeSpecName "kube-api-access-gtcmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.048839 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a1aa1c24-5f3f-4177-a132-a34da7c18d33" (UID: "a1aa1c24-5f3f-4177-a132-a34da7c18d33"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.072037 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-inventory" (OuterVolumeSpecName: "inventory") pod "a1aa1c24-5f3f-4177-a132-a34da7c18d33" (UID: "a1aa1c24-5f3f-4177-a132-a34da7c18d33"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.114768 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtcmt\" (UniqueName: \"kubernetes.io/projected/a1aa1c24-5f3f-4177-a132-a34da7c18d33-kube-api-access-gtcmt\") on node \"crc\" DevicePath \"\""
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.114821 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.114844 4771 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.114867 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1aa1c24-5f3f-4177-a132-a34da7c18d33-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.401551 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8" event={"ID":"a1aa1c24-5f3f-4177-a132-a34da7c18d33","Type":"ContainerDied","Data":"0b9536630487b5837b65f2008aea091f1ad32ea068a7bcab7b739d6933a3207a"}
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.401607 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b9536630487b5837b65f2008aea091f1ad32ea068a7bcab7b739d6933a3207a"
Feb 19 23:29:38 crc kubenswrapper[4771]: I0219 23:29:38.401621 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.474513 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-5hlk6"]
Feb 19 23:29:49 crc kubenswrapper[4771]: E0219 23:29:49.476456 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="registry-server"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.476573 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="registry-server"
Feb 19 23:29:49 crc kubenswrapper[4771]: E0219 23:29:49.476684 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1aa1c24-5f3f-4177-a132-a34da7c18d33" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.476768 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1aa1c24-5f3f-4177-a132-a34da7c18d33" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Feb 19 23:29:49 crc kubenswrapper[4771]: E0219 23:29:49.476869 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="extract-content"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.478921 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="extract-content"
Feb 19 23:29:49 crc kubenswrapper[4771]: E0219 23:29:49.479110 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="extract-utilities"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.479203 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="extract-utilities"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.479535 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9391a9e6-30a3-4f6d-87f5-4e28003c7337" containerName="registry-server"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.479639 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1aa1c24-5f3f-4177-a132-a34da7c18d33" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.480671 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.482586 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.484237 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.484971 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.485063 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.490452 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-5hlk6"]
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.601652 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlc2s\" (UniqueName: \"kubernetes.io/projected/e0006193-39c1-45bb-924e-26c8cb22638b-kube-api-access-tlc2s\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.601944 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.602373 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.602524 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-inventory\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.704479 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.704532 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-inventory\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.704594 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlc2s\" (UniqueName: \"kubernetes.io/projected/e0006193-39c1-45bb-924e-26c8cb22638b-kube-api-access-tlc2s\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.704617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.719898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.720563 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.722621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-inventory\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.737794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlc2s\" (UniqueName: \"kubernetes.io/projected/e0006193-39c1-45bb-924e-26c8cb22638b-kube-api-access-tlc2s\") pod \"bootstrap-openstack-openstack-cell1-5hlk6\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:49 crc kubenswrapper[4771]: I0219 23:29:49.806367 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6"
Feb 19 23:29:50 crc kubenswrapper[4771]: I0219 23:29:50.441229 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 23:29:50 crc kubenswrapper[4771]: I0219 23:29:50.484152 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-5hlk6"]
Feb 19 23:29:50 crc kubenswrapper[4771]: I0219 23:29:50.552342 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6" event={"ID":"e0006193-39c1-45bb-924e-26c8cb22638b","Type":"ContainerStarted","Data":"143afb899e91d3133294a80816e29ab76f891cadfc20a8b1c9973403951e52ed"}
Feb 19 23:29:51 crc kubenswrapper[4771]: I0219 23:29:51.561734 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6" event={"ID":"e0006193-39c1-45bb-924e-26c8cb22638b","Type":"ContainerStarted","Data":"181f34a9cdee3cf62c93bbda2a34136c4aaad42bb3b1b26eb05761684d53a5da"}
Feb 19 23:29:51 crc kubenswrapper[4771]: I0219 23:29:51.576880 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6" podStartSLOduration=2.122487893 podStartE2EDuration="2.576861083s" podCreationTimestamp="2026-02-19 23:29:49 +0000 UTC" firstStartedPulling="2026-02-19 23:29:50.440850292 +0000 UTC m=+7290.712292772" lastFinishedPulling="2026-02-19 23:29:50.895223452 +0000 UTC m=+7291.166665962" observedRunningTime="2026-02-19 23:29:51.576359489 +0000 UTC m=+7291.847801969" watchObservedRunningTime="2026-02-19 23:29:51.576861083 +0000 UTC m=+7291.848303553"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.178177 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"]
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.181707 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.184336 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.192005 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.192840 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"]
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.261908 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx86z\" (UniqueName: \"kubernetes.io/projected/dc7607e6-697e-42e5-999f-c3d8dd3e0171-kube-api-access-fx86z\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.262384 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc7607e6-697e-42e5-999f-c3d8dd3e0171-config-volume\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.262591 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc7607e6-697e-42e5-999f-c3d8dd3e0171-secret-volume\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.365762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx86z\" (UniqueName: \"kubernetes.io/projected/dc7607e6-697e-42e5-999f-c3d8dd3e0171-kube-api-access-fx86z\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.366781 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc7607e6-697e-42e5-999f-c3d8dd3e0171-config-volume\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.366871 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc7607e6-697e-42e5-999f-c3d8dd3e0171-secret-volume\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.368095 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc7607e6-697e-42e5-999f-c3d8dd3e0171-config-volume\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.376241 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc7607e6-697e-42e5-999f-c3d8dd3e0171-secret-volume\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.386811 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx86z\" (UniqueName: \"kubernetes.io/projected/dc7607e6-697e-42e5-999f-c3d8dd3e0171-kube-api-access-fx86z\") pod \"collect-profiles-29525730-wcxlz\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.519529 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:00 crc kubenswrapper[4771]: I0219 23:30:00.916548 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"]
Feb 19 23:30:01 crc kubenswrapper[4771]: I0219 23:30:01.681684 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc7607e6-697e-42e5-999f-c3d8dd3e0171" containerID="19f7de9e816c0656b12fd08e3b9409252ba9f7195934450c2213f0dd8ac72420" exitCode=0
Feb 19 23:30:01 crc kubenswrapper[4771]: I0219 23:30:01.681792 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz" event={"ID":"dc7607e6-697e-42e5-999f-c3d8dd3e0171","Type":"ContainerDied","Data":"19f7de9e816c0656b12fd08e3b9409252ba9f7195934450c2213f0dd8ac72420"}
Feb 19 23:30:01 crc kubenswrapper[4771]: I0219 23:30:01.681961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz" event={"ID":"dc7607e6-697e-42e5-999f-c3d8dd3e0171","Type":"ContainerStarted","Data":"f7230bb7fc95b6176c342d020ba661912a3bed1f5538a0482fa739ba89913202"}
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.104389 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.239792 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc7607e6-697e-42e5-999f-c3d8dd3e0171-secret-volume\") pod \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") "
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.240436 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc7607e6-697e-42e5-999f-c3d8dd3e0171-config-volume\") pod \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") "
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.240539 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx86z\" (UniqueName: \"kubernetes.io/projected/dc7607e6-697e-42e5-999f-c3d8dd3e0171-kube-api-access-fx86z\") pod \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\" (UID: \"dc7607e6-697e-42e5-999f-c3d8dd3e0171\") "
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.241214 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc7607e6-697e-42e5-999f-c3d8dd3e0171-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc7607e6-697e-42e5-999f-c3d8dd3e0171" (UID: "dc7607e6-697e-42e5-999f-c3d8dd3e0171"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.245371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7607e6-697e-42e5-999f-c3d8dd3e0171-kube-api-access-fx86z" (OuterVolumeSpecName: "kube-api-access-fx86z") pod "dc7607e6-697e-42e5-999f-c3d8dd3e0171" (UID: "dc7607e6-697e-42e5-999f-c3d8dd3e0171"). InnerVolumeSpecName "kube-api-access-fx86z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.246985 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7607e6-697e-42e5-999f-c3d8dd3e0171-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc7607e6-697e-42e5-999f-c3d8dd3e0171" (UID: "dc7607e6-697e-42e5-999f-c3d8dd3e0171"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.343488 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc7607e6-697e-42e5-999f-c3d8dd3e0171-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.343531 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx86z\" (UniqueName: \"kubernetes.io/projected/dc7607e6-697e-42e5-999f-c3d8dd3e0171-kube-api-access-fx86z\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.343545 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc7607e6-697e-42e5-999f-c3d8dd3e0171-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.711234 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz" event={"ID":"dc7607e6-697e-42e5-999f-c3d8dd3e0171","Type":"ContainerDied","Data":"f7230bb7fc95b6176c342d020ba661912a3bed1f5538a0482fa739ba89913202"}
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.711355 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"
Feb 19 23:30:03 crc kubenswrapper[4771]: I0219 23:30:03.711307 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7230bb7fc95b6176c342d020ba661912a3bed1f5538a0482fa739ba89913202"
Feb 19 23:30:04 crc kubenswrapper[4771]: I0219 23:30:04.221259 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd"]
Feb 19 23:30:04 crc kubenswrapper[4771]: I0219 23:30:04.230455 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-rt6sd"]
Feb 19 23:30:04 crc kubenswrapper[4771]: I0219 23:30:04.448081 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61d94fa-d9a8-45a9-9a72-6225b9a56324" path="/var/lib/kubelet/pods/e61d94fa-d9a8-45a9-9a72-6225b9a56324/volumes"
Feb 19 23:30:46 crc kubenswrapper[4771]: I0219 23:30:46.003246 4771 scope.go:117] "RemoveContainer" containerID="bda9220221576eac491a463fa6600a90a1595da25e8dab5fb874998da453224e"
Feb 19 23:31:42 crc kubenswrapper[4771]: I0219 23:31:42.956493 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:31:42 crc kubenswrapper[4771]: I0219 23:31:42.957345 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:32:12 crc kubenswrapper[4771]: I0219 23:32:12.957356 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:32:12 crc kubenswrapper[4771]: I0219 23:32:12.958197 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.291689 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f8d78"]
Feb 19 23:32:25 crc kubenswrapper[4771]: E0219 23:32:25.292664 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7607e6-697e-42e5-999f-c3d8dd3e0171" containerName="collect-profiles"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.292679 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7607e6-697e-42e5-999f-c3d8dd3e0171" containerName="collect-profiles"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.292938 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7607e6-697e-42e5-999f-c3d8dd3e0171" containerName="collect-profiles"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.296282 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.322225 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8d78"]
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.366543 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-catalog-content\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.366615 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzj95\" (UniqueName: \"kubernetes.io/projected/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-kube-api-access-qzj95\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.366651 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-utilities\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.468851 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-utilities\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.469068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-catalog-content\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.469101 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzj95\" (UniqueName: \"kubernetes.io/projected/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-kube-api-access-qzj95\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.469361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-utilities\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.469750 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-catalog-content\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.487838 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzj95\" (UniqueName: \"kubernetes.io/projected/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-kube-api-access-qzj95\") pod \"community-operators-f8d78\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:25 crc kubenswrapper[4771]: I0219 23:32:25.663219 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8d78"
Feb 19 23:32:26 crc kubenswrapper[4771]: I0219 23:32:26.200450 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8d78"]
Feb 19 23:32:26 crc kubenswrapper[4771]: W0219 23:32:26.211382 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf5522c_38ff_4a76_bae0_f5bf2de19d6d.slice/crio-999250f45ae0bfb6bc8226b8565e8d2a8cab6565c2e66dfa7769d9e178d233d8 WatchSource:0}: Error finding container 999250f45ae0bfb6bc8226b8565e8d2a8cab6565c2e66dfa7769d9e178d233d8: Status 404 returned error can't find the container with id 999250f45ae0bfb6bc8226b8565e8d2a8cab6565c2e66dfa7769d9e178d233d8
Feb 19 23:32:26 crc kubenswrapper[4771]: I0219 23:32:26.514894 4771 generic.go:334] "Generic (PLEG): container finished" podID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerID="babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d" exitCode=0
Feb 19 23:32:26 crc kubenswrapper[4771]: I0219 23:32:26.514995 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d78" event={"ID":"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d","Type":"ContainerDied","Data":"babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d"}
Feb 19 23:32:26 crc kubenswrapper[4771]: I0219 23:32:26.515254 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d78" event={"ID":"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d","Type":"ContainerStarted","Data":"999250f45ae0bfb6bc8226b8565e8d2a8cab6565c2e66dfa7769d9e178d233d8"}
Feb 19 23:32:27 crc kubenswrapper[4771]: I0219 23:32:27.528111 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d78"
event={"ID":"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d","Type":"ContainerStarted","Data":"83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9"} Feb 19 23:32:29 crc kubenswrapper[4771]: I0219 23:32:29.554922 4771 generic.go:334] "Generic (PLEG): container finished" podID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerID="83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9" exitCode=0 Feb 19 23:32:29 crc kubenswrapper[4771]: I0219 23:32:29.554995 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d78" event={"ID":"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d","Type":"ContainerDied","Data":"83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9"} Feb 19 23:32:30 crc kubenswrapper[4771]: I0219 23:32:30.568972 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d78" event={"ID":"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d","Type":"ContainerStarted","Data":"737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6"} Feb 19 23:32:30 crc kubenswrapper[4771]: I0219 23:32:30.606818 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f8d78" podStartSLOduration=2.159906953 podStartE2EDuration="5.606800681s" podCreationTimestamp="2026-02-19 23:32:25 +0000 UTC" firstStartedPulling="2026-02-19 23:32:26.517721664 +0000 UTC m=+7446.789164134" lastFinishedPulling="2026-02-19 23:32:29.964615362 +0000 UTC m=+7450.236057862" observedRunningTime="2026-02-19 23:32:30.594946544 +0000 UTC m=+7450.866389014" watchObservedRunningTime="2026-02-19 23:32:30.606800681 +0000 UTC m=+7450.878243151" Feb 19 23:32:35 crc kubenswrapper[4771]: I0219 23:32:35.664619 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f8d78" Feb 19 23:32:35 crc kubenswrapper[4771]: I0219 23:32:35.665348 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-f8d78" Feb 19 23:32:35 crc kubenswrapper[4771]: I0219 23:32:35.775834 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f8d78" Feb 19 23:32:36 crc kubenswrapper[4771]: I0219 23:32:36.738172 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f8d78" Feb 19 23:32:36 crc kubenswrapper[4771]: I0219 23:32:36.798069 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8d78"] Feb 19 23:32:38 crc kubenswrapper[4771]: I0219 23:32:38.673492 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f8d78" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerName="registry-server" containerID="cri-o://737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6" gracePeriod=2 Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.205501 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8d78" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.335103 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-catalog-content\") pod \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.335263 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzj95\" (UniqueName: \"kubernetes.io/projected/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-kube-api-access-qzj95\") pod \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.335580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-utilities\") pod \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\" (UID: \"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d\") " Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.336761 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-utilities" (OuterVolumeSpecName: "utilities") pod "ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" (UID: "ddf5522c-38ff-4a76-bae0-f5bf2de19d6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.345442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-kube-api-access-qzj95" (OuterVolumeSpecName: "kube-api-access-qzj95") pod "ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" (UID: "ddf5522c-38ff-4a76-bae0-f5bf2de19d6d"). InnerVolumeSpecName "kube-api-access-qzj95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.386666 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" (UID: "ddf5522c-38ff-4a76-bae0-f5bf2de19d6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.438595 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.438636 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzj95\" (UniqueName: \"kubernetes.io/projected/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-kube-api-access-qzj95\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.438651 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.688782 4771 generic.go:334] "Generic (PLEG): container finished" podID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerID="737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6" exitCode=0 Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.688870 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8d78" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.688852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d78" event={"ID":"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d","Type":"ContainerDied","Data":"737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6"} Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.689094 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8d78" event={"ID":"ddf5522c-38ff-4a76-bae0-f5bf2de19d6d","Type":"ContainerDied","Data":"999250f45ae0bfb6bc8226b8565e8d2a8cab6565c2e66dfa7769d9e178d233d8"} Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.689127 4771 scope.go:117] "RemoveContainer" containerID="737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.727746 4771 scope.go:117] "RemoveContainer" containerID="83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.746808 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8d78"] Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.757321 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f8d78"] Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.768318 4771 scope.go:117] "RemoveContainer" containerID="babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.829430 4771 scope.go:117] "RemoveContainer" containerID="737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6" Feb 19 23:32:39 crc kubenswrapper[4771]: E0219 23:32:39.830242 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6\": container with ID starting with 737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6 not found: ID does not exist" containerID="737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.830278 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6"} err="failed to get container status \"737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6\": rpc error: code = NotFound desc = could not find container \"737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6\": container with ID starting with 737518a9bff62960b7a9488fc553dfbc9cf06573ce8913a7e24553560eba20f6 not found: ID does not exist" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.830302 4771 scope.go:117] "RemoveContainer" containerID="83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9" Feb 19 23:32:39 crc kubenswrapper[4771]: E0219 23:32:39.831075 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9\": container with ID starting with 83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9 not found: ID does not exist" containerID="83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.831103 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9"} err="failed to get container status \"83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9\": rpc error: code = NotFound desc = could not find container \"83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9\": container with ID 
starting with 83bee10bbd22f6925489184a6433bfd0d47acc6bea8565e973d30b691a3558f9 not found: ID does not exist" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.831121 4771 scope.go:117] "RemoveContainer" containerID="babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d" Feb 19 23:32:39 crc kubenswrapper[4771]: E0219 23:32:39.831761 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d\": container with ID starting with babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d not found: ID does not exist" containerID="babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d" Feb 19 23:32:39 crc kubenswrapper[4771]: I0219 23:32:39.831826 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d"} err="failed to get container status \"babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d\": rpc error: code = NotFound desc = could not find container \"babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d\": container with ID starting with babbfabbe04d9eaad2bac0ee4258b56c358102c322ece99e81c0b0098251465d not found: ID does not exist" Feb 19 23:32:40 crc kubenswrapper[4771]: I0219 23:32:40.452307 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" path="/var/lib/kubelet/pods/ddf5522c-38ff-4a76-bae0-f5bf2de19d6d/volumes" Feb 19 23:32:42 crc kubenswrapper[4771]: I0219 23:32:42.956352 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:32:42 crc kubenswrapper[4771]: I0219 
23:32:42.956877 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:32:42 crc kubenswrapper[4771]: I0219 23:32:42.956927 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 23:32:42 crc kubenswrapper[4771]: I0219 23:32:42.957876 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:32:42 crc kubenswrapper[4771]: I0219 23:32:42.957948 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" gracePeriod=600 Feb 19 23:32:43 crc kubenswrapper[4771]: E0219 23:32:43.090948 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:32:43 crc kubenswrapper[4771]: E0219 23:32:43.128701 4771 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb19bfc5_46dd_47ae_9608_aafec9e35f9e.slice/crio-conmon-8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb19bfc5_46dd_47ae_9608_aafec9e35f9e.slice/crio-8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2.scope\": RecentStats: unable to find data in memory cache]" Feb 19 23:32:43 crc kubenswrapper[4771]: I0219 23:32:43.751463 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" exitCode=0 Feb 19 23:32:43 crc kubenswrapper[4771]: I0219 23:32:43.751764 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2"} Feb 19 23:32:43 crc kubenswrapper[4771]: I0219 23:32:43.751801 4771 scope.go:117] "RemoveContainer" containerID="dd6ed8bfb60102bd792d2fddbecb1a88666d249dc5badf8af50e58d941abf30f" Feb 19 23:32:43 crc kubenswrapper[4771]: I0219 23:32:43.752602 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:32:43 crc kubenswrapper[4771]: E0219 23:32:43.752907 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:32:55 crc 
kubenswrapper[4771]: I0219 23:32:55.438144 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:32:55 crc kubenswrapper[4771]: E0219 23:32:55.439605 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:32:55 crc kubenswrapper[4771]: I0219 23:32:55.894184 4771 generic.go:334] "Generic (PLEG): container finished" podID="e0006193-39c1-45bb-924e-26c8cb22638b" containerID="181f34a9cdee3cf62c93bbda2a34136c4aaad42bb3b1b26eb05761684d53a5da" exitCode=0 Feb 19 23:32:55 crc kubenswrapper[4771]: I0219 23:32:55.894234 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6" event={"ID":"e0006193-39c1-45bb-924e-26c8cb22638b","Type":"ContainerDied","Data":"181f34a9cdee3cf62c93bbda2a34136c4aaad42bb3b1b26eb05761684d53a5da"} Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.375046 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.565826 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-ssh-key-openstack-cell1\") pod \"e0006193-39c1-45bb-924e-26c8cb22638b\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.565999 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-inventory\") pod \"e0006193-39c1-45bb-924e-26c8cb22638b\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.566082 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-bootstrap-combined-ca-bundle\") pod \"e0006193-39c1-45bb-924e-26c8cb22638b\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.566146 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlc2s\" (UniqueName: \"kubernetes.io/projected/e0006193-39c1-45bb-924e-26c8cb22638b-kube-api-access-tlc2s\") pod \"e0006193-39c1-45bb-924e-26c8cb22638b\" (UID: \"e0006193-39c1-45bb-924e-26c8cb22638b\") " Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.576665 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0006193-39c1-45bb-924e-26c8cb22638b-kube-api-access-tlc2s" (OuterVolumeSpecName: "kube-api-access-tlc2s") pod "e0006193-39c1-45bb-924e-26c8cb22638b" (UID: "e0006193-39c1-45bb-924e-26c8cb22638b"). InnerVolumeSpecName "kube-api-access-tlc2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.579032 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e0006193-39c1-45bb-924e-26c8cb22638b" (UID: "e0006193-39c1-45bb-924e-26c8cb22638b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.609049 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e0006193-39c1-45bb-924e-26c8cb22638b" (UID: "e0006193-39c1-45bb-924e-26c8cb22638b"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.641759 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-inventory" (OuterVolumeSpecName: "inventory") pod "e0006193-39c1-45bb-924e-26c8cb22638b" (UID: "e0006193-39c1-45bb-924e-26c8cb22638b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.669331 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.669381 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.669398 4771 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0006193-39c1-45bb-924e-26c8cb22638b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.669417 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlc2s\" (UniqueName: \"kubernetes.io/projected/e0006193-39c1-45bb-924e-26c8cb22638b-kube-api-access-tlc2s\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.922072 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6" event={"ID":"e0006193-39c1-45bb-924e-26c8cb22638b","Type":"ContainerDied","Data":"143afb899e91d3133294a80816e29ab76f891cadfc20a8b1c9973403951e52ed"} Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.922115 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="143afb899e91d3133294a80816e29ab76f891cadfc20a8b1c9973403951e52ed" Feb 19 23:32:57 crc kubenswrapper[4771]: I0219 23:32:57.922555 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-5hlk6" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.044763 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-95ghw"] Feb 19 23:32:58 crc kubenswrapper[4771]: E0219 23:32:58.045363 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0006193-39c1-45bb-924e-26c8cb22638b" containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.045387 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0006193-39c1-45bb-924e-26c8cb22638b" containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:32:58 crc kubenswrapper[4771]: E0219 23:32:58.045412 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerName="extract-utilities" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.045422 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerName="extract-utilities" Feb 19 23:32:58 crc kubenswrapper[4771]: E0219 23:32:58.045441 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerName="registry-server" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.045450 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerName="registry-server" Feb 19 23:32:58 crc kubenswrapper[4771]: E0219 23:32:58.045480 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerName="extract-content" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.045488 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerName="extract-content" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.045713 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e0006193-39c1-45bb-924e-26c8cb22638b" containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.045738 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf5522c-38ff-4a76-bae0-f5bf2de19d6d" containerName="registry-server" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.046613 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.049754 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.049827 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.049783 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.050275 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.061938 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-95ghw"] Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.180509 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.180622 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-inventory\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.180681 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfxm\" (UniqueName: \"kubernetes.io/projected/60280b37-1d56-417c-90db-5d8985050b08-kube-api-access-bzfxm\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.283920 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-inventory\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.284059 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfxm\" (UniqueName: \"kubernetes.io/projected/60280b37-1d56-417c-90db-5d8985050b08-kube-api-access-bzfxm\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.284322 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: 
\"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.297064 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-inventory\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.303689 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.304767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfxm\" (UniqueName: \"kubernetes.io/projected/60280b37-1d56-417c-90db-5d8985050b08-kube-api-access-bzfxm\") pod \"download-cache-openstack-openstack-cell1-95ghw\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:58 crc kubenswrapper[4771]: I0219 23:32:58.405797 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:32:59 crc kubenswrapper[4771]: I0219 23:32:58.998218 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-95ghw"] Feb 19 23:32:59 crc kubenswrapper[4771]: I0219 23:32:59.974905 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" event={"ID":"60280b37-1d56-417c-90db-5d8985050b08","Type":"ContainerStarted","Data":"f117584c312e96a4d07814f8655e23141ff456503b237da95d4b1ab640b424ff"} Feb 19 23:33:00 crc kubenswrapper[4771]: I0219 23:33:00.992433 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" event={"ID":"60280b37-1d56-417c-90db-5d8985050b08","Type":"ContainerStarted","Data":"a3452513333bc15a376728470594b0f7fc73c9e1107f656c4029da95328565ee"} Feb 19 23:33:01 crc kubenswrapper[4771]: I0219 23:33:01.027343 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" podStartSLOduration=2.42635213 podStartE2EDuration="3.027317619s" podCreationTimestamp="2026-02-19 23:32:58 +0000 UTC" firstStartedPulling="2026-02-19 23:32:59.007600579 +0000 UTC m=+7479.279043069" lastFinishedPulling="2026-02-19 23:32:59.608566048 +0000 UTC m=+7479.880008558" observedRunningTime="2026-02-19 23:33:01.017045795 +0000 UTC m=+7481.288488305" watchObservedRunningTime="2026-02-19 23:33:01.027317619 +0000 UTC m=+7481.298760119" Feb 19 23:33:07 crc kubenswrapper[4771]: I0219 23:33:07.437388 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:33:07 crc kubenswrapper[4771]: E0219 23:33:07.438279 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:33:19 crc kubenswrapper[4771]: I0219 23:33:19.437197 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:33:19 crc kubenswrapper[4771]: E0219 23:33:19.438062 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:33:31 crc kubenswrapper[4771]: I0219 23:33:31.438259 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:33:31 crc kubenswrapper[4771]: E0219 23:33:31.439314 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:33:44 crc kubenswrapper[4771]: I0219 23:33:44.441347 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:33:44 crc kubenswrapper[4771]: E0219 23:33:44.443061 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:33:59 crc kubenswrapper[4771]: I0219 23:33:59.438600 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:33:59 crc kubenswrapper[4771]: E0219 23:33:59.439643 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:34:12 crc kubenswrapper[4771]: I0219 23:34:12.437806 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:34:12 crc kubenswrapper[4771]: E0219 23:34:12.439533 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:34:27 crc kubenswrapper[4771]: I0219 23:34:27.438384 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:34:27 crc kubenswrapper[4771]: E0219 23:34:27.439470 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:34:40 crc kubenswrapper[4771]: I0219 23:34:40.453447 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:34:40 crc kubenswrapper[4771]: E0219 23:34:40.454765 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:34:51 crc kubenswrapper[4771]: I0219 23:34:51.327243 4771 generic.go:334] "Generic (PLEG): container finished" podID="60280b37-1d56-417c-90db-5d8985050b08" containerID="a3452513333bc15a376728470594b0f7fc73c9e1107f656c4029da95328565ee" exitCode=0 Feb 19 23:34:51 crc kubenswrapper[4771]: I0219 23:34:51.327357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" event={"ID":"60280b37-1d56-417c-90db-5d8985050b08","Type":"ContainerDied","Data":"a3452513333bc15a376728470594b0f7fc73c9e1107f656c4029da95328565ee"} Feb 19 23:34:52 crc kubenswrapper[4771]: I0219 23:34:52.915594 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.071578 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-inventory\") pod \"60280b37-1d56-417c-90db-5d8985050b08\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.071727 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-ssh-key-openstack-cell1\") pod \"60280b37-1d56-417c-90db-5d8985050b08\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.071874 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfxm\" (UniqueName: \"kubernetes.io/projected/60280b37-1d56-417c-90db-5d8985050b08-kube-api-access-bzfxm\") pod \"60280b37-1d56-417c-90db-5d8985050b08\" (UID: \"60280b37-1d56-417c-90db-5d8985050b08\") " Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.098265 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60280b37-1d56-417c-90db-5d8985050b08-kube-api-access-bzfxm" (OuterVolumeSpecName: "kube-api-access-bzfxm") pod "60280b37-1d56-417c-90db-5d8985050b08" (UID: "60280b37-1d56-417c-90db-5d8985050b08"). InnerVolumeSpecName "kube-api-access-bzfxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.119110 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-inventory" (OuterVolumeSpecName: "inventory") pod "60280b37-1d56-417c-90db-5d8985050b08" (UID: "60280b37-1d56-417c-90db-5d8985050b08"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.131463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "60280b37-1d56-417c-90db-5d8985050b08" (UID: "60280b37-1d56-417c-90db-5d8985050b08"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.175836 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzfxm\" (UniqueName: \"kubernetes.io/projected/60280b37-1d56-417c-90db-5d8985050b08-kube-api-access-bzfxm\") on node \"crc\" DevicePath \"\"" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.175875 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.175888 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/60280b37-1d56-417c-90db-5d8985050b08-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.355299 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" event={"ID":"60280b37-1d56-417c-90db-5d8985050b08","Type":"ContainerDied","Data":"f117584c312e96a4d07814f8655e23141ff456503b237da95d4b1ab640b424ff"} Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.355357 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f117584c312e96a4d07814f8655e23141ff456503b237da95d4b1ab640b424ff" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.355394 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-95ghw" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.548633 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-92xj9"] Feb 19 23:34:53 crc kubenswrapper[4771]: E0219 23:34:53.549412 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60280b37-1d56-417c-90db-5d8985050b08" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.549517 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="60280b37-1d56-417c-90db-5d8985050b08" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.549859 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="60280b37-1d56-417c-90db-5d8985050b08" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.553051 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.570268 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.572237 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.572570 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.572783 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.605044 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-92xj9"] Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.691272 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjkx\" (UniqueName: \"kubernetes.io/projected/0742feef-b258-4e8f-a369-07d7932f47aa-kube-api-access-kjjkx\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.691352 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-inventory\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.691417 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.793715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjkx\" (UniqueName: \"kubernetes.io/projected/0742feef-b258-4e8f-a369-07d7932f47aa-kube-api-access-kjjkx\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.794099 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-inventory\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.794236 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.800445 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-inventory\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: 
\"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.805675 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.812059 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjkx\" (UniqueName: \"kubernetes.io/projected/0742feef-b258-4e8f-a369-07d7932f47aa-kube-api-access-kjjkx\") pod \"configure-network-openstack-openstack-cell1-92xj9\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:53 crc kubenswrapper[4771]: I0219 23:34:53.899388 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:34:54 crc kubenswrapper[4771]: I0219 23:34:54.383466 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:34:54 crc kubenswrapper[4771]: I0219 23:34:54.383534 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-92xj9"] Feb 19 23:34:55 crc kubenswrapper[4771]: I0219 23:34:55.376243 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" event={"ID":"0742feef-b258-4e8f-a369-07d7932f47aa","Type":"ContainerStarted","Data":"b941f5a9bff32ba5b921d595460d9d3fe68780177fe15ce2c869ff79f947d219"} Feb 19 23:34:55 crc kubenswrapper[4771]: I0219 23:34:55.376883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" event={"ID":"0742feef-b258-4e8f-a369-07d7932f47aa","Type":"ContainerStarted","Data":"115f09511ae3b357e6b162f3648a1a11e8073c9f868b45286c0d3d42ad75a5bf"} Feb 19 23:34:55 crc kubenswrapper[4771]: I0219 23:34:55.401377 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" podStartSLOduration=1.953182975 podStartE2EDuration="2.401351769s" podCreationTimestamp="2026-02-19 23:34:53 +0000 UTC" firstStartedPulling="2026-02-19 23:34:54.383282825 +0000 UTC m=+7594.654725295" lastFinishedPulling="2026-02-19 23:34:54.831451609 +0000 UTC m=+7595.102894089" observedRunningTime="2026-02-19 23:34:55.394587529 +0000 UTC m=+7595.666030039" watchObservedRunningTime="2026-02-19 23:34:55.401351769 +0000 UTC m=+7595.672794299" Feb 19 23:34:55 crc kubenswrapper[4771]: I0219 23:34:55.439128 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:34:55 crc kubenswrapper[4771]: E0219 
23:34:55.440224 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:35:07 crc kubenswrapper[4771]: I0219 23:35:07.437824 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:35:07 crc kubenswrapper[4771]: E0219 23:35:07.438688 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:35:18 crc kubenswrapper[4771]: I0219 23:35:18.437104 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:35:18 crc kubenswrapper[4771]: E0219 23:35:18.437672 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:35:31 crc kubenswrapper[4771]: I0219 23:35:31.437510 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:35:31 crc 
kubenswrapper[4771]: E0219 23:35:31.438235 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:35:46 crc kubenswrapper[4771]: I0219 23:35:46.438038 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:35:46 crc kubenswrapper[4771]: E0219 23:35:46.439095 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:35:47 crc kubenswrapper[4771]: I0219 23:35:47.889393 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2cj"] Feb 19 23:35:47 crc kubenswrapper[4771]: I0219 23:35:47.892429 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:47 crc kubenswrapper[4771]: I0219 23:35:47.905771 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2cj"] Feb 19 23:35:47 crc kubenswrapper[4771]: I0219 23:35:47.959402 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxmks\" (UniqueName: \"kubernetes.io/projected/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-kube-api-access-qxmks\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:47 crc kubenswrapper[4771]: I0219 23:35:47.959731 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-catalog-content\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:47 crc kubenswrapper[4771]: I0219 23:35:47.959783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-utilities\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:48 crc kubenswrapper[4771]: I0219 23:35:48.061718 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxmks\" (UniqueName: \"kubernetes.io/projected/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-kube-api-access-qxmks\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:48 crc kubenswrapper[4771]: I0219 23:35:48.061795 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-catalog-content\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:48 crc kubenswrapper[4771]: I0219 23:35:48.061839 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-utilities\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:48 crc kubenswrapper[4771]: I0219 23:35:48.062438 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-catalog-content\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:48 crc kubenswrapper[4771]: I0219 23:35:48.062523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-utilities\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:48 crc kubenswrapper[4771]: I0219 23:35:48.095252 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxmks\" (UniqueName: \"kubernetes.io/projected/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-kube-api-access-qxmks\") pod \"redhat-marketplace-bx2cj\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:48 crc kubenswrapper[4771]: I0219 23:35:48.224364 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:48 crc kubenswrapper[4771]: I0219 23:35:48.791859 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2cj"] Feb 19 23:35:49 crc kubenswrapper[4771]: I0219 23:35:49.414918 4771 generic.go:334] "Generic (PLEG): container finished" podID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerID="24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0" exitCode=0 Feb 19 23:35:49 crc kubenswrapper[4771]: I0219 23:35:49.414992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2cj" event={"ID":"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2","Type":"ContainerDied","Data":"24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0"} Feb 19 23:35:49 crc kubenswrapper[4771]: I0219 23:35:49.415356 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2cj" event={"ID":"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2","Type":"ContainerStarted","Data":"76791e25f10cbbf48a82bbb553ae3f29d9b32fd43553940abe02061cd1f82dba"} Feb 19 23:35:50 crc kubenswrapper[4771]: I0219 23:35:50.428756 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2cj" event={"ID":"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2","Type":"ContainerStarted","Data":"a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e"} Feb 19 23:35:51 crc kubenswrapper[4771]: I0219 23:35:51.444650 4771 generic.go:334] "Generic (PLEG): container finished" podID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerID="a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e" exitCode=0 Feb 19 23:35:51 crc kubenswrapper[4771]: I0219 23:35:51.444774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2cj" 
event={"ID":"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2","Type":"ContainerDied","Data":"a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e"} Feb 19 23:35:52 crc kubenswrapper[4771]: I0219 23:35:52.457627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2cj" event={"ID":"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2","Type":"ContainerStarted","Data":"1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49"} Feb 19 23:35:52 crc kubenswrapper[4771]: I0219 23:35:52.484903 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bx2cj" podStartSLOduration=3.02675917 podStartE2EDuration="5.484887219s" podCreationTimestamp="2026-02-19 23:35:47 +0000 UTC" firstStartedPulling="2026-02-19 23:35:49.418039346 +0000 UTC m=+7649.689481826" lastFinishedPulling="2026-02-19 23:35:51.876167375 +0000 UTC m=+7652.147609875" observedRunningTime="2026-02-19 23:35:52.47458627 +0000 UTC m=+7652.746028780" watchObservedRunningTime="2026-02-19 23:35:52.484887219 +0000 UTC m=+7652.756329689" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.281525 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vml7l"] Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.284346 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.298898 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vml7l"] Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.389560 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xhs\" (UniqueName: \"kubernetes.io/projected/40f16ff5-6d6f-41ad-8fb5-64389c460496-kube-api-access-p6xhs\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.389914 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-utilities\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.390109 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-catalog-content\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.493391 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-utilities\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.493578 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-catalog-content\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.494205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-utilities\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.494214 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xhs\" (UniqueName: \"kubernetes.io/projected/40f16ff5-6d6f-41ad-8fb5-64389c460496-kube-api-access-p6xhs\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.494841 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-catalog-content\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.524910 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xhs\" (UniqueName: \"kubernetes.io/projected/40f16ff5-6d6f-41ad-8fb5-64389c460496-kube-api-access-p6xhs\") pod \"certified-operators-vml7l\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:53 crc kubenswrapper[4771]: I0219 23:35:53.635305 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:35:54 crc kubenswrapper[4771]: W0219 23:35:54.132853 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f16ff5_6d6f_41ad_8fb5_64389c460496.slice/crio-a734ab90ab14b3af7bfd8783f469d06fd0abab47be41e0131579f1695822e315 WatchSource:0}: Error finding container a734ab90ab14b3af7bfd8783f469d06fd0abab47be41e0131579f1695822e315: Status 404 returned error can't find the container with id a734ab90ab14b3af7bfd8783f469d06fd0abab47be41e0131579f1695822e315 Feb 19 23:35:54 crc kubenswrapper[4771]: I0219 23:35:54.135210 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vml7l"] Feb 19 23:35:54 crc kubenswrapper[4771]: I0219 23:35:54.475215 4771 generic.go:334] "Generic (PLEG): container finished" podID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerID="6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e" exitCode=0 Feb 19 23:35:54 crc kubenswrapper[4771]: I0219 23:35:54.475307 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vml7l" event={"ID":"40f16ff5-6d6f-41ad-8fb5-64389c460496","Type":"ContainerDied","Data":"6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e"} Feb 19 23:35:54 crc kubenswrapper[4771]: I0219 23:35:54.475486 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vml7l" event={"ID":"40f16ff5-6d6f-41ad-8fb5-64389c460496","Type":"ContainerStarted","Data":"a734ab90ab14b3af7bfd8783f469d06fd0abab47be41e0131579f1695822e315"} Feb 19 23:35:55 crc kubenswrapper[4771]: I0219 23:35:55.488517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vml7l" 
event={"ID":"40f16ff5-6d6f-41ad-8fb5-64389c460496","Type":"ContainerStarted","Data":"f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1"} Feb 19 23:35:57 crc kubenswrapper[4771]: I0219 23:35:57.437609 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:35:57 crc kubenswrapper[4771]: E0219 23:35:57.438159 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:35:57 crc kubenswrapper[4771]: I0219 23:35:57.513316 4771 generic.go:334] "Generic (PLEG): container finished" podID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerID="f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1" exitCode=0 Feb 19 23:35:57 crc kubenswrapper[4771]: I0219 23:35:57.513376 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vml7l" event={"ID":"40f16ff5-6d6f-41ad-8fb5-64389c460496","Type":"ContainerDied","Data":"f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1"} Feb 19 23:35:58 crc kubenswrapper[4771]: I0219 23:35:58.224820 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:58 crc kubenswrapper[4771]: I0219 23:35:58.225407 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:58 crc kubenswrapper[4771]: I0219 23:35:58.292466 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:35:58 crc 
kubenswrapper[4771]: I0219 23:35:58.525743 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vml7l" event={"ID":"40f16ff5-6d6f-41ad-8fb5-64389c460496","Type":"ContainerStarted","Data":"7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010"} Feb 19 23:35:58 crc kubenswrapper[4771]: I0219 23:35:58.550851 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vml7l" podStartSLOduration=2.066909875 podStartE2EDuration="5.550833562s" podCreationTimestamp="2026-02-19 23:35:53 +0000 UTC" firstStartedPulling="2026-02-19 23:35:54.477155098 +0000 UTC m=+7654.748597608" lastFinishedPulling="2026-02-19 23:35:57.961078825 +0000 UTC m=+7658.232521295" observedRunningTime="2026-02-19 23:35:58.548298976 +0000 UTC m=+7658.819741476" watchObservedRunningTime="2026-02-19 23:35:58.550833562 +0000 UTC m=+7658.822276032" Feb 19 23:35:58 crc kubenswrapper[4771]: I0219 23:35:58.592055 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:36:00 crc kubenswrapper[4771]: I0219 23:36:00.681814 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2cj"] Feb 19 23:36:01 crc kubenswrapper[4771]: I0219 23:36:01.558547 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bx2cj" podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerName="registry-server" containerID="cri-o://1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49" gracePeriod=2 Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.076786 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.218879 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-utilities\") pod \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.218937 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-catalog-content\") pod \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.219003 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxmks\" (UniqueName: \"kubernetes.io/projected/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-kube-api-access-qxmks\") pod \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\" (UID: \"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2\") " Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.219521 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-utilities" (OuterVolumeSpecName: "utilities") pod "4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" (UID: "4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.225929 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-kube-api-access-qxmks" (OuterVolumeSpecName: "kube-api-access-qxmks") pod "4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" (UID: "4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2"). InnerVolumeSpecName "kube-api-access-qxmks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.246221 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" (UID: "4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.321440 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.321741 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.321825 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxmks\" (UniqueName: \"kubernetes.io/projected/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2-kube-api-access-qxmks\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.574927 4771 generic.go:334] "Generic (PLEG): container finished" podID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerID="1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49" exitCode=0 Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.574967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2cj" event={"ID":"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2","Type":"ContainerDied","Data":"1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49"} Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.575047 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bx2cj" event={"ID":"4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2","Type":"ContainerDied","Data":"76791e25f10cbbf48a82bbb553ae3f29d9b32fd43553940abe02061cd1f82dba"} Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.575044 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx2cj" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.575070 4771 scope.go:117] "RemoveContainer" containerID="1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.614088 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2cj"] Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.617348 4771 scope.go:117] "RemoveContainer" containerID="a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.629450 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2cj"] Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.641156 4771 scope.go:117] "RemoveContainer" containerID="24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.700572 4771 scope.go:117] "RemoveContainer" containerID="1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49" Feb 19 23:36:02 crc kubenswrapper[4771]: E0219 23:36:02.701118 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49\": container with ID starting with 1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49 not found: ID does not exist" containerID="1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.701174 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49"} err="failed to get container status \"1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49\": rpc error: code = NotFound desc = could not find container \"1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49\": container with ID starting with 1af33f20bc166747ec5e2d5e8fc9765a9b51b690f4de2123281814e8a762ce49 not found: ID does not exist" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.701209 4771 scope.go:117] "RemoveContainer" containerID="a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e" Feb 19 23:36:02 crc kubenswrapper[4771]: E0219 23:36:02.701608 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e\": container with ID starting with a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e not found: ID does not exist" containerID="a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.701677 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e"} err="failed to get container status \"a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e\": rpc error: code = NotFound desc = could not find container \"a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e\": container with ID starting with a50866890d5413df6e47ee05b5ecd629d6d3c4decf0e4237a7aeab6750fadd5e not found: ID does not exist" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.701709 4771 scope.go:117] "RemoveContainer" containerID="24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0" Feb 19 23:36:02 crc kubenswrapper[4771]: E0219 
23:36:02.702103 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0\": container with ID starting with 24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0 not found: ID does not exist" containerID="24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0" Feb 19 23:36:02 crc kubenswrapper[4771]: I0219 23:36:02.702140 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0"} err="failed to get container status \"24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0\": rpc error: code = NotFound desc = could not find container \"24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0\": container with ID starting with 24c162330baa2f14832cd001649601d26f69f6defe0c30c7416a0f752d3c1bc0 not found: ID does not exist" Feb 19 23:36:03 crc kubenswrapper[4771]: I0219 23:36:03.636264 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:36:03 crc kubenswrapper[4771]: I0219 23:36:03.636732 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:36:04 crc kubenswrapper[4771]: I0219 23:36:04.470537 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" path="/var/lib/kubelet/pods/4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2/volumes" Feb 19 23:36:04 crc kubenswrapper[4771]: I0219 23:36:04.710997 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vml7l" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="registry-server" probeResult="failure" output=< Feb 19 23:36:04 crc kubenswrapper[4771]: timeout: failed to connect service 
":50051" within 1s Feb 19 23:36:04 crc kubenswrapper[4771]: > Feb 19 23:36:10 crc kubenswrapper[4771]: I0219 23:36:10.451160 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:36:10 crc kubenswrapper[4771]: E0219 23:36:10.452363 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:36:13 crc kubenswrapper[4771]: I0219 23:36:13.721637 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:36:13 crc kubenswrapper[4771]: I0219 23:36:13.816574 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:36:13 crc kubenswrapper[4771]: I0219 23:36:13.978798 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vml7l"] Feb 19 23:36:15 crc kubenswrapper[4771]: I0219 23:36:15.746846 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vml7l" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="registry-server" containerID="cri-o://7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010" gracePeriod=2 Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.325963 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.387013 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-utilities\") pod \"40f16ff5-6d6f-41ad-8fb5-64389c460496\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.387084 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-catalog-content\") pod \"40f16ff5-6d6f-41ad-8fb5-64389c460496\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.387132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6xhs\" (UniqueName: \"kubernetes.io/projected/40f16ff5-6d6f-41ad-8fb5-64389c460496-kube-api-access-p6xhs\") pod \"40f16ff5-6d6f-41ad-8fb5-64389c460496\" (UID: \"40f16ff5-6d6f-41ad-8fb5-64389c460496\") " Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.387780 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-utilities" (OuterVolumeSpecName: "utilities") pod "40f16ff5-6d6f-41ad-8fb5-64389c460496" (UID: "40f16ff5-6d6f-41ad-8fb5-64389c460496"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.392117 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f16ff5-6d6f-41ad-8fb5-64389c460496-kube-api-access-p6xhs" (OuterVolumeSpecName: "kube-api-access-p6xhs") pod "40f16ff5-6d6f-41ad-8fb5-64389c460496" (UID: "40f16ff5-6d6f-41ad-8fb5-64389c460496"). InnerVolumeSpecName "kube-api-access-p6xhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.435205 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40f16ff5-6d6f-41ad-8fb5-64389c460496" (UID: "40f16ff5-6d6f-41ad-8fb5-64389c460496"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.490116 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.490719 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40f16ff5-6d6f-41ad-8fb5-64389c460496-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.490738 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6xhs\" (UniqueName: \"kubernetes.io/projected/40f16ff5-6d6f-41ad-8fb5-64389c460496-kube-api-access-p6xhs\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.759468 4771 generic.go:334] "Generic (PLEG): container finished" podID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerID="7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010" exitCode=0 Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.759533 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vml7l" event={"ID":"40f16ff5-6d6f-41ad-8fb5-64389c460496","Type":"ContainerDied","Data":"7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010"} Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.759575 4771 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-vml7l" event={"ID":"40f16ff5-6d6f-41ad-8fb5-64389c460496","Type":"ContainerDied","Data":"a734ab90ab14b3af7bfd8783f469d06fd0abab47be41e0131579f1695822e315"} Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.759578 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vml7l" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.759604 4771 scope.go:117] "RemoveContainer" containerID="7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.795107 4771 scope.go:117] "RemoveContainer" containerID="f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.815069 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vml7l"] Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.834626 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vml7l"] Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.840271 4771 scope.go:117] "RemoveContainer" containerID="6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.882217 4771 scope.go:117] "RemoveContainer" containerID="7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010" Feb 19 23:36:16 crc kubenswrapper[4771]: E0219 23:36:16.882624 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010\": container with ID starting with 7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010 not found: ID does not exist" containerID="7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 
23:36:16.882707 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010"} err="failed to get container status \"7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010\": rpc error: code = NotFound desc = could not find container \"7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010\": container with ID starting with 7065c415789fa1cd122bf970204523d6b4531db3a619371037df680e3fda5010 not found: ID does not exist" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.882733 4771 scope.go:117] "RemoveContainer" containerID="f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1" Feb 19 23:36:16 crc kubenswrapper[4771]: E0219 23:36:16.884204 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1\": container with ID starting with f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1 not found: ID does not exist" containerID="f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.884252 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1"} err="failed to get container status \"f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1\": rpc error: code = NotFound desc = could not find container \"f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1\": container with ID starting with f2c127322ad030c72360d899a955f323ebfceeba11560fc3de69c358c0bab6e1 not found: ID does not exist" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.884285 4771 scope.go:117] "RemoveContainer" containerID="6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e" Feb 19 23:36:16 crc 
kubenswrapper[4771]: E0219 23:36:16.884591 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e\": container with ID starting with 6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e not found: ID does not exist" containerID="6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e" Feb 19 23:36:16 crc kubenswrapper[4771]: I0219 23:36:16.884613 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e"} err="failed to get container status \"6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e\": rpc error: code = NotFound desc = could not find container \"6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e\": container with ID starting with 6579e3bdc1d25099080522d65442d83ba84433cc418cbd4b576e8d98b9e12e5e not found: ID does not exist" Feb 19 23:36:18 crc kubenswrapper[4771]: I0219 23:36:18.456780 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" path="/var/lib/kubelet/pods/40f16ff5-6d6f-41ad-8fb5-64389c460496/volumes" Feb 19 23:36:22 crc kubenswrapper[4771]: I0219 23:36:22.437783 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:36:22 crc kubenswrapper[4771]: E0219 23:36:22.438301 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:36:22 crc 
kubenswrapper[4771]: I0219 23:36:22.853511 4771 generic.go:334] "Generic (PLEG): container finished" podID="0742feef-b258-4e8f-a369-07d7932f47aa" containerID="b941f5a9bff32ba5b921d595460d9d3fe68780177fe15ce2c869ff79f947d219" exitCode=0 Feb 19 23:36:22 crc kubenswrapper[4771]: I0219 23:36:22.853551 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" event={"ID":"0742feef-b258-4e8f-a369-07d7932f47aa","Type":"ContainerDied","Data":"b941f5a9bff32ba5b921d595460d9d3fe68780177fe15ce2c869ff79f947d219"} Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.368429 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.490681 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-inventory\") pod \"0742feef-b258-4e8f-a369-07d7932f47aa\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.491177 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-ssh-key-openstack-cell1\") pod \"0742feef-b258-4e8f-a369-07d7932f47aa\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.491353 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjjkx\" (UniqueName: \"kubernetes.io/projected/0742feef-b258-4e8f-a369-07d7932f47aa-kube-api-access-kjjkx\") pod \"0742feef-b258-4e8f-a369-07d7932f47aa\" (UID: \"0742feef-b258-4e8f-a369-07d7932f47aa\") " Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.499264 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0742feef-b258-4e8f-a369-07d7932f47aa-kube-api-access-kjjkx" (OuterVolumeSpecName: "kube-api-access-kjjkx") pod "0742feef-b258-4e8f-a369-07d7932f47aa" (UID: "0742feef-b258-4e8f-a369-07d7932f47aa"). InnerVolumeSpecName "kube-api-access-kjjkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.529076 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0742feef-b258-4e8f-a369-07d7932f47aa" (UID: "0742feef-b258-4e8f-a369-07d7932f47aa"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.531652 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-inventory" (OuterVolumeSpecName: "inventory") pod "0742feef-b258-4e8f-a369-07d7932f47aa" (UID: "0742feef-b258-4e8f-a369-07d7932f47aa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.594014 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.594118 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjjkx\" (UniqueName: \"kubernetes.io/projected/0742feef-b258-4e8f-a369-07d7932f47aa-kube-api-access-kjjkx\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.594132 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0742feef-b258-4e8f-a369-07d7932f47aa-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.880320 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" event={"ID":"0742feef-b258-4e8f-a369-07d7932f47aa","Type":"ContainerDied","Data":"115f09511ae3b357e6b162f3648a1a11e8073c9f868b45286c0d3d42ad75a5bf"} Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.880378 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="115f09511ae3b357e6b162f3648a1a11e8073c9f868b45286c0d3d42ad75a5bf" Feb 19 23:36:24 crc kubenswrapper[4771]: I0219 23:36:24.880439 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-92xj9" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.013202 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-d9lvg"] Feb 19 23:36:25 crc kubenswrapper[4771]: E0219 23:36:25.013937 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="registry-server" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.013957 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="registry-server" Feb 19 23:36:25 crc kubenswrapper[4771]: E0219 23:36:25.013981 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="extract-content" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.013990 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="extract-content" Feb 19 23:36:25 crc kubenswrapper[4771]: E0219 23:36:25.014003 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerName="extract-content" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.014029 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerName="extract-content" Feb 19 23:36:25 crc kubenswrapper[4771]: E0219 23:36:25.014042 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="extract-utilities" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.014050 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="extract-utilities" Feb 19 23:36:25 crc kubenswrapper[4771]: E0219 23:36:25.014078 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerName="registry-server" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.014086 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerName="registry-server" Feb 19 23:36:25 crc kubenswrapper[4771]: E0219 23:36:25.014101 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerName="extract-utilities" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.014111 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerName="extract-utilities" Feb 19 23:36:25 crc kubenswrapper[4771]: E0219 23:36:25.014125 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0742feef-b258-4e8f-a369-07d7932f47aa" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.014133 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0742feef-b258-4e8f-a369-07d7932f47aa" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.014357 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b74173a-0298-48e2-a0cb-cbe5c8c1f0d2" containerName="registry-server" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.014379 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0742feef-b258-4e8f-a369-07d7932f47aa" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.014413 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f16ff5-6d6f-41ad-8fb5-64389c460496" containerName="registry-server" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.015271 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.017794 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.017858 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.018083 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.018751 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.033130 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-d9lvg"] Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.207130 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdt7f\" (UniqueName: \"kubernetes.io/projected/ae484840-51ce-4b35-a888-b3e1653e847b-kube-api-access-vdt7f\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.207233 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-inventory\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.207283 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.309253 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdt7f\" (UniqueName: \"kubernetes.io/projected/ae484840-51ce-4b35-a888-b3e1653e847b-kube-api-access-vdt7f\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.309359 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-inventory\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.309410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.325419 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: 
\"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.325682 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-inventory\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.328514 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdt7f\" (UniqueName: \"kubernetes.io/projected/ae484840-51ce-4b35-a888-b3e1653e847b-kube-api-access-vdt7f\") pod \"validate-network-openstack-openstack-cell1-d9lvg\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.342661 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:25 crc kubenswrapper[4771]: I0219 23:36:25.981722 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-d9lvg"] Feb 19 23:36:26 crc kubenswrapper[4771]: I0219 23:36:26.904150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" event={"ID":"ae484840-51ce-4b35-a888-b3e1653e847b","Type":"ContainerStarted","Data":"a611f7e2a42ae46096e0374b55d4b86e0c441e9913f55b5627cb10b8476155da"} Feb 19 23:36:26 crc kubenswrapper[4771]: I0219 23:36:26.905150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" event={"ID":"ae484840-51ce-4b35-a888-b3e1653e847b","Type":"ContainerStarted","Data":"2852641af4190f643d514b140ed80586a57d29332ed02328fa382a2060200b7b"} Feb 19 23:36:26 crc kubenswrapper[4771]: I0219 23:36:26.938189 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" podStartSLOduration=2.435417535 podStartE2EDuration="2.938169652s" podCreationTimestamp="2026-02-19 23:36:24 +0000 UTC" firstStartedPulling="2026-02-19 23:36:25.980178033 +0000 UTC m=+7686.251620503" lastFinishedPulling="2026-02-19 23:36:26.48293014 +0000 UTC m=+7686.754372620" observedRunningTime="2026-02-19 23:36:26.929972268 +0000 UTC m=+7687.201414758" watchObservedRunningTime="2026-02-19 23:36:26.938169652 +0000 UTC m=+7687.209612122" Feb 19 23:36:32 crc kubenswrapper[4771]: I0219 23:36:32.979875 4771 generic.go:334] "Generic (PLEG): container finished" podID="ae484840-51ce-4b35-a888-b3e1653e847b" containerID="a611f7e2a42ae46096e0374b55d4b86e0c441e9913f55b5627cb10b8476155da" exitCode=0 Feb 19 23:36:32 crc kubenswrapper[4771]: I0219 23:36:32.979958 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" event={"ID":"ae484840-51ce-4b35-a888-b3e1653e847b","Type":"ContainerDied","Data":"a611f7e2a42ae46096e0374b55d4b86e0c441e9913f55b5627cb10b8476155da"} Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.437682 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:36:34 crc kubenswrapper[4771]: E0219 23:36:34.438555 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.542486 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.733729 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-ssh-key-openstack-cell1\") pod \"ae484840-51ce-4b35-a888-b3e1653e847b\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.733865 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdt7f\" (UniqueName: \"kubernetes.io/projected/ae484840-51ce-4b35-a888-b3e1653e847b-kube-api-access-vdt7f\") pod \"ae484840-51ce-4b35-a888-b3e1653e847b\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.733894 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-inventory\") pod \"ae484840-51ce-4b35-a888-b3e1653e847b\" (UID: \"ae484840-51ce-4b35-a888-b3e1653e847b\") " Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.743379 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae484840-51ce-4b35-a888-b3e1653e847b-kube-api-access-vdt7f" (OuterVolumeSpecName: "kube-api-access-vdt7f") pod "ae484840-51ce-4b35-a888-b3e1653e847b" (UID: "ae484840-51ce-4b35-a888-b3e1653e847b"). InnerVolumeSpecName "kube-api-access-vdt7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.772745 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ae484840-51ce-4b35-a888-b3e1653e847b" (UID: "ae484840-51ce-4b35-a888-b3e1653e847b"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.788475 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-inventory" (OuterVolumeSpecName: "inventory") pod "ae484840-51ce-4b35-a888-b3e1653e847b" (UID: "ae484840-51ce-4b35-a888-b3e1653e847b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.838250 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdt7f\" (UniqueName: \"kubernetes.io/projected/ae484840-51ce-4b35-a888-b3e1653e847b-kube-api-access-vdt7f\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.838532 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:34 crc kubenswrapper[4771]: I0219 23:36:34.838581 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae484840-51ce-4b35-a888-b3e1653e847b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.006357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" event={"ID":"ae484840-51ce-4b35-a888-b3e1653e847b","Type":"ContainerDied","Data":"2852641af4190f643d514b140ed80586a57d29332ed02328fa382a2060200b7b"} Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.006404 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2852641af4190f643d514b140ed80586a57d29332ed02328fa382a2060200b7b" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.006491 4771 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-d9lvg" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.131628 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-dnk4m"] Feb 19 23:36:35 crc kubenswrapper[4771]: E0219 23:36:35.132130 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae484840-51ce-4b35-a888-b3e1653e847b" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.132152 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae484840-51ce-4b35-a888-b3e1653e847b" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.132446 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae484840-51ce-4b35-a888-b3e1653e847b" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.133300 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.136525 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.136805 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.137000 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.137932 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.145324 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-dnk4m"] Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.246148 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chj4w\" (UniqueName: \"kubernetes.io/projected/7ca871d9-7d59-49d1-9794-512c7d52bc8d-kube-api-access-chj4w\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.246478 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.246598 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-inventory\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.348363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chj4w\" (UniqueName: \"kubernetes.io/projected/7ca871d9-7d59-49d1-9794-512c7d52bc8d-kube-api-access-chj4w\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.348479 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.348679 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-inventory\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.353582 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-inventory\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 
23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.355887 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.369682 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chj4w\" (UniqueName: \"kubernetes.io/projected/7ca871d9-7d59-49d1-9794-512c7d52bc8d-kube-api-access-chj4w\") pod \"install-os-openstack-openstack-cell1-dnk4m\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:35 crc kubenswrapper[4771]: I0219 23:36:35.466202 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:36:36 crc kubenswrapper[4771]: I0219 23:36:36.052660 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-dnk4m"] Feb 19 23:36:37 crc kubenswrapper[4771]: I0219 23:36:37.043223 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" event={"ID":"7ca871d9-7d59-49d1-9794-512c7d52bc8d","Type":"ContainerStarted","Data":"5a7a03b1c1ab60240981fc732682ebe7e64ab365f8b1eaf733a49e7b8c4d2ab5"} Feb 19 23:36:37 crc kubenswrapper[4771]: I0219 23:36:37.043559 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" event={"ID":"7ca871d9-7d59-49d1-9794-512c7d52bc8d","Type":"ContainerStarted","Data":"2859752b64e45eaf448315093cc2665be731e7e184f0a15470b8efb89d5ee19b"} Feb 19 23:36:37 crc kubenswrapper[4771]: I0219 23:36:37.063304 4771 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" podStartSLOduration=1.632879407 podStartE2EDuration="2.063275408s" podCreationTimestamp="2026-02-19 23:36:35 +0000 UTC" firstStartedPulling="2026-02-19 23:36:36.052538066 +0000 UTC m=+7696.323980546" lastFinishedPulling="2026-02-19 23:36:36.482934077 +0000 UTC m=+7696.754376547" observedRunningTime="2026-02-19 23:36:37.059946142 +0000 UTC m=+7697.331388642" watchObservedRunningTime="2026-02-19 23:36:37.063275408 +0000 UTC m=+7697.334717918" Feb 19 23:36:46 crc kubenswrapper[4771]: I0219 23:36:46.454725 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:36:46 crc kubenswrapper[4771]: E0219 23:36:46.457082 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:36:58 crc kubenswrapper[4771]: I0219 23:36:58.438386 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:36:58 crc kubenswrapper[4771]: E0219 23:36:58.439462 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.428389 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-b98v2"] Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.434372 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.483082 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b98v2"] Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.606750 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-utilities\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.607369 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-catalog-content\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.607822 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rxm7\" (UniqueName: \"kubernetes.io/projected/34ca2f72-087c-4091-b944-ce63d69ac48e-kube-api-access-5rxm7\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.709964 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rxm7\" (UniqueName: \"kubernetes.io/projected/34ca2f72-087c-4091-b944-ce63d69ac48e-kube-api-access-5rxm7\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " 
pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.710064 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-utilities\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.710205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-catalog-content\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.710609 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-utilities\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.710641 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-catalog-content\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc kubenswrapper[4771]: I0219 23:37:00.729303 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rxm7\" (UniqueName: \"kubernetes.io/projected/34ca2f72-087c-4091-b944-ce63d69ac48e-kube-api-access-5rxm7\") pod \"redhat-operators-b98v2\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:00 crc 
kubenswrapper[4771]: I0219 23:37:00.782775 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:01 crc kubenswrapper[4771]: I0219 23:37:01.251104 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b98v2"] Feb 19 23:37:01 crc kubenswrapper[4771]: I0219 23:37:01.358948 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b98v2" event={"ID":"34ca2f72-087c-4091-b944-ce63d69ac48e","Type":"ContainerStarted","Data":"1f999440cb1d8036f2a3e97fefe51d7de348852242af3dfcf9e9c43e1c35eb1e"} Feb 19 23:37:02 crc kubenswrapper[4771]: I0219 23:37:02.371257 4771 generic.go:334] "Generic (PLEG): container finished" podID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerID="77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8" exitCode=0 Feb 19 23:37:02 crc kubenswrapper[4771]: I0219 23:37:02.371491 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b98v2" event={"ID":"34ca2f72-087c-4091-b944-ce63d69ac48e","Type":"ContainerDied","Data":"77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8"} Feb 19 23:37:04 crc kubenswrapper[4771]: I0219 23:37:04.402433 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b98v2" event={"ID":"34ca2f72-087c-4091-b944-ce63d69ac48e","Type":"ContainerStarted","Data":"e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0"} Feb 19 23:37:08 crc kubenswrapper[4771]: I0219 23:37:08.478178 4771 generic.go:334] "Generic (PLEG): container finished" podID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerID="e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0" exitCode=0 Feb 19 23:37:08 crc kubenswrapper[4771]: I0219 23:37:08.478291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b98v2" 
event={"ID":"34ca2f72-087c-4091-b944-ce63d69ac48e","Type":"ContainerDied","Data":"e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0"} Feb 19 23:37:09 crc kubenswrapper[4771]: I0219 23:37:09.494322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b98v2" event={"ID":"34ca2f72-087c-4091-b944-ce63d69ac48e","Type":"ContainerStarted","Data":"caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a"} Feb 19 23:37:09 crc kubenswrapper[4771]: I0219 23:37:09.526984 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b98v2" podStartSLOduration=3.036116672 podStartE2EDuration="9.526964492s" podCreationTimestamp="2026-02-19 23:37:00 +0000 UTC" firstStartedPulling="2026-02-19 23:37:02.375533618 +0000 UTC m=+7722.646976088" lastFinishedPulling="2026-02-19 23:37:08.866381428 +0000 UTC m=+7729.137823908" observedRunningTime="2026-02-19 23:37:09.519911878 +0000 UTC m=+7729.791354358" watchObservedRunningTime="2026-02-19 23:37:09.526964492 +0000 UTC m=+7729.798406962" Feb 19 23:37:10 crc kubenswrapper[4771]: I0219 23:37:10.471863 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:37:10 crc kubenswrapper[4771]: E0219 23:37:10.472720 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:37:10 crc kubenswrapper[4771]: I0219 23:37:10.783053 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:10 crc 
kubenswrapper[4771]: I0219 23:37:10.783127 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:11 crc kubenswrapper[4771]: I0219 23:37:11.841573 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b98v2" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="registry-server" probeResult="failure" output=< Feb 19 23:37:11 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:37:11 crc kubenswrapper[4771]: > Feb 19 23:37:21 crc kubenswrapper[4771]: I0219 23:37:21.869884 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b98v2" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="registry-server" probeResult="failure" output=< Feb 19 23:37:21 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:37:21 crc kubenswrapper[4771]: > Feb 19 23:37:24 crc kubenswrapper[4771]: I0219 23:37:24.437918 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:37:24 crc kubenswrapper[4771]: E0219 23:37:24.439336 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:37:24 crc kubenswrapper[4771]: I0219 23:37:24.660773 4771 generic.go:334] "Generic (PLEG): container finished" podID="7ca871d9-7d59-49d1-9794-512c7d52bc8d" containerID="5a7a03b1c1ab60240981fc732682ebe7e64ab365f8b1eaf733a49e7b8c4d2ab5" exitCode=0 Feb 19 23:37:24 crc kubenswrapper[4771]: I0219 23:37:24.660839 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" event={"ID":"7ca871d9-7d59-49d1-9794-512c7d52bc8d","Type":"ContainerDied","Data":"5a7a03b1c1ab60240981fc732682ebe7e64ab365f8b1eaf733a49e7b8c4d2ab5"} Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.214702 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.366709 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chj4w\" (UniqueName: \"kubernetes.io/projected/7ca871d9-7d59-49d1-9794-512c7d52bc8d-kube-api-access-chj4w\") pod \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.366805 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-inventory\") pod \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.366830 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-ssh-key-openstack-cell1\") pod \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\" (UID: \"7ca871d9-7d59-49d1-9794-512c7d52bc8d\") " Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.373256 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca871d9-7d59-49d1-9794-512c7d52bc8d-kube-api-access-chj4w" (OuterVolumeSpecName: "kube-api-access-chj4w") pod "7ca871d9-7d59-49d1-9794-512c7d52bc8d" (UID: "7ca871d9-7d59-49d1-9794-512c7d52bc8d"). InnerVolumeSpecName "kube-api-access-chj4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.398081 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7ca871d9-7d59-49d1-9794-512c7d52bc8d" (UID: "7ca871d9-7d59-49d1-9794-512c7d52bc8d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.417694 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-inventory" (OuterVolumeSpecName: "inventory") pod "7ca871d9-7d59-49d1-9794-512c7d52bc8d" (UID: "7ca871d9-7d59-49d1-9794-512c7d52bc8d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.470611 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chj4w\" (UniqueName: \"kubernetes.io/projected/7ca871d9-7d59-49d1-9794-512c7d52bc8d-kube-api-access-chj4w\") on node \"crc\" DevicePath \"\"" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.470646 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.470658 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ca871d9-7d59-49d1-9794-512c7d52bc8d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.686500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" 
event={"ID":"7ca871d9-7d59-49d1-9794-512c7d52bc8d","Type":"ContainerDied","Data":"2859752b64e45eaf448315093cc2665be731e7e184f0a15470b8efb89d5ee19b"} Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.686542 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2859752b64e45eaf448315093cc2665be731e7e184f0a15470b8efb89d5ee19b" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.686681 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-dnk4m" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.787612 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-472r9"] Feb 19 23:37:26 crc kubenswrapper[4771]: E0219 23:37:26.789563 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca871d9-7d59-49d1-9794-512c7d52bc8d" containerName="install-os-openstack-openstack-cell1" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.789762 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca871d9-7d59-49d1-9794-512c7d52bc8d" containerName="install-os-openstack-openstack-cell1" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.790411 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca871d9-7d59-49d1-9794-512c7d52bc8d" containerName="install-os-openstack-openstack-cell1" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.792336 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.794724 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.796813 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.799939 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-472r9"] Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.802539 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.806063 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.880870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rbnk\" (UniqueName: \"kubernetes.io/projected/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-kube-api-access-6rbnk\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.881112 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.881311 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-inventory\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.983788 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rbnk\" (UniqueName: \"kubernetes.io/projected/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-kube-api-access-6rbnk\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.983903 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.983968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-inventory\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.991446 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " 
pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:26 crc kubenswrapper[4771]: I0219 23:37:26.991457 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-inventory\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:27 crc kubenswrapper[4771]: I0219 23:37:27.003484 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rbnk\" (UniqueName: \"kubernetes.io/projected/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-kube-api-access-6rbnk\") pod \"configure-os-openstack-openstack-cell1-472r9\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:27 crc kubenswrapper[4771]: I0219 23:37:27.131805 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:37:27 crc kubenswrapper[4771]: I0219 23:37:27.758230 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-472r9"] Feb 19 23:37:28 crc kubenswrapper[4771]: I0219 23:37:28.714843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-472r9" event={"ID":"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd","Type":"ContainerStarted","Data":"0ec046218a6ef0e1fd73a2def71c648444ba3d3c7aede3ddc8a703f9d6f4193a"} Feb 19 23:37:28 crc kubenswrapper[4771]: I0219 23:37:28.715459 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-472r9" event={"ID":"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd","Type":"ContainerStarted","Data":"0bdf09c142e0adeb3b67cef1444134715494f815e1305bf56b12021bf12aca23"} Feb 19 23:37:28 crc kubenswrapper[4771]: I0219 23:37:28.740325 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-472r9" podStartSLOduration=2.267435373 podStartE2EDuration="2.740307067s" podCreationTimestamp="2026-02-19 23:37:26 +0000 UTC" firstStartedPulling="2026-02-19 23:37:27.764985533 +0000 UTC m=+7748.036428043" lastFinishedPulling="2026-02-19 23:37:28.237857257 +0000 UTC m=+7748.509299737" observedRunningTime="2026-02-19 23:37:28.733759386 +0000 UTC m=+7749.005201946" watchObservedRunningTime="2026-02-19 23:37:28.740307067 +0000 UTC m=+7749.011749537" Feb 19 23:37:31 crc kubenswrapper[4771]: I0219 23:37:31.852078 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b98v2" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="registry-server" probeResult="failure" output=< Feb 19 23:37:31 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:37:31 crc kubenswrapper[4771]: > Feb 19 
23:37:37 crc kubenswrapper[4771]: I0219 23:37:37.438429 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:37:37 crc kubenswrapper[4771]: E0219 23:37:37.439814 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:37:40 crc kubenswrapper[4771]: I0219 23:37:40.863945 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:40 crc kubenswrapper[4771]: I0219 23:37:40.943004 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:41 crc kubenswrapper[4771]: I0219 23:37:41.129648 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b98v2"] Feb 19 23:37:42 crc kubenswrapper[4771]: I0219 23:37:42.886912 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b98v2" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="registry-server" containerID="cri-o://caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a" gracePeriod=2 Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.615983 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.711028 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-utilities\") pod \"34ca2f72-087c-4091-b944-ce63d69ac48e\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.711158 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rxm7\" (UniqueName: \"kubernetes.io/projected/34ca2f72-087c-4091-b944-ce63d69ac48e-kube-api-access-5rxm7\") pod \"34ca2f72-087c-4091-b944-ce63d69ac48e\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.711427 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-catalog-content\") pod \"34ca2f72-087c-4091-b944-ce63d69ac48e\" (UID: \"34ca2f72-087c-4091-b944-ce63d69ac48e\") " Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.711756 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-utilities" (OuterVolumeSpecName: "utilities") pod "34ca2f72-087c-4091-b944-ce63d69ac48e" (UID: "34ca2f72-087c-4091-b944-ce63d69ac48e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.718757 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ca2f72-087c-4091-b944-ce63d69ac48e-kube-api-access-5rxm7" (OuterVolumeSpecName: "kube-api-access-5rxm7") pod "34ca2f72-087c-4091-b944-ce63d69ac48e" (UID: "34ca2f72-087c-4091-b944-ce63d69ac48e"). InnerVolumeSpecName "kube-api-access-5rxm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.813664 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.813696 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rxm7\" (UniqueName: \"kubernetes.io/projected/34ca2f72-087c-4091-b944-ce63d69ac48e-kube-api-access-5rxm7\") on node \"crc\" DevicePath \"\"" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.835964 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34ca2f72-087c-4091-b944-ce63d69ac48e" (UID: "34ca2f72-087c-4091-b944-ce63d69ac48e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.903870 4771 generic.go:334] "Generic (PLEG): container finished" podID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerID="caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a" exitCode=0 Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.903907 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b98v2" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.903919 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b98v2" event={"ID":"34ca2f72-087c-4091-b944-ce63d69ac48e","Type":"ContainerDied","Data":"caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a"} Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.903947 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b98v2" event={"ID":"34ca2f72-087c-4091-b944-ce63d69ac48e","Type":"ContainerDied","Data":"1f999440cb1d8036f2a3e97fefe51d7de348852242af3dfcf9e9c43e1c35eb1e"} Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.903970 4771 scope.go:117] "RemoveContainer" containerID="caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.916276 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ca2f72-087c-4091-b944-ce63d69ac48e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.950507 4771 scope.go:117] "RemoveContainer" containerID="e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0" Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.951838 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b98v2"] Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.974676 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b98v2"] Feb 19 23:37:43 crc kubenswrapper[4771]: I0219 23:37:43.979728 4771 scope.go:117] "RemoveContainer" containerID="77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8" Feb 19 23:37:44 crc kubenswrapper[4771]: I0219 23:37:44.047568 4771 scope.go:117] "RemoveContainer" 
containerID="caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a" Feb 19 23:37:44 crc kubenswrapper[4771]: E0219 23:37:44.049498 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a\": container with ID starting with caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a not found: ID does not exist" containerID="caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a" Feb 19 23:37:44 crc kubenswrapper[4771]: I0219 23:37:44.049530 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a"} err="failed to get container status \"caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a\": rpc error: code = NotFound desc = could not find container \"caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a\": container with ID starting with caad59ad7a3b5a5276b1697508aa94b281b33b5a22aa3e1ff861b974ff78b35a not found: ID does not exist" Feb 19 23:37:44 crc kubenswrapper[4771]: I0219 23:37:44.049556 4771 scope.go:117] "RemoveContainer" containerID="e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0" Feb 19 23:37:44 crc kubenswrapper[4771]: E0219 23:37:44.050082 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0\": container with ID starting with e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0 not found: ID does not exist" containerID="e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0" Feb 19 23:37:44 crc kubenswrapper[4771]: I0219 23:37:44.050139 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0"} err="failed to get container status \"e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0\": rpc error: code = NotFound desc = could not find container \"e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0\": container with ID starting with e3326a5ff74200cf3727cbf57d2cf9f7f8822884dfcaed2402f8729fb1ee99b0 not found: ID does not exist" Feb 19 23:37:44 crc kubenswrapper[4771]: I0219 23:37:44.050175 4771 scope.go:117] "RemoveContainer" containerID="77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8" Feb 19 23:37:44 crc kubenswrapper[4771]: E0219 23:37:44.050756 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8\": container with ID starting with 77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8 not found: ID does not exist" containerID="77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8" Feb 19 23:37:44 crc kubenswrapper[4771]: I0219 23:37:44.050781 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8"} err="failed to get container status \"77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8\": rpc error: code = NotFound desc = could not find container \"77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8\": container with ID starting with 77d29f86f612db5b52b0275a593d78d733a383f602e8d1eadc9e7196fafdb1a8 not found: ID does not exist" Feb 19 23:37:44 crc kubenswrapper[4771]: I0219 23:37:44.454738 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" path="/var/lib/kubelet/pods/34ca2f72-087c-4091-b944-ce63d69ac48e/volumes" Feb 19 23:37:49 crc kubenswrapper[4771]: I0219 
23:37:49.437087 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:37:49 crc kubenswrapper[4771]: I0219 23:37:49.991789 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"3fc627c9a39ad20456e6e2249cacc1ec01a4f2327e4e1367ba967c2507360bc6"} Feb 19 23:38:15 crc kubenswrapper[4771]: I0219 23:38:15.354364 4771 generic.go:334] "Generic (PLEG): container finished" podID="0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd" containerID="0ec046218a6ef0e1fd73a2def71c648444ba3d3c7aede3ddc8a703f9d6f4193a" exitCode=0 Feb 19 23:38:15 crc kubenswrapper[4771]: I0219 23:38:15.354389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-472r9" event={"ID":"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd","Type":"ContainerDied","Data":"0ec046218a6ef0e1fd73a2def71c648444ba3d3c7aede3ddc8a703f9d6f4193a"} Feb 19 23:38:16 crc kubenswrapper[4771]: I0219 23:38:16.907962 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.066354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-inventory\") pod \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.066614 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rbnk\" (UniqueName: \"kubernetes.io/projected/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-kube-api-access-6rbnk\") pod \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.066829 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-ssh-key-openstack-cell1\") pod \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\" (UID: \"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd\") " Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.071672 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-kube-api-access-6rbnk" (OuterVolumeSpecName: "kube-api-access-6rbnk") pod "0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd" (UID: "0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd"). InnerVolumeSpecName "kube-api-access-6rbnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.100305 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-inventory" (OuterVolumeSpecName: "inventory") pod "0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd" (UID: "0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.103971 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd" (UID: "0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.170522 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.170883 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rbnk\" (UniqueName: \"kubernetes.io/projected/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-kube-api-access-6rbnk\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.170905 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.382303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-472r9" event={"ID":"0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd","Type":"ContainerDied","Data":"0bdf09c142e0adeb3b67cef1444134715494f815e1305bf56b12021bf12aca23"} Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.382365 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bdf09c142e0adeb3b67cef1444134715494f815e1305bf56b12021bf12aca23" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.382391 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-472r9" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.485468 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-glrg2"] Feb 19 23:38:17 crc kubenswrapper[4771]: E0219 23:38:17.485866 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="extract-utilities" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.485882 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="extract-utilities" Feb 19 23:38:17 crc kubenswrapper[4771]: E0219 23:38:17.485898 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="extract-content" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.485905 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="extract-content" Feb 19 23:38:17 crc kubenswrapper[4771]: E0219 23:38:17.485917 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="registry-server" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.485924 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="registry-server" Feb 19 23:38:17 crc kubenswrapper[4771]: E0219 23:38:17.485943 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd" containerName="configure-os-openstack-openstack-cell1" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.485949 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd" containerName="configure-os-openstack-openstack-cell1" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.486163 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd" containerName="configure-os-openstack-openstack-cell1" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.486177 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ca2f72-087c-4091-b944-ce63d69ac48e" containerName="registry-server" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.486831 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.489307 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.490077 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.490091 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.490368 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.503431 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-glrg2"] Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.580604 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-inventory-0\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.580671 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.580752 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7tbs\" (UniqueName: \"kubernetes.io/projected/933abe26-7272-485a-a57b-f52e8c3c81d7-kube-api-access-n7tbs\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.684112 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7tbs\" (UniqueName: \"kubernetes.io/projected/933abe26-7272-485a-a57b-f52e8c3c81d7-kube-api-access-n7tbs\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.684374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-inventory-0\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.684418 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.700805 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.702895 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-inventory-0\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.704562 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7tbs\" (UniqueName: \"kubernetes.io/projected/933abe26-7272-485a-a57b-f52e8c3c81d7-kube-api-access-n7tbs\") pod \"ssh-known-hosts-openstack-glrg2\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:17 crc kubenswrapper[4771]: I0219 23:38:17.819150 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:18 crc kubenswrapper[4771]: I0219 23:38:18.239533 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-glrg2"] Feb 19 23:38:18 crc kubenswrapper[4771]: I0219 23:38:18.393177 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-glrg2" event={"ID":"933abe26-7272-485a-a57b-f52e8c3c81d7","Type":"ContainerStarted","Data":"c046447c876fd8373db8b1ed0f9ebd4e54d7e8184996a1d2c1e9240a01111465"} Feb 19 23:38:19 crc kubenswrapper[4771]: I0219 23:38:19.405522 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-glrg2" event={"ID":"933abe26-7272-485a-a57b-f52e8c3c81d7","Type":"ContainerStarted","Data":"580e43c466ca45e7fbfba4492305697d5768789c8c2788004bd2b534eb107969"} Feb 19 23:38:19 crc kubenswrapper[4771]: I0219 23:38:19.428772 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-glrg2" podStartSLOduration=1.963395347 podStartE2EDuration="2.428754924s" podCreationTimestamp="2026-02-19 23:38:17 +0000 UTC" firstStartedPulling="2026-02-19 23:38:18.243488187 +0000 UTC m=+7798.514930657" lastFinishedPulling="2026-02-19 23:38:18.708847764 +0000 UTC m=+7798.980290234" observedRunningTime="2026-02-19 23:38:19.427184873 +0000 UTC m=+7799.698627383" watchObservedRunningTime="2026-02-19 23:38:19.428754924 +0000 UTC m=+7799.700197414" Feb 19 23:38:27 crc kubenswrapper[4771]: I0219 23:38:27.509553 4771 generic.go:334] "Generic (PLEG): container finished" podID="933abe26-7272-485a-a57b-f52e8c3c81d7" containerID="580e43c466ca45e7fbfba4492305697d5768789c8c2788004bd2b534eb107969" exitCode=0 Feb 19 23:38:27 crc kubenswrapper[4771]: I0219 23:38:27.509661 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-glrg2" 
event={"ID":"933abe26-7272-485a-a57b-f52e8c3c81d7","Type":"ContainerDied","Data":"580e43c466ca45e7fbfba4492305697d5768789c8c2788004bd2b534eb107969"} Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.101523 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.171922 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7tbs\" (UniqueName: \"kubernetes.io/projected/933abe26-7272-485a-a57b-f52e8c3c81d7-kube-api-access-n7tbs\") pod \"933abe26-7272-485a-a57b-f52e8c3c81d7\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.172000 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-ssh-key-openstack-cell1\") pod \"933abe26-7272-485a-a57b-f52e8c3c81d7\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.172307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-inventory-0\") pod \"933abe26-7272-485a-a57b-f52e8c3c81d7\" (UID: \"933abe26-7272-485a-a57b-f52e8c3c81d7\") " Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.179181 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933abe26-7272-485a-a57b-f52e8c3c81d7-kube-api-access-n7tbs" (OuterVolumeSpecName: "kube-api-access-n7tbs") pod "933abe26-7272-485a-a57b-f52e8c3c81d7" (UID: "933abe26-7272-485a-a57b-f52e8c3c81d7"). InnerVolumeSpecName "kube-api-access-n7tbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.219745 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "933abe26-7272-485a-a57b-f52e8c3c81d7" (UID: "933abe26-7272-485a-a57b-f52e8c3c81d7"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.231130 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "933abe26-7272-485a-a57b-f52e8c3c81d7" (UID: "933abe26-7272-485a-a57b-f52e8c3c81d7"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.275403 4771 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.275455 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7tbs\" (UniqueName: \"kubernetes.io/projected/933abe26-7272-485a-a57b-f52e8c3c81d7-kube-api-access-n7tbs\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.275479 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/933abe26-7272-485a-a57b-f52e8c3c81d7-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.534703 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-glrg2" 
event={"ID":"933abe26-7272-485a-a57b-f52e8c3c81d7","Type":"ContainerDied","Data":"c046447c876fd8373db8b1ed0f9ebd4e54d7e8184996a1d2c1e9240a01111465"} Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.534742 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c046447c876fd8373db8b1ed0f9ebd4e54d7e8184996a1d2c1e9240a01111465" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.534798 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-glrg2" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.667900 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7ppv7"] Feb 19 23:38:29 crc kubenswrapper[4771]: E0219 23:38:29.668821 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933abe26-7272-485a-a57b-f52e8c3c81d7" containerName="ssh-known-hosts-openstack" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.668851 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="933abe26-7272-485a-a57b-f52e8c3c81d7" containerName="ssh-known-hosts-openstack" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.669229 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="933abe26-7272-485a-a57b-f52e8c3c81d7" containerName="ssh-known-hosts-openstack" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.670368 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.672353 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.672776 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.677668 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.677686 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.696136 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7ppv7"] Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.786751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw25h\" (UniqueName: \"kubernetes.io/projected/506ed946-fa87-4e12-bfd9-0beb5708a8e8-kube-api-access-jw25h\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.786874 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-inventory\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.786926 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" 
(UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.889769 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw25h\" (UniqueName: \"kubernetes.io/projected/506ed946-fa87-4e12-bfd9-0beb5708a8e8-kube-api-access-jw25h\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.889899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-inventory\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.889943 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.894477 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.894482 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-inventory\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:29 crc kubenswrapper[4771]: I0219 23:38:29.923499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw25h\" (UniqueName: \"kubernetes.io/projected/506ed946-fa87-4e12-bfd9-0beb5708a8e8-kube-api-access-jw25h\") pod \"run-os-openstack-openstack-cell1-7ppv7\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:30 crc kubenswrapper[4771]: I0219 23:38:30.019504 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:30 crc kubenswrapper[4771]: I0219 23:38:30.637943 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7ppv7"] Feb 19 23:38:31 crc kubenswrapper[4771]: I0219 23:38:31.552724 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" event={"ID":"506ed946-fa87-4e12-bfd9-0beb5708a8e8","Type":"ContainerStarted","Data":"c0a30f39376fdb5c1a10854fbf9b3e72f2685453e2104900fd5c5b951001651a"} Feb 19 23:38:31 crc kubenswrapper[4771]: I0219 23:38:31.553192 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" event={"ID":"506ed946-fa87-4e12-bfd9-0beb5708a8e8","Type":"ContainerStarted","Data":"e982cc10d41b42c9e5af649cfdbe9a271107e3994ba4c4f64adc23ec6e9ab88a"} Feb 19 23:38:31 crc kubenswrapper[4771]: I0219 23:38:31.572982 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" podStartSLOduration=2.072899324 
podStartE2EDuration="2.572965751s" podCreationTimestamp="2026-02-19 23:38:29 +0000 UTC" firstStartedPulling="2026-02-19 23:38:30.635357126 +0000 UTC m=+7810.906799596" lastFinishedPulling="2026-02-19 23:38:31.135423553 +0000 UTC m=+7811.406866023" observedRunningTime="2026-02-19 23:38:31.571241626 +0000 UTC m=+7811.842684106" watchObservedRunningTime="2026-02-19 23:38:31.572965751 +0000 UTC m=+7811.844408231" Feb 19 23:38:39 crc kubenswrapper[4771]: I0219 23:38:39.654513 4771 generic.go:334] "Generic (PLEG): container finished" podID="506ed946-fa87-4e12-bfd9-0beb5708a8e8" containerID="c0a30f39376fdb5c1a10854fbf9b3e72f2685453e2104900fd5c5b951001651a" exitCode=0 Feb 19 23:38:39 crc kubenswrapper[4771]: I0219 23:38:39.654609 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" event={"ID":"506ed946-fa87-4e12-bfd9-0beb5708a8e8","Type":"ContainerDied","Data":"c0a30f39376fdb5c1a10854fbf9b3e72f2685453e2104900fd5c5b951001651a"} Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.230765 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.294483 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-ssh-key-openstack-cell1\") pod \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.294956 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw25h\" (UniqueName: \"kubernetes.io/projected/506ed946-fa87-4e12-bfd9-0beb5708a8e8-kube-api-access-jw25h\") pod \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.295014 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-inventory\") pod \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\" (UID: \"506ed946-fa87-4e12-bfd9-0beb5708a8e8\") " Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.317080 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506ed946-fa87-4e12-bfd9-0beb5708a8e8-kube-api-access-jw25h" (OuterVolumeSpecName: "kube-api-access-jw25h") pod "506ed946-fa87-4e12-bfd9-0beb5708a8e8" (UID: "506ed946-fa87-4e12-bfd9-0beb5708a8e8"). InnerVolumeSpecName "kube-api-access-jw25h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.334978 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-inventory" (OuterVolumeSpecName: "inventory") pod "506ed946-fa87-4e12-bfd9-0beb5708a8e8" (UID: "506ed946-fa87-4e12-bfd9-0beb5708a8e8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.337514 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "506ed946-fa87-4e12-bfd9-0beb5708a8e8" (UID: "506ed946-fa87-4e12-bfd9-0beb5708a8e8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.398257 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.398413 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw25h\" (UniqueName: \"kubernetes.io/projected/506ed946-fa87-4e12-bfd9-0beb5708a8e8-kube-api-access-jw25h\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.398491 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506ed946-fa87-4e12-bfd9-0beb5708a8e8-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.698886 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" event={"ID":"506ed946-fa87-4e12-bfd9-0beb5708a8e8","Type":"ContainerDied","Data":"e982cc10d41b42c9e5af649cfdbe9a271107e3994ba4c4f64adc23ec6e9ab88a"} Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.698984 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e982cc10d41b42c9e5af649cfdbe9a271107e3994ba4c4f64adc23ec6e9ab88a" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.699534 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7ppv7" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.806725 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-ccdp6"] Feb 19 23:38:41 crc kubenswrapper[4771]: E0219 23:38:41.807155 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506ed946-fa87-4e12-bfd9-0beb5708a8e8" containerName="run-os-openstack-openstack-cell1" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.807175 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="506ed946-fa87-4e12-bfd9-0beb5708a8e8" containerName="run-os-openstack-openstack-cell1" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.807468 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="506ed946-fa87-4e12-bfd9-0beb5708a8e8" containerName="run-os-openstack-openstack-cell1" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.808403 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.816492 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.816860 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.817256 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.817500 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.836392 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-ccdp6"] Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.911049 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hnz4\" (UniqueName: \"kubernetes.io/projected/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-kube-api-access-2hnz4\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.911130 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-inventory\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:41 crc kubenswrapper[4771]: I0219 23:38:41.911247 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:42 crc kubenswrapper[4771]: I0219 23:38:42.013180 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:42 crc kubenswrapper[4771]: I0219 23:38:42.013332 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hnz4\" (UniqueName: \"kubernetes.io/projected/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-kube-api-access-2hnz4\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:42 crc kubenswrapper[4771]: I0219 23:38:42.013416 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-inventory\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:42 crc kubenswrapper[4771]: I0219 23:38:42.017983 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-inventory\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:42 crc kubenswrapper[4771]: I0219 
23:38:42.032139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:42 crc kubenswrapper[4771]: I0219 23:38:42.041476 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hnz4\" (UniqueName: \"kubernetes.io/projected/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-kube-api-access-2hnz4\") pod \"reboot-os-openstack-openstack-cell1-ccdp6\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:42 crc kubenswrapper[4771]: I0219 23:38:42.144882 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:38:42 crc kubenswrapper[4771]: I0219 23:38:42.809473 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-ccdp6"] Feb 19 23:38:43 crc kubenswrapper[4771]: I0219 23:38:43.728983 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" event={"ID":"10bed0e5-6103-4b5a-91b3-cf36d3c6b837","Type":"ContainerStarted","Data":"5e169a00a69226959110cc8c79811050e37afbdf8b7bdb43a8fb24d60e748992"} Feb 19 23:38:43 crc kubenswrapper[4771]: I0219 23:38:43.729387 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" event={"ID":"10bed0e5-6103-4b5a-91b3-cf36d3c6b837","Type":"ContainerStarted","Data":"68f3ee6a1d76d132900d2ef69cd8b2753d2d2a2f57fcb82963a3d3c27d74593e"} Feb 19 23:38:43 crc kubenswrapper[4771]: I0219 23:38:43.767853 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" podStartSLOduration=2.3625587550000002 podStartE2EDuration="2.767834386s" podCreationTimestamp="2026-02-19 23:38:41 +0000 UTC" firstStartedPulling="2026-02-19 23:38:42.818669059 +0000 UTC m=+7823.090111539" lastFinishedPulling="2026-02-19 23:38:43.22394469 +0000 UTC m=+7823.495387170" observedRunningTime="2026-02-19 23:38:43.749818604 +0000 UTC m=+7824.021261134" watchObservedRunningTime="2026-02-19 23:38:43.767834386 +0000 UTC m=+7824.039276856" Feb 19 23:38:59 crc kubenswrapper[4771]: I0219 23:38:59.934659 4771 generic.go:334] "Generic (PLEG): container finished" podID="10bed0e5-6103-4b5a-91b3-cf36d3c6b837" containerID="5e169a00a69226959110cc8c79811050e37afbdf8b7bdb43a8fb24d60e748992" exitCode=0 Feb 19 23:38:59 crc kubenswrapper[4771]: I0219 23:38:59.934758 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" event={"ID":"10bed0e5-6103-4b5a-91b3-cf36d3c6b837","Type":"ContainerDied","Data":"5e169a00a69226959110cc8c79811050e37afbdf8b7bdb43a8fb24d60e748992"} Feb 19 23:39:02 crc kubenswrapper[4771]: E0219 23:39:02.537759 4771 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.101s" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.050943 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.156374 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-inventory\") pod \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.156418 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hnz4\" (UniqueName: \"kubernetes.io/projected/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-kube-api-access-2hnz4\") pod \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.156536 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-ssh-key-openstack-cell1\") pod \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\" (UID: \"10bed0e5-6103-4b5a-91b3-cf36d3c6b837\") " Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.167995 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-kube-api-access-2hnz4" (OuterVolumeSpecName: "kube-api-access-2hnz4") pod "10bed0e5-6103-4b5a-91b3-cf36d3c6b837" (UID: "10bed0e5-6103-4b5a-91b3-cf36d3c6b837"). InnerVolumeSpecName "kube-api-access-2hnz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.204964 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "10bed0e5-6103-4b5a-91b3-cf36d3c6b837" (UID: "10bed0e5-6103-4b5a-91b3-cf36d3c6b837"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.208475 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-inventory" (OuterVolumeSpecName: "inventory") pod "10bed0e5-6103-4b5a-91b3-cf36d3c6b837" (UID: "10bed0e5-6103-4b5a-91b3-cf36d3c6b837"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.259543 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.259577 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hnz4\" (UniqueName: \"kubernetes.io/projected/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-kube-api-access-2hnz4\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.259588 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10bed0e5-6103-4b5a-91b3-cf36d3c6b837-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.534457 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" event={"ID":"10bed0e5-6103-4b5a-91b3-cf36d3c6b837","Type":"ContainerDied","Data":"68f3ee6a1d76d132900d2ef69cd8b2753d2d2a2f57fcb82963a3d3c27d74593e"} Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.534512 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f3ee6a1d76d132900d2ef69cd8b2753d2d2a2f57fcb82963a3d3c27d74593e" Feb 19 23:39:03 crc kubenswrapper[4771]: I0219 23:39:03.534907 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-ccdp6" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.179528 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tnl7l"] Feb 19 23:39:04 crc kubenswrapper[4771]: E0219 23:39:04.180044 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bed0e5-6103-4b5a-91b3-cf36d3c6b837" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.180057 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bed0e5-6103-4b5a-91b3-cf36d3c6b837" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.180259 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="10bed0e5-6103-4b5a-91b3-cf36d3c6b837" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.181170 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.183539 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.184354 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.184519 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.184668 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.184748 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.184781 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.184884 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.184995 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.195930 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tnl7l"] Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382481 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382521 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382556 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382580 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq74f\" (UniqueName: 
\"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-kube-api-access-vq74f\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382608 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382690 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382722 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: 
\"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382826 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382878 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc 
kubenswrapper[4771]: I0219 23:39:04.382900 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.382928 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485119 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485221 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485252 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485272 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485316 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485335 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " 
pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485361 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485450 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485475 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 
23:39:04.485501 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485523 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq74f\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-kube-api-access-vq74f\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485543 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.485575 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.491062 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.491655 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.492015 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.492394 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.493554 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: 
\"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.494109 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.494631 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.495098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.495826 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.498901 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.500630 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.508613 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.518176 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.522771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq74f\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-kube-api-access-vq74f\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " 
pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.522955 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tnl7l\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:04 crc kubenswrapper[4771]: I0219 23:39:04.808782 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:05 crc kubenswrapper[4771]: I0219 23:39:05.476702 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tnl7l"] Feb 19 23:39:05 crc kubenswrapper[4771]: I0219 23:39:05.557346 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" event={"ID":"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0","Type":"ContainerStarted","Data":"140d79cd012386cbdbae39e9b8c94bc5522244e943cc8dee160845f55a3b34fd"} Feb 19 23:39:06 crc kubenswrapper[4771]: I0219 23:39:06.570655 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" event={"ID":"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0","Type":"ContainerStarted","Data":"ffe05cf305c9eafb8649c4c82aa5b3cce011c331f71a67e40963e57cd0b549b4"} Feb 19 23:39:06 crc kubenswrapper[4771]: I0219 23:39:06.608192 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" podStartSLOduration=2.170977508 podStartE2EDuration="2.608166377s" podCreationTimestamp="2026-02-19 23:39:04 +0000 UTC" firstStartedPulling="2026-02-19 23:39:05.484533415 +0000 UTC m=+7845.755975895" lastFinishedPulling="2026-02-19 23:39:05.921722284 
+0000 UTC m=+7846.193164764" observedRunningTime="2026-02-19 23:39:06.597162868 +0000 UTC m=+7846.868605398" watchObservedRunningTime="2026-02-19 23:39:06.608166377 +0000 UTC m=+7846.879608877" Feb 19 23:39:45 crc kubenswrapper[4771]: E0219 23:39:45.499696 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca3ac0ad_d049_4272_b7e0_3eb490e7cef0.slice/crio-conmon-ffe05cf305c9eafb8649c4c82aa5b3cce011c331f71a67e40963e57cd0b549b4.scope\": RecentStats: unable to find data in memory cache]" Feb 19 23:39:46 crc kubenswrapper[4771]: I0219 23:39:46.133795 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" containerID="ffe05cf305c9eafb8649c4c82aa5b3cce011c331f71a67e40963e57cd0b549b4" exitCode=0 Feb 19 23:39:46 crc kubenswrapper[4771]: I0219 23:39:46.133915 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" event={"ID":"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0","Type":"ContainerDied","Data":"ffe05cf305c9eafb8649c4c82aa5b3cce011c331f71a67e40963e57cd0b549b4"} Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.686831 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733176 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-nova-combined-ca-bundle\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-libvirt-default-certs-0\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-metadata-combined-ca-bundle\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733398 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-neutron-metadata-default-certs-0\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733445 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-libvirt-combined-ca-bundle\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" 
(UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733494 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-ovn-default-certs-0\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733601 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733699 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ssh-key-openstack-cell1\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733809 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-telemetry-combined-ca-bundle\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733858 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq74f\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-kube-api-access-vq74f\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733922 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-bootstrap-combined-ca-bundle\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.733998 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-sriov-combined-ca-bundle\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.734072 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-dhcp-combined-ca-bundle\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.734109 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ovn-combined-ca-bundle\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.734149 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-telemetry-default-certs-0\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.741491 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.741728 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.745581 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.745908 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.748258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.749004 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.749146 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.749986 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). 
InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.750873 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.751323 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.762598 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.763757 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-kube-api-access-vq74f" (OuterVolumeSpecName: "kube-api-access-vq74f") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "kube-api-access-vq74f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.763925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: E0219 23:39:47.779540 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory podName:ca3ac0ad-d049-4272-b7e0-3eb490e7cef0 nodeName:}" failed. No retries permitted until 2026-02-19 23:39:48.279508745 +0000 UTC m=+7888.550951225 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0") : error deleting /var/lib/kubelet/pods/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0/volume-subpaths: remove /var/lib/kubelet/pods/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0/volume-subpaths: no such file or directory Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.785183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837708 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837743 4771 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837756 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq74f\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-kube-api-access-vq74f\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837768 4771 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837780 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837793 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837805 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837817 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837830 4771 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837844 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837857 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837869 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc kubenswrapper[4771]: I0219 23:39:47.837881 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:47 crc 
kubenswrapper[4771]: I0219 23:39:47.837893 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.157181 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" event={"ID":"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0","Type":"ContainerDied","Data":"140d79cd012386cbdbae39e9b8c94bc5522244e943cc8dee160845f55a3b34fd"} Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.157272 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="140d79cd012386cbdbae39e9b8c94bc5522244e943cc8dee160845f55a3b34fd" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.157297 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tnl7l" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.338544 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gfcl7"] Feb 19 23:39:48 crc kubenswrapper[4771]: E0219 23:39:48.339334 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.339352 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.339559 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.340316 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.343545 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.350336 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory\") pod \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\" (UID: \"ca3ac0ad-d049-4272-b7e0-3eb490e7cef0\") " Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.356320 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory" (OuterVolumeSpecName: "inventory") pod "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0" (UID: "ca3ac0ad-d049-4272-b7e0-3eb490e7cef0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.372084 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gfcl7"] Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.453618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-inventory\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.453670 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " 
pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.453726 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db9jk\" (UniqueName: \"kubernetes.io/projected/ece28209-ff16-4d53-9e40-7c58ae95bd0e-kube-api-access-db9jk\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.453758 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.454084 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.454253 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca3ac0ad-d049-4272-b7e0-3eb490e7cef0-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.555628 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db9jk\" (UniqueName: \"kubernetes.io/projected/ece28209-ff16-4d53-9e40-7c58ae95bd0e-kube-api-access-db9jk\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " 
pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.555714 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.555817 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.555892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-inventory\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.555932 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.556797 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovncontroller-config-0\") pod 
\"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.559761 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.560098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.561748 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-inventory\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.571893 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db9jk\" (UniqueName: \"kubernetes.io/projected/ece28209-ff16-4d53-9e40-7c58ae95bd0e-kube-api-access-db9jk\") pod \"ovn-openstack-openstack-cell1-gfcl7\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:48 crc kubenswrapper[4771]: I0219 23:39:48.732076 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:39:49 crc kubenswrapper[4771]: I0219 23:39:49.341192 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gfcl7"] Feb 19 23:39:49 crc kubenswrapper[4771]: W0219 23:39:49.353775 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece28209_ff16_4d53_9e40_7c58ae95bd0e.slice/crio-d775fdde97cfd56c812e2d3f5abe62da207cfa4cc6ae5301f519e27b0fc2f9d8 WatchSource:0}: Error finding container d775fdde97cfd56c812e2d3f5abe62da207cfa4cc6ae5301f519e27b0fc2f9d8: Status 404 returned error can't find the container with id d775fdde97cfd56c812e2d3f5abe62da207cfa4cc6ae5301f519e27b0fc2f9d8 Feb 19 23:39:50 crc kubenswrapper[4771]: I0219 23:39:50.178634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gfcl7" event={"ID":"ece28209-ff16-4d53-9e40-7c58ae95bd0e","Type":"ContainerStarted","Data":"8ea6acfe1aed24bcef0c3b92a95b8222161beac62d60c471c867d3b1402e101e"} Feb 19 23:39:50 crc kubenswrapper[4771]: I0219 23:39:50.179059 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gfcl7" event={"ID":"ece28209-ff16-4d53-9e40-7c58ae95bd0e","Type":"ContainerStarted","Data":"d775fdde97cfd56c812e2d3f5abe62da207cfa4cc6ae5301f519e27b0fc2f9d8"} Feb 19 23:39:50 crc kubenswrapper[4771]: I0219 23:39:50.203351 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-gfcl7" podStartSLOduration=1.761159765 podStartE2EDuration="2.203338225s" podCreationTimestamp="2026-02-19 23:39:48 +0000 UTC" firstStartedPulling="2026-02-19 23:39:49.363910873 +0000 UTC m=+7889.635353353" lastFinishedPulling="2026-02-19 23:39:49.806089333 +0000 UTC m=+7890.077531813" observedRunningTime="2026-02-19 23:39:50.194550235 +0000 UTC m=+7890.465992705" 
watchObservedRunningTime="2026-02-19 23:39:50.203338225 +0000 UTC m=+7890.474780695" Feb 19 23:40:12 crc kubenswrapper[4771]: I0219 23:40:12.957504 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:40:12 crc kubenswrapper[4771]: I0219 23:40:12.959001 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:40:42 crc kubenswrapper[4771]: I0219 23:40:42.956960 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:40:42 crc kubenswrapper[4771]: I0219 23:40:42.957535 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:40:54 crc kubenswrapper[4771]: I0219 23:40:54.927156 4771 generic.go:334] "Generic (PLEG): container finished" podID="ece28209-ff16-4d53-9e40-7c58ae95bd0e" containerID="8ea6acfe1aed24bcef0c3b92a95b8222161beac62d60c471c867d3b1402e101e" exitCode=0 Feb 19 23:40:54 crc kubenswrapper[4771]: I0219 23:40:54.927720 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-openstack-openstack-cell1-gfcl7" event={"ID":"ece28209-ff16-4d53-9e40-7c58ae95bd0e","Type":"ContainerDied","Data":"8ea6acfe1aed24bcef0c3b92a95b8222161beac62d60c471c867d3b1402e101e"} Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.383996 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.471420 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovncontroller-config-0\") pod \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.471586 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovn-combined-ca-bundle\") pod \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.471659 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db9jk\" (UniqueName: \"kubernetes.io/projected/ece28209-ff16-4d53-9e40-7c58ae95bd0e-kube-api-access-db9jk\") pod \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.471721 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ssh-key-openstack-cell1\") pod \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.471763 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-inventory\") pod \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\" (UID: \"ece28209-ff16-4d53-9e40-7c58ae95bd0e\") " Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.477291 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece28209-ff16-4d53-9e40-7c58ae95bd0e-kube-api-access-db9jk" (OuterVolumeSpecName: "kube-api-access-db9jk") pod "ece28209-ff16-4d53-9e40-7c58ae95bd0e" (UID: "ece28209-ff16-4d53-9e40-7c58ae95bd0e"). InnerVolumeSpecName "kube-api-access-db9jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.477410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ece28209-ff16-4d53-9e40-7c58ae95bd0e" (UID: "ece28209-ff16-4d53-9e40-7c58ae95bd0e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.502013 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ece28209-ff16-4d53-9e40-7c58ae95bd0e" (UID: "ece28209-ff16-4d53-9e40-7c58ae95bd0e"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.506573 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-inventory" (OuterVolumeSpecName: "inventory") pod "ece28209-ff16-4d53-9e40-7c58ae95bd0e" (UID: "ece28209-ff16-4d53-9e40-7c58ae95bd0e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.506598 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ece28209-ff16-4d53-9e40-7c58ae95bd0e" (UID: "ece28209-ff16-4d53-9e40-7c58ae95bd0e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.574928 4771 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.574958 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.574968 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db9jk\" (UniqueName: \"kubernetes.io/projected/ece28209-ff16-4d53-9e40-7c58ae95bd0e-kube-api-access-db9jk\") on node \"crc\" DevicePath \"\"" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.574976 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.574986 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ece28209-ff16-4d53-9e40-7c58ae95bd0e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.956687 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gfcl7" event={"ID":"ece28209-ff16-4d53-9e40-7c58ae95bd0e","Type":"ContainerDied","Data":"d775fdde97cfd56c812e2d3f5abe62da207cfa4cc6ae5301f519e27b0fc2f9d8"} Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.957129 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d775fdde97cfd56c812e2d3f5abe62da207cfa4cc6ae5301f519e27b0fc2f9d8" Feb 19 23:40:56 crc kubenswrapper[4771]: I0219 23:40:56.956780 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gfcl7" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.126711 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jm7b8"] Feb 19 23:40:57 crc kubenswrapper[4771]: E0219 23:40:57.127196 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece28209-ff16-4d53-9e40-7c58ae95bd0e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.127212 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece28209-ff16-4d53-9e40-7c58ae95bd0e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.127389 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece28209-ff16-4d53-9e40-7c58ae95bd0e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.128198 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.130859 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.131085 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.131227 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.131454 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.131591 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.132993 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.141160 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jm7b8"] Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.188700 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.188772 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4j9zp\" (UniqueName: \"kubernetes.io/projected/d888a078-5730-47a9-a40c-1cf3bdb948b0-kube-api-access-4j9zp\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.188925 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.188982 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.189010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.190303 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-inventory\") pod 
\"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.291765 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.291879 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j9zp\" (UniqueName: \"kubernetes.io/projected/d888a078-5730-47a9-a40c-1cf3bdb948b0-kube-api-access-4j9zp\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.291996 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.292045 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" 
Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.292068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.292098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.298586 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.303931 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.312612 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.320728 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.330586 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.330846 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j9zp\" (UniqueName: \"kubernetes.io/projected/d888a078-5730-47a9-a40c-1cf3bdb948b0-kube-api-access-4j9zp\") pod \"neutron-metadata-openstack-openstack-cell1-jm7b8\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:57 crc kubenswrapper[4771]: I0219 23:40:57.460527 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:40:58 crc kubenswrapper[4771]: I0219 23:40:58.070954 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jm7b8"] Feb 19 23:40:58 crc kubenswrapper[4771]: I0219 23:40:58.082463 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:40:58 crc kubenswrapper[4771]: I0219 23:40:58.980151 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" event={"ID":"d888a078-5730-47a9-a40c-1cf3bdb948b0","Type":"ContainerStarted","Data":"fdcf7d6ded0507e8e6c6fd8e09cf0ad58b2a2694acd6ce89358446b804c28ac0"} Feb 19 23:40:58 crc kubenswrapper[4771]: I0219 23:40:58.980690 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" event={"ID":"d888a078-5730-47a9-a40c-1cf3bdb948b0","Type":"ContainerStarted","Data":"d97faa9f397d1ba2012b4ecb3254ddf3c2547b48bb884a8bb8f5e29f63d02deb"} Feb 19 23:40:59 crc kubenswrapper[4771]: I0219 23:40:59.005069 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" podStartSLOduration=1.446383869 podStartE2EDuration="2.005053542s" podCreationTimestamp="2026-02-19 23:40:57 +0000 UTC" firstStartedPulling="2026-02-19 23:40:58.082145452 +0000 UTC m=+7958.353587922" lastFinishedPulling="2026-02-19 23:40:58.640815125 +0000 UTC m=+7958.912257595" observedRunningTime="2026-02-19 23:40:58.99885821 +0000 UTC m=+7959.270300680" watchObservedRunningTime="2026-02-19 23:40:59.005053542 +0000 UTC m=+7959.276496012" Feb 19 23:41:12 crc kubenswrapper[4771]: I0219 23:41:12.956824 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:41:12 crc kubenswrapper[4771]: I0219 23:41:12.957471 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:41:12 crc kubenswrapper[4771]: I0219 23:41:12.957541 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 23:41:12 crc kubenswrapper[4771]: I0219 23:41:12.958784 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3fc627c9a39ad20456e6e2249cacc1ec01a4f2327e4e1367ba967c2507360bc6"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:41:12 crc kubenswrapper[4771]: I0219 23:41:12.958888 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://3fc627c9a39ad20456e6e2249cacc1ec01a4f2327e4e1367ba967c2507360bc6" gracePeriod=600 Feb 19 23:41:13 crc kubenswrapper[4771]: I0219 23:41:13.139324 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="3fc627c9a39ad20456e6e2249cacc1ec01a4f2327e4e1367ba967c2507360bc6" exitCode=0 Feb 19 23:41:13 crc kubenswrapper[4771]: I0219 23:41:13.139377 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"3fc627c9a39ad20456e6e2249cacc1ec01a4f2327e4e1367ba967c2507360bc6"} Feb 19 23:41:13 crc kubenswrapper[4771]: I0219 23:41:13.139683 4771 scope.go:117] "RemoveContainer" containerID="8a170d4d5eaf60aa6dcf94aca6de25053d7ac2ab32f8a403516a48a5305a4ef2" Feb 19 23:41:14 crc kubenswrapper[4771]: I0219 23:41:14.153892 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"} Feb 19 23:41:53 crc kubenswrapper[4771]: I0219 23:41:53.629352 4771 generic.go:334] "Generic (PLEG): container finished" podID="d888a078-5730-47a9-a40c-1cf3bdb948b0" containerID="fdcf7d6ded0507e8e6c6fd8e09cf0ad58b2a2694acd6ce89358446b804c28ac0" exitCode=0 Feb 19 23:41:53 crc kubenswrapper[4771]: I0219 23:41:53.629560 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" event={"ID":"d888a078-5730-47a9-a40c-1cf3bdb948b0","Type":"ContainerDied","Data":"fdcf7d6ded0507e8e6c6fd8e09cf0ad58b2a2694acd6ce89358446b804c28ac0"} Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.149764 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.282772 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-metadata-combined-ca-bundle\") pod \"d888a078-5730-47a9-a40c-1cf3bdb948b0\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.282856 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-inventory\") pod \"d888a078-5730-47a9-a40c-1cf3bdb948b0\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.282932 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-ssh-key-openstack-cell1\") pod \"d888a078-5730-47a9-a40c-1cf3bdb948b0\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.283001 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j9zp\" (UniqueName: \"kubernetes.io/projected/d888a078-5730-47a9-a40c-1cf3bdb948b0-kube-api-access-4j9zp\") pod \"d888a078-5730-47a9-a40c-1cf3bdb948b0\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.283823 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"d888a078-5730-47a9-a40c-1cf3bdb948b0\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " Feb 19 23:41:55 crc 
kubenswrapper[4771]: I0219 23:41:55.283964 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-nova-metadata-neutron-config-0\") pod \"d888a078-5730-47a9-a40c-1cf3bdb948b0\" (UID: \"d888a078-5730-47a9-a40c-1cf3bdb948b0\") " Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.291226 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d888a078-5730-47a9-a40c-1cf3bdb948b0-kube-api-access-4j9zp" (OuterVolumeSpecName: "kube-api-access-4j9zp") pod "d888a078-5730-47a9-a40c-1cf3bdb948b0" (UID: "d888a078-5730-47a9-a40c-1cf3bdb948b0"). InnerVolumeSpecName "kube-api-access-4j9zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.318784 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d888a078-5730-47a9-a40c-1cf3bdb948b0" (UID: "d888a078-5730-47a9-a40c-1cf3bdb948b0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.322787 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d888a078-5730-47a9-a40c-1cf3bdb948b0" (UID: "d888a078-5730-47a9-a40c-1cf3bdb948b0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.323705 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-inventory" (OuterVolumeSpecName: "inventory") pod "d888a078-5730-47a9-a40c-1cf3bdb948b0" (UID: "d888a078-5730-47a9-a40c-1cf3bdb948b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.334160 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d888a078-5730-47a9-a40c-1cf3bdb948b0" (UID: "d888a078-5730-47a9-a40c-1cf3bdb948b0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.350818 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d888a078-5730-47a9-a40c-1cf3bdb948b0" (UID: "d888a078-5730-47a9-a40c-1cf3bdb948b0"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.387196 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.387236 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.387253 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.387268 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.387280 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d888a078-5730-47a9-a40c-1cf3bdb948b0-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.387294 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j9zp\" (UniqueName: \"kubernetes.io/projected/d888a078-5730-47a9-a40c-1cf3bdb948b0-kube-api-access-4j9zp\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.651337 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" 
event={"ID":"d888a078-5730-47a9-a40c-1cf3bdb948b0","Type":"ContainerDied","Data":"d97faa9f397d1ba2012b4ecb3254ddf3c2547b48bb884a8bb8f5e29f63d02deb"} Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.651673 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d97faa9f397d1ba2012b4ecb3254ddf3c2547b48bb884a8bb8f5e29f63d02deb" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.651418 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jm7b8" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.786850 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4qbx5"] Feb 19 23:41:55 crc kubenswrapper[4771]: E0219 23:41:55.787671 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d888a078-5730-47a9-a40c-1cf3bdb948b0" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.787690 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d888a078-5730-47a9-a40c-1cf3bdb948b0" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.787930 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d888a078-5730-47a9-a40c-1cf3bdb948b0" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.788747 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.792491 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.794254 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.794251 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.794579 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.794713 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.802303 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4qbx5"] Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.900774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-inventory\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.900974 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" Feb 19 23:41:55 crc 
kubenswrapper[4771]: I0219 23:41:55.901182 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.902143 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdd5t\" (UniqueName: \"kubernetes.io/projected/7f7c2323-0701-4459-b78c-1e92739da106-kube-api-access-fdd5t\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:55 crc kubenswrapper[4771]: I0219 23:41:55.902229 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.005114 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-inventory\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.005270 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.005484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.005544 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd5t\" (UniqueName: \"kubernetes.io/projected/7f7c2323-0701-4459-b78c-1e92739da106-kube-api-access-fdd5t\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.005579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.012008 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.015859 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.015957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-inventory\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.016178 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.023215 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdd5t\" (UniqueName: \"kubernetes.io/projected/7f7c2323-0701-4459-b78c-1e92739da106-kube-api-access-fdd5t\") pod \"libvirt-openstack-openstack-cell1-4qbx5\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.155967 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4qbx5"
Feb 19 23:41:56 crc kubenswrapper[4771]: I0219 23:41:56.879864 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4qbx5"]
Feb 19 23:41:57 crc kubenswrapper[4771]: I0219 23:41:57.687094 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" event={"ID":"7f7c2323-0701-4459-b78c-1e92739da106","Type":"ContainerStarted","Data":"82e7c1537300872a7ee877adec47587e3095940f91c76525fc36321248313817"}
Feb 19 23:41:57 crc kubenswrapper[4771]: I0219 23:41:57.687553 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" event={"ID":"7f7c2323-0701-4459-b78c-1e92739da106","Type":"ContainerStarted","Data":"14ffbf008b6b9f36975ff416257675e8bd670471244747fb8fdf557620c16b79"}
Feb 19 23:41:57 crc kubenswrapper[4771]: I0219 23:41:57.714174 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" podStartSLOduration=2.273748842 podStartE2EDuration="2.714160676s" podCreationTimestamp="2026-02-19 23:41:55 +0000 UTC" firstStartedPulling="2026-02-19 23:41:56.916398066 +0000 UTC m=+8017.187840536" lastFinishedPulling="2026-02-19 23:41:57.3568099 +0000 UTC m=+8017.628252370" observedRunningTime="2026-02-19 23:41:57.707621455 +0000 UTC m=+8017.979063925" watchObservedRunningTime="2026-02-19 23:41:57.714160676 +0000 UTC m=+8017.985603146"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.583780 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wlfqh"]
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.589636 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.626987 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wlfqh"]
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.656574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvsrd\" (UniqueName: \"kubernetes.io/projected/b3227292-55fd-44a0-a510-ca18536226b6-kube-api-access-dvsrd\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.656672 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-catalog-content\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.656875 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-utilities\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.759295 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-catalog-content\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.759436 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-utilities\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.759601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvsrd\" (UniqueName: \"kubernetes.io/projected/b3227292-55fd-44a0-a510-ca18536226b6-kube-api-access-dvsrd\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.760184 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-utilities\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.760238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-catalog-content\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.790390 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvsrd\" (UniqueName: \"kubernetes.io/projected/b3227292-55fd-44a0-a510-ca18536226b6-kube-api-access-dvsrd\") pod \"community-operators-wlfqh\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") " pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:43 crc kubenswrapper[4771]: I0219 23:42:43.921991 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:44 crc kubenswrapper[4771]: I0219 23:42:44.510359 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wlfqh"]
Feb 19 23:42:44 crc kubenswrapper[4771]: I0219 23:42:44.769294 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlfqh" event={"ID":"b3227292-55fd-44a0-a510-ca18536226b6","Type":"ContainerStarted","Data":"3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8"}
Feb 19 23:42:44 crc kubenswrapper[4771]: I0219 23:42:44.769593 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlfqh" event={"ID":"b3227292-55fd-44a0-a510-ca18536226b6","Type":"ContainerStarted","Data":"921fb5ce8eafca2c62bbb30b5156a5da99bde54f01dd5bc581ac49298c2bba0e"}
Feb 19 23:42:45 crc kubenswrapper[4771]: I0219 23:42:45.801272 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3227292-55fd-44a0-a510-ca18536226b6" containerID="3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8" exitCode=0
Feb 19 23:42:45 crc kubenswrapper[4771]: I0219 23:42:45.801355 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlfqh" event={"ID":"b3227292-55fd-44a0-a510-ca18536226b6","Type":"ContainerDied","Data":"3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8"}
Feb 19 23:42:46 crc kubenswrapper[4771]: I0219 23:42:46.815910 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlfqh" event={"ID":"b3227292-55fd-44a0-a510-ca18536226b6","Type":"ContainerStarted","Data":"a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac"}
Feb 19 23:42:48 crc kubenswrapper[4771]: I0219 23:42:48.843485 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3227292-55fd-44a0-a510-ca18536226b6" containerID="a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac" exitCode=0
Feb 19 23:42:48 crc kubenswrapper[4771]: I0219 23:42:48.843597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlfqh" event={"ID":"b3227292-55fd-44a0-a510-ca18536226b6","Type":"ContainerDied","Data":"a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac"}
Feb 19 23:42:49 crc kubenswrapper[4771]: I0219 23:42:49.863394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlfqh" event={"ID":"b3227292-55fd-44a0-a510-ca18536226b6","Type":"ContainerStarted","Data":"8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87"}
Feb 19 23:42:49 crc kubenswrapper[4771]: I0219 23:42:49.886260 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wlfqh" podStartSLOduration=3.433965193 podStartE2EDuration="6.886233109s" podCreationTimestamp="2026-02-19 23:42:43 +0000 UTC" firstStartedPulling="2026-02-19 23:42:45.80586008 +0000 UTC m=+8066.077302590" lastFinishedPulling="2026-02-19 23:42:49.258127976 +0000 UTC m=+8069.529570506" observedRunningTime="2026-02-19 23:42:49.88319553 +0000 UTC m=+8070.154638090" watchObservedRunningTime="2026-02-19 23:42:49.886233109 +0000 UTC m=+8070.157675609"
Feb 19 23:42:53 crc kubenswrapper[4771]: I0219 23:42:53.922731 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:53 crc kubenswrapper[4771]: I0219 23:42:53.924102 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:42:54 crc kubenswrapper[4771]: I0219 23:42:54.978737 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wlfqh" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:42:54 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:42:54 crc kubenswrapper[4771]: >
Feb 19 23:43:04 crc kubenswrapper[4771]: I0219 23:43:04.010796 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:43:04 crc kubenswrapper[4771]: I0219 23:43:04.106631 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:43:04 crc kubenswrapper[4771]: I0219 23:43:04.279284 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wlfqh"]
Feb 19 23:43:05 crc kubenswrapper[4771]: I0219 23:43:05.178692 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wlfqh" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="registry-server" containerID="cri-o://8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87" gracePeriod=2
Feb 19 23:43:05 crc kubenswrapper[4771]: I0219 23:43:05.859159 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:43:05 crc kubenswrapper[4771]: I0219 23:43:05.954360 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-utilities\") pod \"b3227292-55fd-44a0-a510-ca18536226b6\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") "
Feb 19 23:43:05 crc kubenswrapper[4771]: I0219 23:43:05.955313 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-utilities" (OuterVolumeSpecName: "utilities") pod "b3227292-55fd-44a0-a510-ca18536226b6" (UID: "b3227292-55fd-44a0-a510-ca18536226b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:43:05 crc kubenswrapper[4771]: I0219 23:43:05.955729 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-catalog-content\") pod \"b3227292-55fd-44a0-a510-ca18536226b6\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") "
Feb 19 23:43:05 crc kubenswrapper[4771]: I0219 23:43:05.955904 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvsrd\" (UniqueName: \"kubernetes.io/projected/b3227292-55fd-44a0-a510-ca18536226b6-kube-api-access-dvsrd\") pod \"b3227292-55fd-44a0-a510-ca18536226b6\" (UID: \"b3227292-55fd-44a0-a510-ca18536226b6\") "
Feb 19 23:43:05 crc kubenswrapper[4771]: I0219 23:43:05.958337 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:43:05 crc kubenswrapper[4771]: I0219 23:43:05.962273 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3227292-55fd-44a0-a510-ca18536226b6-kube-api-access-dvsrd" (OuterVolumeSpecName: "kube-api-access-dvsrd") pod "b3227292-55fd-44a0-a510-ca18536226b6" (UID: "b3227292-55fd-44a0-a510-ca18536226b6"). InnerVolumeSpecName "kube-api-access-dvsrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.020551 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3227292-55fd-44a0-a510-ca18536226b6" (UID: "b3227292-55fd-44a0-a510-ca18536226b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.060912 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3227292-55fd-44a0-a510-ca18536226b6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.060972 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvsrd\" (UniqueName: \"kubernetes.io/projected/b3227292-55fd-44a0-a510-ca18536226b6-kube-api-access-dvsrd\") on node \"crc\" DevicePath \"\""
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.195122 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3227292-55fd-44a0-a510-ca18536226b6" containerID="8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87" exitCode=0
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.195192 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlfqh" event={"ID":"b3227292-55fd-44a0-a510-ca18536226b6","Type":"ContainerDied","Data":"8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87"}
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.195240 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlfqh"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.195247 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlfqh" event={"ID":"b3227292-55fd-44a0-a510-ca18536226b6","Type":"ContainerDied","Data":"921fb5ce8eafca2c62bbb30b5156a5da99bde54f01dd5bc581ac49298c2bba0e"}
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.195267 4771 scope.go:117] "RemoveContainer" containerID="8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.237456 4771 scope.go:117] "RemoveContainer" containerID="a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.268582 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wlfqh"]
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.276880 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wlfqh"]
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.288391 4771 scope.go:117] "RemoveContainer" containerID="3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.331993 4771 scope.go:117] "RemoveContainer" containerID="8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87"
Feb 19 23:43:06 crc kubenswrapper[4771]: E0219 23:43:06.332437 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87\": container with ID starting with 8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87 not found: ID does not exist" containerID="8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.332474 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87"} err="failed to get container status \"8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87\": rpc error: code = NotFound desc = could not find container \"8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87\": container with ID starting with 8e091c33a3508e8fcc5927b6196516070818ff52d00d4fe6c8ad07c1f979cf87 not found: ID does not exist"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.332499 4771 scope.go:117] "RemoveContainer" containerID="a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac"
Feb 19 23:43:06 crc kubenswrapper[4771]: E0219 23:43:06.332823 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac\": container with ID starting with a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac not found: ID does not exist" containerID="a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.332844 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac"} err="failed to get container status \"a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac\": rpc error: code = NotFound desc = could not find container \"a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac\": container with ID starting with a7b9ef12c59268fc6e2a5d9ed7e26865be42a92c11f7b30823eff6c2409d06ac not found: ID does not exist"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.332857 4771 scope.go:117] "RemoveContainer" containerID="3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8"
Feb 19 23:43:06 crc kubenswrapper[4771]: E0219 23:43:06.333412 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8\": container with ID starting with 3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8 not found: ID does not exist" containerID="3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.333458 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8"} err="failed to get container status \"3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8\": rpc error: code = NotFound desc = could not find container \"3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8\": container with ID starting with 3d11af0b896b7b21139dd4f0a1f55d2641513065a2d9930975c8f392861ba8b8 not found: ID does not exist"
Feb 19 23:43:06 crc kubenswrapper[4771]: I0219 23:43:06.450705 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3227292-55fd-44a0-a510-ca18536226b6" path="/var/lib/kubelet/pods/b3227292-55fd-44a0-a510-ca18536226b6/volumes"
Feb 19 23:43:42 crc kubenswrapper[4771]: I0219 23:43:42.957515 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:43:42 crc kubenswrapper[4771]: I0219 23:43:42.958557 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:44:12 crc kubenswrapper[4771]: I0219 23:44:12.957055 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:44:12 crc kubenswrapper[4771]: I0219 23:44:12.957467 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:44:42 crc kubenswrapper[4771]: I0219 23:44:42.957487 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:44:42 crc kubenswrapper[4771]: I0219 23:44:42.958153 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:44:42 crc kubenswrapper[4771]: I0219 23:44:42.958207 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb"
Feb 19 23:44:42 crc kubenswrapper[4771]: I0219 23:44:42.959166 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 23:44:42 crc kubenswrapper[4771]: I0219 23:44:42.959236 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" gracePeriod=600
Feb 19 23:44:43 crc kubenswrapper[4771]: E0219 23:44:43.090226 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:44:43 crc kubenswrapper[4771]: I0219 23:44:43.509525 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" exitCode=0
Feb 19 23:44:43 crc kubenswrapper[4771]: I0219 23:44:43.509584 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"}
Feb 19 23:44:43 crc kubenswrapper[4771]: I0219 23:44:43.509663 4771 scope.go:117] "RemoveContainer" containerID="3fc627c9a39ad20456e6e2249cacc1ec01a4f2327e4e1367ba967c2507360bc6"
Feb 19 23:44:43 crc kubenswrapper[4771]: I0219 23:44:43.510664 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"
Feb 19 23:44:43 crc kubenswrapper[4771]: E0219 23:44:43.511218 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:44:54 crc kubenswrapper[4771]: I0219 23:44:54.444412 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"
Feb 19 23:44:54 crc kubenswrapper[4771]: E0219 23:44:54.445883 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.211808 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"]
Feb 19 23:45:00 crc kubenswrapper[4771]: E0219 23:45:00.213447 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="registry-server"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.213471 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="registry-server"
Feb 19 23:45:00 crc kubenswrapper[4771]: E0219 23:45:00.213518 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="extract-content"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.213532 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="extract-content"
Feb 19 23:45:00 crc kubenswrapper[4771]: E0219 23:45:00.213617 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="extract-utilities"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.213631 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="extract-utilities"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.214410 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3227292-55fd-44a0-a510-ca18536226b6" containerName="registry-server"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.217096 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.221576 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.223044 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.236618 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"]
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.390227 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5578d2-0f89-46ed-997e-4c3d93d018db-secret-volume\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.390517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5578d2-0f89-46ed-997e-4c3d93d018db-config-volume\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.390848 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwwg\" (UniqueName: \"kubernetes.io/projected/3a5578d2-0f89-46ed-997e-4c3d93d018db-kube-api-access-8nwwg\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.494369 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5578d2-0f89-46ed-997e-4c3d93d018db-config-volume\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.494492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwwg\" (UniqueName: \"kubernetes.io/projected/3a5578d2-0f89-46ed-997e-4c3d93d018db-kube-api-access-8nwwg\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.494758 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5578d2-0f89-46ed-997e-4c3d93d018db-secret-volume\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.495878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5578d2-0f89-46ed-997e-4c3d93d018db-config-volume\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.509579 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5578d2-0f89-46ed-997e-4c3d93d018db-secret-volume\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.516537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwwg\" (UniqueName: \"kubernetes.io/projected/3a5578d2-0f89-46ed-997e-4c3d93d018db-kube-api-access-8nwwg\") pod \"collect-profiles-29525745-nwwkr\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:00 crc kubenswrapper[4771]: I0219 23:45:00.563504 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"
Feb 19 23:45:01 crc kubenswrapper[4771]: I0219 23:45:01.082990 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr"]
Feb 19 23:45:01 crc kubenswrapper[4771]: I0219 23:45:01.720473 4771 generic.go:334] "Generic (PLEG): container finished" podID="3a5578d2-0f89-46ed-997e-4c3d93d018db" containerID="9e22d286d6ab21c9868db3555d220ccdc9ac815fb352ebe22be215f2ddf2dff8" exitCode=0
Feb 19 23:45:01 crc kubenswrapper[4771]: I0219 23:45:01.720551 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr" event={"ID":"3a5578d2-0f89-46ed-997e-4c3d93d018db","Type":"ContainerDied","Data":"9e22d286d6ab21c9868db3555d220ccdc9ac815fb352ebe22be215f2ddf2dff8"}
Feb 19 23:45:01 crc kubenswrapper[4771]: I0219 23:45:01.720883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr" event={"ID":"3a5578d2-0f89-46ed-997e-4c3d93d018db","Type":"ContainerStarted","Data":"21fb59541cc401a97015f49dd92b810e20abe99d4343d21bf56bbdbd21f0179d"}
Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.157335 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr" Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.261117 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nwwg\" (UniqueName: \"kubernetes.io/projected/3a5578d2-0f89-46ed-997e-4c3d93d018db-kube-api-access-8nwwg\") pod \"3a5578d2-0f89-46ed-997e-4c3d93d018db\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.261354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5578d2-0f89-46ed-997e-4c3d93d018db-secret-volume\") pod \"3a5578d2-0f89-46ed-997e-4c3d93d018db\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.261439 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5578d2-0f89-46ed-997e-4c3d93d018db-config-volume\") pod \"3a5578d2-0f89-46ed-997e-4c3d93d018db\" (UID: \"3a5578d2-0f89-46ed-997e-4c3d93d018db\") " Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.262315 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a5578d2-0f89-46ed-997e-4c3d93d018db-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a5578d2-0f89-46ed-997e-4c3d93d018db" (UID: "3a5578d2-0f89-46ed-997e-4c3d93d018db"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.262998 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a5578d2-0f89-46ed-997e-4c3d93d018db-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.266844 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5578d2-0f89-46ed-997e-4c3d93d018db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a5578d2-0f89-46ed-997e-4c3d93d018db" (UID: "3a5578d2-0f89-46ed-997e-4c3d93d018db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.268333 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5578d2-0f89-46ed-997e-4c3d93d018db-kube-api-access-8nwwg" (OuterVolumeSpecName: "kube-api-access-8nwwg") pod "3a5578d2-0f89-46ed-997e-4c3d93d018db" (UID: "3a5578d2-0f89-46ed-997e-4c3d93d018db"). InnerVolumeSpecName "kube-api-access-8nwwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.365764 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a5578d2-0f89-46ed-997e-4c3d93d018db-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.365811 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nwwg\" (UniqueName: \"kubernetes.io/projected/3a5578d2-0f89-46ed-997e-4c3d93d018db-kube-api-access-8nwwg\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.752791 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr" event={"ID":"3a5578d2-0f89-46ed-997e-4c3d93d018db","Type":"ContainerDied","Data":"21fb59541cc401a97015f49dd92b810e20abe99d4343d21bf56bbdbd21f0179d"} Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.752848 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21fb59541cc401a97015f49dd92b810e20abe99d4343d21bf56bbdbd21f0179d" Feb 19 23:45:03 crc kubenswrapper[4771]: I0219 23:45:03.752895 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-nwwkr" Feb 19 23:45:04 crc kubenswrapper[4771]: I0219 23:45:04.257951 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"] Feb 19 23:45:04 crc kubenswrapper[4771]: I0219 23:45:04.274303 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-rr5wm"] Feb 19 23:45:04 crc kubenswrapper[4771]: I0219 23:45:04.453728 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1226276-1244-42cc-8f6b-8fcf393451ab" path="/var/lib/kubelet/pods/e1226276-1244-42cc-8f6b-8fcf393451ab/volumes" Feb 19 23:45:07 crc kubenswrapper[4771]: I0219 23:45:07.437516 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:45:07 crc kubenswrapper[4771]: E0219 23:45:07.438365 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:45:21 crc kubenswrapper[4771]: I0219 23:45:21.438754 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:45:21 crc kubenswrapper[4771]: E0219 23:45:21.439676 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:45:36 crc kubenswrapper[4771]: I0219 23:45:36.439361 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:45:36 crc kubenswrapper[4771]: E0219 23:45:36.440258 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:45:46 crc kubenswrapper[4771]: I0219 23:45:46.485381 4771 scope.go:117] "RemoveContainer" containerID="d47b98d930e2a03a9f2974f41dd40d0fc4e5cf703e829ee8c2222528e0ff75d7" Feb 19 23:45:50 crc kubenswrapper[4771]: I0219 23:45:50.472450 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:45:50 crc kubenswrapper[4771]: E0219 23:45:50.474061 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:46:01 crc kubenswrapper[4771]: I0219 23:46:01.438091 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:46:01 crc kubenswrapper[4771]: E0219 23:46:01.439503 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.174400 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-szzhm"] Feb 19 23:46:09 crc kubenswrapper[4771]: E0219 23:46:09.175194 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5578d2-0f89-46ed-997e-4c3d93d018db" containerName="collect-profiles" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.175206 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5578d2-0f89-46ed-997e-4c3d93d018db" containerName="collect-profiles" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.175419 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5578d2-0f89-46ed-997e-4c3d93d018db" containerName="collect-profiles" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.176860 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.205587 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szzhm"] Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.275561 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-catalog-content\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.275670 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2lmr\" (UniqueName: \"kubernetes.io/projected/5eadce87-b7dd-4096-a762-bb3a693a8b7f-kube-api-access-l2lmr\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.275749 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-utilities\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.378881 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-catalog-content\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.379144 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l2lmr\" (UniqueName: \"kubernetes.io/projected/5eadce87-b7dd-4096-a762-bb3a693a8b7f-kube-api-access-l2lmr\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.379369 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-utilities\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.379404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-catalog-content\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.379842 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-utilities\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.397780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2lmr\" (UniqueName: \"kubernetes.io/projected/5eadce87-b7dd-4096-a762-bb3a693a8b7f-kube-api-access-l2lmr\") pod \"redhat-marketplace-szzhm\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:09 crc kubenswrapper[4771]: I0219 23:46:09.540616 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:10 crc kubenswrapper[4771]: I0219 23:46:10.048502 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szzhm"] Feb 19 23:46:10 crc kubenswrapper[4771]: I0219 23:46:10.563605 4771 generic.go:334] "Generic (PLEG): container finished" podID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerID="ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28" exitCode=0 Feb 19 23:46:10 crc kubenswrapper[4771]: I0219 23:46:10.563657 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szzhm" event={"ID":"5eadce87-b7dd-4096-a762-bb3a693a8b7f","Type":"ContainerDied","Data":"ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28"} Feb 19 23:46:10 crc kubenswrapper[4771]: I0219 23:46:10.563690 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szzhm" event={"ID":"5eadce87-b7dd-4096-a762-bb3a693a8b7f","Type":"ContainerStarted","Data":"78b0e3b3c1cc8cbde0e6ae02baae15add3e9de3290c09582cbcf6d5c853cc7b8"} Feb 19 23:46:10 crc kubenswrapper[4771]: I0219 23:46:10.566949 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:46:11 crc kubenswrapper[4771]: I0219 23:46:11.573918 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szzhm" event={"ID":"5eadce87-b7dd-4096-a762-bb3a693a8b7f","Type":"ContainerStarted","Data":"f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c"} Feb 19 23:46:12 crc kubenswrapper[4771]: I0219 23:46:12.585754 4771 generic.go:334] "Generic (PLEG): container finished" podID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerID="f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c" exitCode=0 Feb 19 23:46:12 crc kubenswrapper[4771]: I0219 23:46:12.585862 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-szzhm" event={"ID":"5eadce87-b7dd-4096-a762-bb3a693a8b7f","Type":"ContainerDied","Data":"f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c"} Feb 19 23:46:13 crc kubenswrapper[4771]: I0219 23:46:13.437734 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:46:13 crc kubenswrapper[4771]: E0219 23:46:13.438367 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:46:13 crc kubenswrapper[4771]: I0219 23:46:13.598270 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szzhm" event={"ID":"5eadce87-b7dd-4096-a762-bb3a693a8b7f","Type":"ContainerStarted","Data":"0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b"} Feb 19 23:46:13 crc kubenswrapper[4771]: I0219 23:46:13.625478 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-szzhm" podStartSLOduration=2.230776857 podStartE2EDuration="4.62545929s" podCreationTimestamp="2026-02-19 23:46:09 +0000 UTC" firstStartedPulling="2026-02-19 23:46:10.565852927 +0000 UTC m=+8270.837295407" lastFinishedPulling="2026-02-19 23:46:12.96053536 +0000 UTC m=+8273.231977840" observedRunningTime="2026-02-19 23:46:13.621254478 +0000 UTC m=+8273.892697048" watchObservedRunningTime="2026-02-19 23:46:13.62545929 +0000 UTC m=+8273.896901770" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.159859 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-q7t4s"] Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.163234 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.184234 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7t4s"] Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.262256 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-utilities\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") " pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.262338 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-catalog-content\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") " pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.262396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv6x4\" (UniqueName: \"kubernetes.io/projected/51650a4d-a8b3-4880-b39c-baa6181b3f4e-kube-api-access-wv6x4\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") " pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.364867 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-utilities\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") 
" pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.364994 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-catalog-content\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") " pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.365089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv6x4\" (UniqueName: \"kubernetes.io/projected/51650a4d-a8b3-4880-b39c-baa6181b3f4e-kube-api-access-wv6x4\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") " pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.365388 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-utilities\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") " pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.365506 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-catalog-content\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") " pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.386763 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv6x4\" (UniqueName: \"kubernetes.io/projected/51650a4d-a8b3-4880-b39c-baa6181b3f4e-kube-api-access-wv6x4\") pod \"certified-operators-q7t4s\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") " 
pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.522141 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.541591 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.541926 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.615365 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:19 crc kubenswrapper[4771]: I0219 23:46:19.793223 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:20 crc kubenswrapper[4771]: I0219 23:46:20.103691 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7t4s"] Feb 19 23:46:20 crc kubenswrapper[4771]: I0219 23:46:20.734924 4771 generic.go:334] "Generic (PLEG): container finished" podID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerID="35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9" exitCode=0 Feb 19 23:46:20 crc kubenswrapper[4771]: I0219 23:46:20.736911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7t4s" event={"ID":"51650a4d-a8b3-4880-b39c-baa6181b3f4e","Type":"ContainerDied","Data":"35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9"} Feb 19 23:46:20 crc kubenswrapper[4771]: I0219 23:46:20.736936 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7t4s" 
event={"ID":"51650a4d-a8b3-4880-b39c-baa6181b3f4e","Type":"ContainerStarted","Data":"1f70fd3ae142b6c70f2a471adceb372f930264e7274635ddec4e6b22908f2a24"} Feb 19 23:46:21 crc kubenswrapper[4771]: I0219 23:46:21.923088 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szzhm"] Feb 19 23:46:21 crc kubenswrapper[4771]: I0219 23:46:21.923385 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-szzhm" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerName="registry-server" containerID="cri-o://0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b" gracePeriod=2 Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.490151 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.656379 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-utilities\") pod \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.656450 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-catalog-content\") pod \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.656582 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2lmr\" (UniqueName: \"kubernetes.io/projected/5eadce87-b7dd-4096-a762-bb3a693a8b7f-kube-api-access-l2lmr\") pod \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\" (UID: \"5eadce87-b7dd-4096-a762-bb3a693a8b7f\") " Feb 19 23:46:22 crc 
kubenswrapper[4771]: I0219 23:46:22.657715 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-utilities" (OuterVolumeSpecName: "utilities") pod "5eadce87-b7dd-4096-a762-bb3a693a8b7f" (UID: "5eadce87-b7dd-4096-a762-bb3a693a8b7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.665780 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eadce87-b7dd-4096-a762-bb3a693a8b7f-kube-api-access-l2lmr" (OuterVolumeSpecName: "kube-api-access-l2lmr") pod "5eadce87-b7dd-4096-a762-bb3a693a8b7f" (UID: "5eadce87-b7dd-4096-a762-bb3a693a8b7f"). InnerVolumeSpecName "kube-api-access-l2lmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.680970 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5eadce87-b7dd-4096-a762-bb3a693a8b7f" (UID: "5eadce87-b7dd-4096-a762-bb3a693a8b7f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.756775 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7t4s" event={"ID":"51650a4d-a8b3-4880-b39c-baa6181b3f4e","Type":"ContainerStarted","Data":"d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1"} Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.758998 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.759041 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eadce87-b7dd-4096-a762-bb3a693a8b7f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.759057 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2lmr\" (UniqueName: \"kubernetes.io/projected/5eadce87-b7dd-4096-a762-bb3a693a8b7f-kube-api-access-l2lmr\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.761762 4771 generic.go:334] "Generic (PLEG): container finished" podID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerID="0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b" exitCode=0 Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.761797 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szzhm" event={"ID":"5eadce87-b7dd-4096-a762-bb3a693a8b7f","Type":"ContainerDied","Data":"0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b"} Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.761819 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szzhm" 
event={"ID":"5eadce87-b7dd-4096-a762-bb3a693a8b7f","Type":"ContainerDied","Data":"78b0e3b3c1cc8cbde0e6ae02baae15add3e9de3290c09582cbcf6d5c853cc7b8"} Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.761842 4771 scope.go:117] "RemoveContainer" containerID="0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.761954 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szzhm" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.802760 4771 scope.go:117] "RemoveContainer" containerID="f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.805792 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szzhm"] Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.818660 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-szzhm"] Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.829440 4771 scope.go:117] "RemoveContainer" containerID="ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.883178 4771 scope.go:117] "RemoveContainer" containerID="0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b" Feb 19 23:46:22 crc kubenswrapper[4771]: E0219 23:46:22.883650 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b\": container with ID starting with 0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b not found: ID does not exist" containerID="0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.883687 4771 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b"} err="failed to get container status \"0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b\": rpc error: code = NotFound desc = could not find container \"0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b\": container with ID starting with 0e1f49c23f49b2d9b9508932df1a4f435c16b144f3b34e7a0193783fa90b8a7b not found: ID does not exist" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.883714 4771 scope.go:117] "RemoveContainer" containerID="f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c" Feb 19 23:46:22 crc kubenswrapper[4771]: E0219 23:46:22.884243 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c\": container with ID starting with f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c not found: ID does not exist" containerID="f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.884272 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c"} err="failed to get container status \"f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c\": rpc error: code = NotFound desc = could not find container \"f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c\": container with ID starting with f44d245d8fbb0909366eaad524281b980f47fd71aedc5bdbd9b4c42e3701b70c not found: ID does not exist" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.884289 4771 scope.go:117] "RemoveContainer" containerID="ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28" Feb 19 23:46:22 crc kubenswrapper[4771]: E0219 23:46:22.884602 4771 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28\": container with ID starting with ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28 not found: ID does not exist" containerID="ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28" Feb 19 23:46:22 crc kubenswrapper[4771]: I0219 23:46:22.884632 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28"} err="failed to get container status \"ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28\": rpc error: code = NotFound desc = could not find container \"ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28\": container with ID starting with ce520a092a56e02c468096fdca4b879c2a9dfb8f0d740b7592015cf3118cfe28 not found: ID does not exist" Feb 19 23:46:23 crc kubenswrapper[4771]: I0219 23:46:23.772121 4771 generic.go:334] "Generic (PLEG): container finished" podID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerID="d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1" exitCode=0 Feb 19 23:46:23 crc kubenswrapper[4771]: I0219 23:46:23.772232 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7t4s" event={"ID":"51650a4d-a8b3-4880-b39c-baa6181b3f4e","Type":"ContainerDied","Data":"d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1"} Feb 19 23:46:24 crc kubenswrapper[4771]: I0219 23:46:24.479530 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" path="/var/lib/kubelet/pods/5eadce87-b7dd-4096-a762-bb3a693a8b7f/volumes" Feb 19 23:46:24 crc kubenswrapper[4771]: I0219 23:46:24.785753 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7t4s" 
event={"ID":"51650a4d-a8b3-4880-b39c-baa6181b3f4e","Type":"ContainerStarted","Data":"57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f"} Feb 19 23:46:24 crc kubenswrapper[4771]: I0219 23:46:24.816407 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q7t4s" podStartSLOduration=2.392682166 podStartE2EDuration="5.816384521s" podCreationTimestamp="2026-02-19 23:46:19 +0000 UTC" firstStartedPulling="2026-02-19 23:46:20.739054776 +0000 UTC m=+8281.010497256" lastFinishedPulling="2026-02-19 23:46:24.162757141 +0000 UTC m=+8284.434199611" observedRunningTime="2026-02-19 23:46:24.80924441 +0000 UTC m=+8285.080686900" watchObservedRunningTime="2026-02-19 23:46:24.816384521 +0000 UTC m=+8285.087827001" Feb 19 23:46:28 crc kubenswrapper[4771]: I0219 23:46:28.437377 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:46:28 crc kubenswrapper[4771]: E0219 23:46:28.438344 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:46:29 crc kubenswrapper[4771]: I0219 23:46:29.522962 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:29 crc kubenswrapper[4771]: I0219 23:46:29.523083 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:30 crc kubenswrapper[4771]: I0219 23:46:30.601278 4771 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-q7t4s" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="registry-server" probeResult="failure" output=< Feb 19 23:46:30 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:46:30 crc kubenswrapper[4771]: > Feb 19 23:46:38 crc kubenswrapper[4771]: I0219 23:46:38.970359 4771 generic.go:334] "Generic (PLEG): container finished" podID="7f7c2323-0701-4459-b78c-1e92739da106" containerID="82e7c1537300872a7ee877adec47587e3095940f91c76525fc36321248313817" exitCode=0 Feb 19 23:46:38 crc kubenswrapper[4771]: I0219 23:46:38.970565 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" event={"ID":"7f7c2323-0701-4459-b78c-1e92739da106","Type":"ContainerDied","Data":"82e7c1537300872a7ee877adec47587e3095940f91c76525fc36321248313817"} Feb 19 23:46:39 crc kubenswrapper[4771]: I0219 23:46:39.647837 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:39 crc kubenswrapper[4771]: I0219 23:46:39.756343 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q7t4s" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.365800 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7t4s"] Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.489035 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.578142 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-inventory\") pod \"7f7c2323-0701-4459-b78c-1e92739da106\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.578791 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-combined-ca-bundle\") pod \"7f7c2323-0701-4459-b78c-1e92739da106\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.578968 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-ssh-key-openstack-cell1\") pod \"7f7c2323-0701-4459-b78c-1e92739da106\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.578992 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-secret-0\") pod \"7f7c2323-0701-4459-b78c-1e92739da106\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.579034 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdd5t\" (UniqueName: \"kubernetes.io/projected/7f7c2323-0701-4459-b78c-1e92739da106-kube-api-access-fdd5t\") pod \"7f7c2323-0701-4459-b78c-1e92739da106\" (UID: \"7f7c2323-0701-4459-b78c-1e92739da106\") " Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.584068 4771 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7c2323-0701-4459-b78c-1e92739da106-kube-api-access-fdd5t" (OuterVolumeSpecName: "kube-api-access-fdd5t") pod "7f7c2323-0701-4459-b78c-1e92739da106" (UID: "7f7c2323-0701-4459-b78c-1e92739da106"). InnerVolumeSpecName "kube-api-access-fdd5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.584351 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7f7c2323-0701-4459-b78c-1e92739da106" (UID: "7f7c2323-0701-4459-b78c-1e92739da106"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.614151 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7f7c2323-0701-4459-b78c-1e92739da106" (UID: "7f7c2323-0701-4459-b78c-1e92739da106"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.616172 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7f7c2323-0701-4459-b78c-1e92739da106" (UID: "7f7c2323-0701-4459-b78c-1e92739da106"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.627848 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-inventory" (OuterVolumeSpecName: "inventory") pod "7f7c2323-0701-4459-b78c-1e92739da106" (UID: "7f7c2323-0701-4459-b78c-1e92739da106"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.681929 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.681967 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.681978 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdd5t\" (UniqueName: \"kubernetes.io/projected/7f7c2323-0701-4459-b78c-1e92739da106-kube-api-access-fdd5t\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.681992 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.682004 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7c2323-0701-4459-b78c-1e92739da106-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.991400 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" event={"ID":"7f7c2323-0701-4459-b78c-1e92739da106","Type":"ContainerDied","Data":"14ffbf008b6b9f36975ff416257675e8bd670471244747fb8fdf557620c16b79"} Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.991706 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14ffbf008b6b9f36975ff416257675e8bd670471244747fb8fdf557620c16b79" Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.991628 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q7t4s" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="registry-server" containerID="cri-o://57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f" gracePeriod=2 Feb 19 23:46:40 crc kubenswrapper[4771]: I0219 23:46:40.991437 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4qbx5" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.128708 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-zvngw"] Feb 19 23:46:41 crc kubenswrapper[4771]: E0219 23:46:41.129226 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerName="extract-content" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.129246 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerName="extract-content" Feb 19 23:46:41 crc kubenswrapper[4771]: E0219 23:46:41.129267 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerName="registry-server" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.129276 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerName="registry-server" Feb 19 23:46:41 crc 
kubenswrapper[4771]: E0219 23:46:41.129297 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerName="extract-utilities" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.129306 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerName="extract-utilities" Feb 19 23:46:41 crc kubenswrapper[4771]: E0219 23:46:41.129329 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7c2323-0701-4459-b78c-1e92739da106" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.129337 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7c2323-0701-4459-b78c-1e92739da106" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.129576 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7c2323-0701-4459-b78c-1e92739da106" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.129599 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eadce87-b7dd-4096-a762-bb3a693a8b7f" containerName="registry-server" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.130441 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.136224 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.136354 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.136431 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.136508 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.136608 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.136683 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.136774 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.176213 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-zvngw"] Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-inventory\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296447 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296544 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgl66\" (UniqueName: \"kubernetes.io/projected/3272a901-75aa-4fba-bcff-911ee4166918-kube-api-access-wgl66\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296648 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296746 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3272a901-75aa-4fba-bcff-911ee4166918-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296891 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.296964 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: 
\"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.297000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.399960 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3272a901-75aa-4fba-bcff-911ee4166918-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400043 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400076 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400104 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400149 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-inventory\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400217 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgl66\" (UniqueName: \"kubernetes.io/projected/3272a901-75aa-4fba-bcff-911ee4166918-kube-api-access-wgl66\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400245 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400284 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.400774 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.401984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3272a901-75aa-4fba-bcff-911ee4166918-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.405391 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.405520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.405338 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.406129 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.407334 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc 
kubenswrapper[4771]: I0219 23:46:41.407830 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.407968 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-inventory\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.407975 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.413627 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.418830 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgl66\" (UniqueName: \"kubernetes.io/projected/3272a901-75aa-4fba-bcff-911ee4166918-kube-api-access-wgl66\") pod \"nova-cell1-openstack-openstack-cell1-zvngw\" (UID: 
\"3272a901-75aa-4fba-bcff-911ee4166918\") " pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw"
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.489009 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw"
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.612553 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7t4s"
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.706763 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv6x4\" (UniqueName: \"kubernetes.io/projected/51650a4d-a8b3-4880-b39c-baa6181b3f4e-kube-api-access-wv6x4\") pod \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") "
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.706919 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-catalog-content\") pod \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") "
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.707132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-utilities\") pod \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\" (UID: \"51650a4d-a8b3-4880-b39c-baa6181b3f4e\") "
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.709939 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-utilities" (OuterVolumeSpecName: "utilities") pod "51650a4d-a8b3-4880-b39c-baa6181b3f4e" (UID: "51650a4d-a8b3-4880-b39c-baa6181b3f4e"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.711909 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51650a4d-a8b3-4880-b39c-baa6181b3f4e-kube-api-access-wv6x4" (OuterVolumeSpecName: "kube-api-access-wv6x4") pod "51650a4d-a8b3-4880-b39c-baa6181b3f4e" (UID: "51650a4d-a8b3-4880-b39c-baa6181b3f4e"). InnerVolumeSpecName "kube-api-access-wv6x4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.762269 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51650a4d-a8b3-4880-b39c-baa6181b3f4e" (UID: "51650a4d-a8b3-4880-b39c-baa6181b3f4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.811581 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.811636 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv6x4\" (UniqueName: \"kubernetes.io/projected/51650a4d-a8b3-4880-b39c-baa6181b3f4e-kube-api-access-wv6x4\") on node \"crc\" DevicePath \"\""
Feb 19 23:46:41 crc kubenswrapper[4771]: I0219 23:46:41.811652 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a4d-a8b3-4880-b39c-baa6181b3f4e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.004090 4771 generic.go:334] "Generic (PLEG): container finished" podID="51650a4d-a8b3-4880-b39c-baa6181b3f4e"
containerID="57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f" exitCode=0
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.004131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7t4s" event={"ID":"51650a4d-a8b3-4880-b39c-baa6181b3f4e","Type":"ContainerDied","Data":"57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f"}
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.004155 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7t4s" event={"ID":"51650a4d-a8b3-4880-b39c-baa6181b3f4e","Type":"ContainerDied","Data":"1f70fd3ae142b6c70f2a471adceb372f930264e7274635ddec4e6b22908f2a24"}
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.004170 4771 scope.go:117] "RemoveContainer" containerID="57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.004297 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-q7t4s"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.044153 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7t4s"]
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.047716 4771 scope.go:117] "RemoveContainer" containerID="d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.060626 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q7t4s"]
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.082117 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-zvngw"]
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.103263 4771 scope.go:117] "RemoveContainer" containerID="35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9"
Feb 19 23:46:42 crc kubenswrapper[4771]: W0219 23:46:42.111231 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3272a901_75aa_4fba_bcff_911ee4166918.slice/crio-1387642e8f70580c8ca80ae17d88eeaa36e3a627d181ea90c268c080fbf58796 WatchSource:0}: Error finding container 1387642e8f70580c8ca80ae17d88eeaa36e3a627d181ea90c268c080fbf58796: Status 404 returned error can't find the container with id 1387642e8f70580c8ca80ae17d88eeaa36e3a627d181ea90c268c080fbf58796
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.133718 4771 scope.go:117] "RemoveContainer" containerID="57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f"
Feb 19 23:46:42 crc kubenswrapper[4771]: E0219 23:46:42.138976 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f\": container with ID starting with
57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f not found: ID does not exist" containerID="57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.139125 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f"} err="failed to get container status \"57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f\": rpc error: code = NotFound desc = could not find container \"57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f\": container with ID starting with 57a2cf6ef0cf715d746b74087c18344acf5df3d40e7e05900ff52eb16ca54e1f not found: ID does not exist"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.139152 4771 scope.go:117] "RemoveContainer" containerID="d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1"
Feb 19 23:46:42 crc kubenswrapper[4771]: E0219 23:46:42.142831 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1\": container with ID starting with d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1 not found: ID does not exist" containerID="d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.142876 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1"} err="failed to get container status \"d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1\": rpc error: code = NotFound desc = could not find container \"d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1\": container with ID starting with d42860c1409c39d599d53224e32af310fdcd7cb13f1971061ea18bf701edc3a1 not found: ID does not
exist"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.142903 4771 scope.go:117] "RemoveContainer" containerID="35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9"
Feb 19 23:46:42 crc kubenswrapper[4771]: E0219 23:46:42.159611 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9\": container with ID starting with 35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9 not found: ID does not exist" containerID="35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.159960 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9"} err="failed to get container status \"35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9\": rpc error: code = NotFound desc = could not find container \"35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9\": container with ID starting with 35ce6ae2974a2e6290284807bd0f091d722e5bb20c140bbe6048c00e90e509f9 not found: ID does not exist"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.439763 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"
Feb 19 23:46:42 crc kubenswrapper[4771]: E0219 23:46:42.440315 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:46:42 crc kubenswrapper[4771]: I0219 23:46:42.451777 4771
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" path="/var/lib/kubelet/pods/51650a4d-a8b3-4880-b39c-baa6181b3f4e/volumes"
Feb 19 23:46:43 crc kubenswrapper[4771]: I0219 23:46:43.022707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" event={"ID":"3272a901-75aa-4fba-bcff-911ee4166918","Type":"ContainerStarted","Data":"ae1f1638a824adeec02842011a440b6cada7ef009c18ef6ec523e732572d6e30"}
Feb 19 23:46:43 crc kubenswrapper[4771]: I0219 23:46:43.023122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" event={"ID":"3272a901-75aa-4fba-bcff-911ee4166918","Type":"ContainerStarted","Data":"1387642e8f70580c8ca80ae17d88eeaa36e3a627d181ea90c268c080fbf58796"}
Feb 19 23:46:43 crc kubenswrapper[4771]: I0219 23:46:43.053816 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" podStartSLOduration=1.5329704450000001 podStartE2EDuration="2.053801193s" podCreationTimestamp="2026-02-19 23:46:41 +0000 UTC" firstStartedPulling="2026-02-19 23:46:42.133827152 +0000 UTC m=+8302.405269622" lastFinishedPulling="2026-02-19 23:46:42.65465786 +0000 UTC m=+8302.926100370" observedRunningTime="2026-02-19 23:46:43.048179184 +0000 UTC m=+8303.319621664" watchObservedRunningTime="2026-02-19 23:46:43.053801193 +0000 UTC m=+8303.325243663"
Feb 19 23:46:55 crc kubenswrapper[4771]: I0219 23:46:55.437955 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"
Feb 19 23:46:55 crc kubenswrapper[4771]: E0219 23:46:55.440121 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:47:08 crc kubenswrapper[4771]: I0219 23:47:08.438470 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"
Feb 19 23:47:08 crc kubenswrapper[4771]: E0219 23:47:08.439543 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.778286 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l29fb"]
Feb 19 23:47:19 crc kubenswrapper[4771]: E0219 23:47:19.779327 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="registry-server"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.779342 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="registry-server"
Feb 19 23:47:19 crc kubenswrapper[4771]: E0219 23:47:19.779360 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="extract-utilities"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.779369 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="extract-utilities"
Feb 19 23:47:19 crc kubenswrapper[4771]: E0219 23:47:19.779390 4771 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="extract-content"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.779399 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="extract-content"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.779696 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51650a4d-a8b3-4880-b39c-baa6181b3f4e" containerName="registry-server"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.781721 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.793188 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l29fb"]
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.843403 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bclv\" (UniqueName: \"kubernetes.io/projected/acbceee6-da85-4789-8caf-e3b885d21a1c-kube-api-access-4bclv\") pod \"redhat-operators-l29fb\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") " pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.844870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-utilities\") pod \"redhat-operators-l29fb\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") " pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.845014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-catalog-content\") pod \"redhat-operators-l29fb\" (UID:
\"acbceee6-da85-4789-8caf-e3b885d21a1c\") " pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.947405 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-catalog-content\") pod \"redhat-operators-l29fb\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") " pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.947580 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bclv\" (UniqueName: \"kubernetes.io/projected/acbceee6-da85-4789-8caf-e3b885d21a1c-kube-api-access-4bclv\") pod \"redhat-operators-l29fb\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") " pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.947648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-utilities\") pod \"redhat-operators-l29fb\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") " pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.948195 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-utilities\") pod \"redhat-operators-l29fb\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") " pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.948461 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-catalog-content\") pod \"redhat-operators-l29fb\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") " 
pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:19 crc kubenswrapper[4771]: I0219 23:47:19.978307 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bclv\" (UniqueName: \"kubernetes.io/projected/acbceee6-da85-4789-8caf-e3b885d21a1c-kube-api-access-4bclv\") pod \"redhat-operators-l29fb\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") " pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:20 crc kubenswrapper[4771]: I0219 23:47:20.136772 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:20 crc kubenswrapper[4771]: I0219 23:47:20.627299 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l29fb"]
Feb 19 23:47:21 crc kubenswrapper[4771]: I0219 23:47:21.151808 4771 generic.go:334] "Generic (PLEG): container finished" podID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerID="d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4" exitCode=0
Feb 19 23:47:21 crc kubenswrapper[4771]: I0219 23:47:21.152099 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l29fb" event={"ID":"acbceee6-da85-4789-8caf-e3b885d21a1c","Type":"ContainerDied","Data":"d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4"}
Feb 19 23:47:21 crc kubenswrapper[4771]: I0219 23:47:21.152130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l29fb" event={"ID":"acbceee6-da85-4789-8caf-e3b885d21a1c","Type":"ContainerStarted","Data":"1abf98584e365a86eea5bfed4c329b172a242d42dffbb73f06ec9a9086bea1c6"}
Feb 19 23:47:21 crc kubenswrapper[4771]: I0219 23:47:21.438127 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"
Feb 19 23:47:21 crc kubenswrapper[4771]: E0219 23:47:21.438700 4771 pod_workers.go:1301] "Error syncing pod, skipping"
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:47:22 crc kubenswrapper[4771]: I0219 23:47:22.162514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l29fb" event={"ID":"acbceee6-da85-4789-8caf-e3b885d21a1c","Type":"ContainerStarted","Data":"79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51"}
Feb 19 23:47:27 crc kubenswrapper[4771]: I0219 23:47:27.210314 4771 generic.go:334] "Generic (PLEG): container finished" podID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerID="79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51" exitCode=0
Feb 19 23:47:27 crc kubenswrapper[4771]: I0219 23:47:27.210391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l29fb" event={"ID":"acbceee6-da85-4789-8caf-e3b885d21a1c","Type":"ContainerDied","Data":"79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51"}
Feb 19 23:47:28 crc kubenswrapper[4771]: I0219 23:47:28.227806 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l29fb" event={"ID":"acbceee6-da85-4789-8caf-e3b885d21a1c","Type":"ContainerStarted","Data":"f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1"}
Feb 19 23:47:28 crc kubenswrapper[4771]: I0219 23:47:28.280555 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l29fb" podStartSLOduration=2.5637333 podStartE2EDuration="9.280534335s" podCreationTimestamp="2026-02-19 23:47:19 +0000 UTC" firstStartedPulling="2026-02-19 23:47:21.153607215 +0000 UTC m=+8341.425049695"
lastFinishedPulling="2026-02-19 23:47:27.87040826 +0000 UTC m=+8348.141850730" observedRunningTime="2026-02-19 23:47:28.269960645 +0000 UTC m=+8348.541403145" watchObservedRunningTime="2026-02-19 23:47:28.280534335 +0000 UTC m=+8348.551976815"
Feb 19 23:47:30 crc kubenswrapper[4771]: I0219 23:47:30.138129 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:30 crc kubenswrapper[4771]: I0219 23:47:30.138401 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:47:31 crc kubenswrapper[4771]: I0219 23:47:31.200373 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l29fb" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:47:31 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:47:31 crc kubenswrapper[4771]: >
Feb 19 23:47:35 crc kubenswrapper[4771]: I0219 23:47:35.437951 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"
Feb 19 23:47:35 crc kubenswrapper[4771]: E0219 23:47:35.439549 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:47:41 crc kubenswrapper[4771]: I0219 23:47:41.231987 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l29fb" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:47:41
crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:47:41 crc kubenswrapper[4771]: >
Feb 19 23:47:49 crc kubenswrapper[4771]: I0219 23:47:49.437848 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293"
Feb 19 23:47:49 crc kubenswrapper[4771]: E0219 23:47:49.438640 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 19 23:47:51 crc kubenswrapper[4771]: I0219 23:47:51.203884 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l29fb" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:47:51 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:47:51 crc kubenswrapper[4771]: >
Feb 19 23:48:00 crc kubenswrapper[4771]: I0219 23:48:00.233951 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:48:00 crc kubenswrapper[4771]: I0219 23:48:00.322335 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:48:00 crc kubenswrapper[4771]: I0219 23:48:00.482078 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l29fb"]
Feb 19 23:48:01 crc kubenswrapper[4771]: I0219 23:48:01.933628 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l29fb" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c"
containerName="registry-server" containerID="cri-o://f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1" gracePeriod=2
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.482118 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.665787 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bclv\" (UniqueName: \"kubernetes.io/projected/acbceee6-da85-4789-8caf-e3b885d21a1c-kube-api-access-4bclv\") pod \"acbceee6-da85-4789-8caf-e3b885d21a1c\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") "
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.666079 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-utilities\") pod \"acbceee6-da85-4789-8caf-e3b885d21a1c\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") "
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.666242 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-catalog-content\") pod \"acbceee6-da85-4789-8caf-e3b885d21a1c\" (UID: \"acbceee6-da85-4789-8caf-e3b885d21a1c\") "
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.667340 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-utilities" (OuterVolumeSpecName: "utilities") pod "acbceee6-da85-4789-8caf-e3b885d21a1c" (UID: "acbceee6-da85-4789-8caf-e3b885d21a1c"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.677606 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbceee6-da85-4789-8caf-e3b885d21a1c-kube-api-access-4bclv" (OuterVolumeSpecName: "kube-api-access-4bclv") pod "acbceee6-da85-4789-8caf-e3b885d21a1c" (UID: "acbceee6-da85-4789-8caf-e3b885d21a1c"). InnerVolumeSpecName "kube-api-access-4bclv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.770252 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bclv\" (UniqueName: \"kubernetes.io/projected/acbceee6-da85-4789-8caf-e3b885d21a1c-kube-api-access-4bclv\") on node \"crc\" DevicePath \"\""
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.770303 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.829534 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acbceee6-da85-4789-8caf-e3b885d21a1c" (UID: "acbceee6-da85-4789-8caf-e3b885d21a1c"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.872515 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acbceee6-da85-4789-8caf-e3b885d21a1c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.952949 4771 generic.go:334] "Generic (PLEG): container finished" podID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerID="f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1" exitCode=0
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.953045 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l29fb" event={"ID":"acbceee6-da85-4789-8caf-e3b885d21a1c","Type":"ContainerDied","Data":"f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1"}
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.953095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l29fb" event={"ID":"acbceee6-da85-4789-8caf-e3b885d21a1c","Type":"ContainerDied","Data":"1abf98584e365a86eea5bfed4c329b172a242d42dffbb73f06ec9a9086bea1c6"}
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.953132 4771 scope.go:117] "RemoveContainer" containerID="f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1"
Feb 19 23:48:02 crc kubenswrapper[4771]: I0219 23:48:02.953198 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-l29fb"
Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.001577 4771 scope.go:117] "RemoveContainer" containerID="79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51"
Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.010646 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l29fb"]
Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.050074 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l29fb"]
Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.050454 4771 scope.go:117] "RemoveContainer" containerID="d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4"
Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.114877 4771 scope.go:117] "RemoveContainer" containerID="f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1"
Feb 19 23:48:03 crc kubenswrapper[4771]: E0219 23:48:03.115517 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1\": container with ID starting with f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1 not found: ID does not exist" containerID="f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1"
Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.115576 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1"} err="failed to get container status \"f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1\": rpc error: code = NotFound desc = could not find container \"f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1\": container with ID starting with f7cc921aa652ec4d38d0b4ee7d7efdcfed49061db1a755b67f0fbd2ad34396e1 not found: ID does
not exist" Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.115614 4771 scope.go:117] "RemoveContainer" containerID="79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51" Feb 19 23:48:03 crc kubenswrapper[4771]: E0219 23:48:03.116180 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51\": container with ID starting with 79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51 not found: ID does not exist" containerID="79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51" Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.116228 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51"} err="failed to get container status \"79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51\": rpc error: code = NotFound desc = could not find container \"79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51\": container with ID starting with 79affdf366324b9cb1dde16daf25cc6fff7ef0e197a605d6772bb15b5bb0bc51 not found: ID does not exist" Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.116272 4771 scope.go:117] "RemoveContainer" containerID="d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4" Feb 19 23:48:03 crc kubenswrapper[4771]: E0219 23:48:03.116749 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4\": container with ID starting with d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4 not found: ID does not exist" containerID="d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4" Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.116785 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4"} err="failed to get container status \"d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4\": rpc error: code = NotFound desc = could not find container \"d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4\": container with ID starting with d1fb420e8df035ac7a44cea4de0e1c729c23616d80e0412985f92e5b0e1f64d4 not found: ID does not exist" Feb 19 23:48:03 crc kubenswrapper[4771]: I0219 23:48:03.438381 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:48:03 crc kubenswrapper[4771]: E0219 23:48:03.438881 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:48:04 crc kubenswrapper[4771]: I0219 23:48:04.460395 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" path="/var/lib/kubelet/pods/acbceee6-da85-4789-8caf-e3b885d21a1c/volumes" Feb 19 23:48:15 crc kubenswrapper[4771]: I0219 23:48:15.437873 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:48:15 crc kubenswrapper[4771]: E0219 23:48:15.438915 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:48:26 crc kubenswrapper[4771]: I0219 23:48:26.437547 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:48:26 crc kubenswrapper[4771]: E0219 23:48:26.438607 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:48:39 crc kubenswrapper[4771]: I0219 23:48:39.437373 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:48:39 crc kubenswrapper[4771]: E0219 23:48:39.438179 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:48:50 crc kubenswrapper[4771]: I0219 23:48:50.455341 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:48:50 crc kubenswrapper[4771]: E0219 23:48:50.456139 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:49:02 crc kubenswrapper[4771]: I0219 23:49:02.437490 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:49:02 crc kubenswrapper[4771]: E0219 23:49:02.438240 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:49:17 crc kubenswrapper[4771]: I0219 23:49:17.437522 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:49:17 crc kubenswrapper[4771]: E0219 23:49:17.438692 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:49:28 crc kubenswrapper[4771]: I0219 23:49:28.388282 4771 generic.go:334] "Generic (PLEG): container finished" podID="3272a901-75aa-4fba-bcff-911ee4166918" containerID="ae1f1638a824adeec02842011a440b6cada7ef009c18ef6ec523e732572d6e30" exitCode=0 Feb 19 23:49:28 crc kubenswrapper[4771]: I0219 23:49:28.388811 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" 
event={"ID":"3272a901-75aa-4fba-bcff-911ee4166918","Type":"ContainerDied","Data":"ae1f1638a824adeec02842011a440b6cada7ef009c18ef6ec523e732572d6e30"} Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.438467 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:49:29 crc kubenswrapper[4771]: E0219 23:49:29.439029 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.887312 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.936419 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-inventory\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.936493 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-3\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.936561 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-1\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.936604 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-1\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.936730 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-2\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.936799 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-ssh-key-openstack-cell1\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.936858 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-combined-ca-bundle\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.936893 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-0\") pod 
\"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.937093 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-0\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.937235 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgl66\" (UniqueName: \"kubernetes.io/projected/3272a901-75aa-4fba-bcff-911ee4166918-kube-api-access-wgl66\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.937289 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3272a901-75aa-4fba-bcff-911ee4166918-nova-cells-global-config-0\") pod \"3272a901-75aa-4fba-bcff-911ee4166918\" (UID: \"3272a901-75aa-4fba-bcff-911ee4166918\") " Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.947972 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.949630 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3272a901-75aa-4fba-bcff-911ee4166918-kube-api-access-wgl66" (OuterVolumeSpecName: "kube-api-access-wgl66") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "kube-api-access-wgl66". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.970451 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.974692 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3272a901-75aa-4fba-bcff-911ee4166918-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.975616 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.983513 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.985368 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.985867 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.993911 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-inventory" (OuterVolumeSpecName: "inventory") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.998516 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:29 crc kubenswrapper[4771]: I0219 23:49:29.998901 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3272a901-75aa-4fba-bcff-911ee4166918" (UID: "3272a901-75aa-4fba-bcff-911ee4166918"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040414 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgl66\" (UniqueName: \"kubernetes.io/projected/3272a901-75aa-4fba-bcff-911ee4166918-kube-api-access-wgl66\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040557 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3272a901-75aa-4fba-bcff-911ee4166918-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040637 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040701 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040754 4771 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040804 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040861 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040912 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.040970 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.041042 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.041101 4771 reconciler_common.go:293] "Volume detached for volume 
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3272a901-75aa-4fba-bcff-911ee4166918-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.411401 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" event={"ID":"3272a901-75aa-4fba-bcff-911ee4166918","Type":"ContainerDied","Data":"1387642e8f70580c8ca80ae17d88eeaa36e3a627d181ea90c268c080fbf58796"} Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.411462 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1387642e8f70580c8ca80ae17d88eeaa36e3a627d181ea90c268c080fbf58796" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.411508 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-zvngw" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.536189 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-6648c"] Feb 19 23:49:30 crc kubenswrapper[4771]: E0219 23:49:30.536633 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="extract-content" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.536645 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="extract-content" Feb 19 23:49:30 crc kubenswrapper[4771]: E0219 23:49:30.536673 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="registry-server" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.536679 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="registry-server" Feb 19 23:49:30 crc kubenswrapper[4771]: E0219 23:49:30.536688 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3272a901-75aa-4fba-bcff-911ee4166918" containerName="nova-cell1-openstack-openstack-cell1" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.536695 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3272a901-75aa-4fba-bcff-911ee4166918" containerName="nova-cell1-openstack-openstack-cell1" Feb 19 23:49:30 crc kubenswrapper[4771]: E0219 23:49:30.536702 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="extract-utilities" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.536708 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="extract-utilities" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.536902 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbceee6-da85-4789-8caf-e3b885d21a1c" containerName="registry-server" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.536917 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3272a901-75aa-4fba-bcff-911ee4166918" containerName="nova-cell1-openstack-openstack-cell1" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.537652 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.540095 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.540173 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.541804 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.543913 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.544527 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.558504 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-6648c"] Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.654732 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.655225 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: 
\"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.655541 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzfjq\" (UniqueName: \"kubernetes.io/projected/24464846-7ee5-4dd2-8362-39c28ecaef08-kube-api-access-gzfjq\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.655773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.656124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-inventory\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.656391 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.656637 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.758581 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.759073 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.759296 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.759480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-0\") pod 
\"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.759771 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzfjq\" (UniqueName: \"kubernetes.io/projected/24464846-7ee5-4dd2-8362-39c28ecaef08-kube-api-access-gzfjq\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.760005 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.760244 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-inventory\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.765366 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.765861 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.765903 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.770589 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-inventory\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.771177 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.776820 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 
23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.789874 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzfjq\" (UniqueName: \"kubernetes.io/projected/24464846-7ee5-4dd2-8362-39c28ecaef08-kube-api-access-gzfjq\") pod \"telemetry-openstack-openstack-cell1-6648c\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:30 crc kubenswrapper[4771]: I0219 23:49:30.869347 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:49:31 crc kubenswrapper[4771]: I0219 23:49:31.463187 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-6648c"] Feb 19 23:49:32 crc kubenswrapper[4771]: I0219 23:49:32.434153 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-6648c" event={"ID":"24464846-7ee5-4dd2-8362-39c28ecaef08","Type":"ContainerStarted","Data":"a8319d79245b639bf559ed2f6b6de689fee113ab89de4de7b8d78c7df9a329c5"} Feb 19 23:49:32 crc kubenswrapper[4771]: I0219 23:49:32.434512 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-6648c" event={"ID":"24464846-7ee5-4dd2-8362-39c28ecaef08","Type":"ContainerStarted","Data":"927425728d37295e9488188a4d6d57de051b3f0257d069d4f84b5b48c32ae6ee"} Feb 19 23:49:32 crc kubenswrapper[4771]: I0219 23:49:32.461978 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-6648c" podStartSLOduration=2.052962132 podStartE2EDuration="2.461961198s" podCreationTimestamp="2026-02-19 23:49:30 +0000 UTC" firstStartedPulling="2026-02-19 23:49:31.472251692 +0000 UTC m=+8471.743694172" lastFinishedPulling="2026-02-19 23:49:31.881250768 +0000 UTC m=+8472.152693238" observedRunningTime="2026-02-19 23:49:32.45750467 +0000 UTC m=+8472.728947140" 
watchObservedRunningTime="2026-02-19 23:49:32.461961198 +0000 UTC m=+8472.733403668" Feb 19 23:49:42 crc kubenswrapper[4771]: I0219 23:49:42.437423 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:49:42 crc kubenswrapper[4771]: E0219 23:49:42.439674 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:49:55 crc kubenswrapper[4771]: I0219 23:49:55.439414 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:49:56 crc kubenswrapper[4771]: I0219 23:49:56.745032 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"7fa155cb0383a0fbdfae911759c38f2f628afe45f731553dbcbc941a641c2f0a"} Feb 19 23:51:43 crc kubenswrapper[4771]: I0219 23:51:43.664424 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-df6b66f66-cg8dd" podUID="2e92976d-8555-4275-bf37-3d1e2f56aea1" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 19 23:52:12 crc kubenswrapper[4771]: I0219 23:52:12.957201 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:52:12 crc kubenswrapper[4771]: I0219 23:52:12.957943 
4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:52:42 crc kubenswrapper[4771]: I0219 23:52:42.957331 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:52:42 crc kubenswrapper[4771]: I0219 23:52:42.958995 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.158892 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-55t2d"] Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.162199 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.191111 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-55t2d"] Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.292128 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-utilities\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.292348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-catalog-content\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.292463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtpr\" (UniqueName: \"kubernetes.io/projected/6c4fd759-bb28-42d3-9456-53e485094d96-kube-api-access-cmtpr\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.394143 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-utilities\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.394372 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-catalog-content\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.394445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtpr\" (UniqueName: \"kubernetes.io/projected/6c4fd759-bb28-42d3-9456-53e485094d96-kube-api-access-cmtpr\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.394878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-catalog-content\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.395226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-utilities\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.422533 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtpr\" (UniqueName: \"kubernetes.io/projected/6c4fd759-bb28-42d3-9456-53e485094d96-kube-api-access-cmtpr\") pod \"community-operators-55t2d\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:47 crc kubenswrapper[4771]: I0219 23:52:47.485793 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:48 crc kubenswrapper[4771]: I0219 23:52:48.032481 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-55t2d"] Feb 19 23:52:48 crc kubenswrapper[4771]: I0219 23:52:48.870321 4771 generic.go:334] "Generic (PLEG): container finished" podID="6c4fd759-bb28-42d3-9456-53e485094d96" containerID="ed5f9ae028bd3d07fd764b9bcaa6d0130cf6886950a67ed7f5d6dc2eff6216ac" exitCode=0 Feb 19 23:52:48 crc kubenswrapper[4771]: I0219 23:52:48.870373 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55t2d" event={"ID":"6c4fd759-bb28-42d3-9456-53e485094d96","Type":"ContainerDied","Data":"ed5f9ae028bd3d07fd764b9bcaa6d0130cf6886950a67ed7f5d6dc2eff6216ac"} Feb 19 23:52:48 crc kubenswrapper[4771]: I0219 23:52:48.870405 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55t2d" event={"ID":"6c4fd759-bb28-42d3-9456-53e485094d96","Type":"ContainerStarted","Data":"68ba5ea13932ed731d065c5a5728814c523ea4f07ca86fe359669114b51b6647"} Feb 19 23:52:48 crc kubenswrapper[4771]: I0219 23:52:48.872848 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:52:49 crc kubenswrapper[4771]: I0219 23:52:49.882499 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55t2d" event={"ID":"6c4fd759-bb28-42d3-9456-53e485094d96","Type":"ContainerStarted","Data":"908d28370e93e6940de18246e18e9875563dc3881d5788dcec84ebd64163ea01"} Feb 19 23:52:51 crc kubenswrapper[4771]: I0219 23:52:51.914119 4771 generic.go:334] "Generic (PLEG): container finished" podID="6c4fd759-bb28-42d3-9456-53e485094d96" containerID="908d28370e93e6940de18246e18e9875563dc3881d5788dcec84ebd64163ea01" exitCode=0 Feb 19 23:52:51 crc kubenswrapper[4771]: I0219 23:52:51.914718 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-55t2d" event={"ID":"6c4fd759-bb28-42d3-9456-53e485094d96","Type":"ContainerDied","Data":"908d28370e93e6940de18246e18e9875563dc3881d5788dcec84ebd64163ea01"} Feb 19 23:52:52 crc kubenswrapper[4771]: I0219 23:52:52.926274 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55t2d" event={"ID":"6c4fd759-bb28-42d3-9456-53e485094d96","Type":"ContainerStarted","Data":"d2475e2475c29c83500463040b64e558745eb329fe01e4b98b038136560ec2f4"} Feb 19 23:52:52 crc kubenswrapper[4771]: I0219 23:52:52.959375 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-55t2d" podStartSLOduration=2.4880797230000002 podStartE2EDuration="5.959357099s" podCreationTimestamp="2026-02-19 23:52:47 +0000 UTC" firstStartedPulling="2026-02-19 23:52:48.872565691 +0000 UTC m=+8669.144008171" lastFinishedPulling="2026-02-19 23:52:52.343843067 +0000 UTC m=+8672.615285547" observedRunningTime="2026-02-19 23:52:52.956306958 +0000 UTC m=+8673.227749438" watchObservedRunningTime="2026-02-19 23:52:52.959357099 +0000 UTC m=+8673.230799579" Feb 19 23:52:57 crc kubenswrapper[4771]: I0219 23:52:57.486532 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:57 crc kubenswrapper[4771]: I0219 23:52:57.487657 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:57 crc kubenswrapper[4771]: I0219 23:52:57.565690 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:58 crc kubenswrapper[4771]: I0219 23:52:58.078743 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:52:58 crc kubenswrapper[4771]: I0219 
23:52:58.160371 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-55t2d"] Feb 19 23:53:00 crc kubenswrapper[4771]: I0219 23:53:00.005225 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-55t2d" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" containerName="registry-server" containerID="cri-o://d2475e2475c29c83500463040b64e558745eb329fe01e4b98b038136560ec2f4" gracePeriod=2 Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.024073 4771 generic.go:334] "Generic (PLEG): container finished" podID="6c4fd759-bb28-42d3-9456-53e485094d96" containerID="d2475e2475c29c83500463040b64e558745eb329fe01e4b98b038136560ec2f4" exitCode=0 Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.024453 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55t2d" event={"ID":"6c4fd759-bb28-42d3-9456-53e485094d96","Type":"ContainerDied","Data":"d2475e2475c29c83500463040b64e558745eb329fe01e4b98b038136560ec2f4"} Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.344096 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.458739 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-catalog-content\") pod \"6c4fd759-bb28-42d3-9456-53e485094d96\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.460184 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-utilities\") pod \"6c4fd759-bb28-42d3-9456-53e485094d96\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.460643 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmtpr\" (UniqueName: \"kubernetes.io/projected/6c4fd759-bb28-42d3-9456-53e485094d96-kube-api-access-cmtpr\") pod \"6c4fd759-bb28-42d3-9456-53e485094d96\" (UID: \"6c4fd759-bb28-42d3-9456-53e485094d96\") " Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.460849 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-utilities" (OuterVolumeSpecName: "utilities") pod "6c4fd759-bb28-42d3-9456-53e485094d96" (UID: "6c4fd759-bb28-42d3-9456-53e485094d96"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.464269 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.469574 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4fd759-bb28-42d3-9456-53e485094d96-kube-api-access-cmtpr" (OuterVolumeSpecName: "kube-api-access-cmtpr") pod "6c4fd759-bb28-42d3-9456-53e485094d96" (UID: "6c4fd759-bb28-42d3-9456-53e485094d96"). InnerVolumeSpecName "kube-api-access-cmtpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.527366 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c4fd759-bb28-42d3-9456-53e485094d96" (UID: "6c4fd759-bb28-42d3-9456-53e485094d96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.566485 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c4fd759-bb28-42d3-9456-53e485094d96-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:01 crc kubenswrapper[4771]: I0219 23:53:01.566546 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmtpr\" (UniqueName: \"kubernetes.io/projected/6c4fd759-bb28-42d3-9456-53e485094d96-kube-api-access-cmtpr\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:02 crc kubenswrapper[4771]: I0219 23:53:02.042116 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55t2d" event={"ID":"6c4fd759-bb28-42d3-9456-53e485094d96","Type":"ContainerDied","Data":"68ba5ea13932ed731d065c5a5728814c523ea4f07ca86fe359669114b51b6647"} Feb 19 23:53:02 crc kubenswrapper[4771]: I0219 23:53:02.042186 4771 scope.go:117] "RemoveContainer" containerID="d2475e2475c29c83500463040b64e558745eb329fe01e4b98b038136560ec2f4" Feb 19 23:53:02 crc kubenswrapper[4771]: I0219 23:53:02.042391 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-55t2d" Feb 19 23:53:02 crc kubenswrapper[4771]: I0219 23:53:02.081918 4771 scope.go:117] "RemoveContainer" containerID="908d28370e93e6940de18246e18e9875563dc3881d5788dcec84ebd64163ea01" Feb 19 23:53:02 crc kubenswrapper[4771]: I0219 23:53:02.101883 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-55t2d"] Feb 19 23:53:02 crc kubenswrapper[4771]: I0219 23:53:02.115413 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-55t2d"] Feb 19 23:53:02 crc kubenswrapper[4771]: I0219 23:53:02.119428 4771 scope.go:117] "RemoveContainer" containerID="ed5f9ae028bd3d07fd764b9bcaa6d0130cf6886950a67ed7f5d6dc2eff6216ac" Feb 19 23:53:02 crc kubenswrapper[4771]: I0219 23:53:02.451162 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" path="/var/lib/kubelet/pods/6c4fd759-bb28-42d3-9456-53e485094d96/volumes" Feb 19 23:53:07 crc kubenswrapper[4771]: I0219 23:53:07.122441 4771 generic.go:334] "Generic (PLEG): container finished" podID="24464846-7ee5-4dd2-8362-39c28ecaef08" containerID="a8319d79245b639bf559ed2f6b6de689fee113ab89de4de7b8d78c7df9a329c5" exitCode=0 Feb 19 23:53:07 crc kubenswrapper[4771]: I0219 23:53:07.122655 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-6648c" event={"ID":"24464846-7ee5-4dd2-8362-39c28ecaef08","Type":"ContainerDied","Data":"a8319d79245b639bf559ed2f6b6de689fee113ab89de4de7b8d78c7df9a329c5"} Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.799324 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.954959 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ssh-key-openstack-cell1\") pod \"24464846-7ee5-4dd2-8362-39c28ecaef08\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.955227 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzfjq\" (UniqueName: \"kubernetes.io/projected/24464846-7ee5-4dd2-8362-39c28ecaef08-kube-api-access-gzfjq\") pod \"24464846-7ee5-4dd2-8362-39c28ecaef08\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.955409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-2\") pod \"24464846-7ee5-4dd2-8362-39c28ecaef08\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.955433 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-telemetry-combined-ca-bundle\") pod \"24464846-7ee5-4dd2-8362-39c28ecaef08\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.955481 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-1\") pod \"24464846-7ee5-4dd2-8362-39c28ecaef08\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " Feb 19 23:53:08 crc 
kubenswrapper[4771]: I0219 23:53:08.955511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-0\") pod \"24464846-7ee5-4dd2-8362-39c28ecaef08\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.955569 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-inventory\") pod \"24464846-7ee5-4dd2-8362-39c28ecaef08\" (UID: \"24464846-7ee5-4dd2-8362-39c28ecaef08\") " Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.960749 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "24464846-7ee5-4dd2-8362-39c28ecaef08" (UID: "24464846-7ee5-4dd2-8362-39c28ecaef08"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.960868 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24464846-7ee5-4dd2-8362-39c28ecaef08-kube-api-access-gzfjq" (OuterVolumeSpecName: "kube-api-access-gzfjq") pod "24464846-7ee5-4dd2-8362-39c28ecaef08" (UID: "24464846-7ee5-4dd2-8362-39c28ecaef08"). InnerVolumeSpecName "kube-api-access-gzfjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.986281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "24464846-7ee5-4dd2-8362-39c28ecaef08" (UID: "24464846-7ee5-4dd2-8362-39c28ecaef08"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.987431 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "24464846-7ee5-4dd2-8362-39c28ecaef08" (UID: "24464846-7ee5-4dd2-8362-39c28ecaef08"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.994448 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "24464846-7ee5-4dd2-8362-39c28ecaef08" (UID: "24464846-7ee5-4dd2-8362-39c28ecaef08"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:53:08 crc kubenswrapper[4771]: I0219 23:53:08.995990 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "24464846-7ee5-4dd2-8362-39c28ecaef08" (UID: "24464846-7ee5-4dd2-8362-39c28ecaef08"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.004045 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-inventory" (OuterVolumeSpecName: "inventory") pod "24464846-7ee5-4dd2-8362-39c28ecaef08" (UID: "24464846-7ee5-4dd2-8362-39c28ecaef08"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.058037 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.058075 4771 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.058088 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.058097 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.058108 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.058116 4771 
reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/24464846-7ee5-4dd2-8362-39c28ecaef08-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.058126 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzfjq\" (UniqueName: \"kubernetes.io/projected/24464846-7ee5-4dd2-8362-39c28ecaef08-kube-api-access-gzfjq\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.141585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-6648c" event={"ID":"24464846-7ee5-4dd2-8362-39c28ecaef08","Type":"ContainerDied","Data":"927425728d37295e9488188a4d6d57de051b3f0257d069d4f84b5b48c32ae6ee"} Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.141619 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="927425728d37295e9488188a4d6d57de051b3f0257d069d4f84b5b48c32ae6ee" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.141680 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-6648c" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.312393 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-t97tm"] Feb 19 23:53:09 crc kubenswrapper[4771]: E0219 23:53:09.312902 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24464846-7ee5-4dd2-8362-39c28ecaef08" containerName="telemetry-openstack-openstack-cell1" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.312929 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="24464846-7ee5-4dd2-8362-39c28ecaef08" containerName="telemetry-openstack-openstack-cell1" Feb 19 23:53:09 crc kubenswrapper[4771]: E0219 23:53:09.312972 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" containerName="extract-utilities" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.312983 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" containerName="extract-utilities" Feb 19 23:53:09 crc kubenswrapper[4771]: E0219 23:53:09.312999 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" containerName="extract-content" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.313006 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" containerName="extract-content" Feb 19 23:53:09 crc kubenswrapper[4771]: E0219 23:53:09.313042 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" containerName="registry-server" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.313051 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" containerName="registry-server" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.313325 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="24464846-7ee5-4dd2-8362-39c28ecaef08" containerName="telemetry-openstack-openstack-cell1" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.313360 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4fd759-bb28-42d3-9456-53e485094d96" containerName="registry-server" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.314240 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.316482 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.316652 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.318241 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.318411 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.318748 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.326621 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-t97tm"] Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.468714 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.469768 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.469873 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.469896 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffq6k\" (UniqueName: \"kubernetes.io/projected/39a05c5e-7535-49be-888f-60ca7c4ec532-kube-api-access-ffq6k\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.469976 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.571524 
4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.571572 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffq6k\" (UniqueName: \"kubernetes.io/projected/39a05c5e-7535-49be-888f-60ca7c4ec532-kube-api-access-ffq6k\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.571629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.571726 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.571892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-combined-ca-bundle\") pod 
\"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.577450 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.577767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.578855 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.582709 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.600225 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ffq6k\" (UniqueName: \"kubernetes.io/projected/39a05c5e-7535-49be-888f-60ca7c4ec532-kube-api-access-ffq6k\") pod \"neutron-sriov-openstack-openstack-cell1-t97tm\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:09 crc kubenswrapper[4771]: I0219 23:53:09.631491 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:53:10 crc kubenswrapper[4771]: I0219 23:53:10.224970 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-t97tm"] Feb 19 23:53:11 crc kubenswrapper[4771]: I0219 23:53:11.167582 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" event={"ID":"39a05c5e-7535-49be-888f-60ca7c4ec532","Type":"ContainerStarted","Data":"ef113fa16eeb922dcacd31aabe5cd567cb92e1eae866ea3bee67db4e3fe2c836"} Feb 19 23:53:11 crc kubenswrapper[4771]: I0219 23:53:11.167942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" event={"ID":"39a05c5e-7535-49be-888f-60ca7c4ec532","Type":"ContainerStarted","Data":"ffd3e2a2429c077d4b4c57e554554ec8dddfd51eaed50008cebfc201e350bc11"} Feb 19 23:53:11 crc kubenswrapper[4771]: I0219 23:53:11.198974 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" podStartSLOduration=1.800015343 podStartE2EDuration="2.198955736s" podCreationTimestamp="2026-02-19 23:53:09 +0000 UTC" firstStartedPulling="2026-02-19 23:53:10.229949027 +0000 UTC m=+8690.501391497" lastFinishedPulling="2026-02-19 23:53:10.62888942 +0000 UTC m=+8690.900331890" observedRunningTime="2026-02-19 23:53:11.194132617 +0000 UTC m=+8691.465575097" watchObservedRunningTime="2026-02-19 23:53:11.198955736 +0000 
UTC m=+8691.470398216" Feb 19 23:53:12 crc kubenswrapper[4771]: I0219 23:53:12.957149 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:53:12 crc kubenswrapper[4771]: I0219 23:53:12.957485 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:53:12 crc kubenswrapper[4771]: I0219 23:53:12.957551 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 23:53:12 crc kubenswrapper[4771]: I0219 23:53:12.958731 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fa155cb0383a0fbdfae911759c38f2f628afe45f731553dbcbc941a641c2f0a"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:53:12 crc kubenswrapper[4771]: I0219 23:53:12.958842 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://7fa155cb0383a0fbdfae911759c38f2f628afe45f731553dbcbc941a641c2f0a" gracePeriod=600 Feb 19 23:53:13 crc kubenswrapper[4771]: I0219 23:53:13.201858 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" 
containerID="7fa155cb0383a0fbdfae911759c38f2f628afe45f731553dbcbc941a641c2f0a" exitCode=0 Feb 19 23:53:13 crc kubenswrapper[4771]: I0219 23:53:13.202382 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"7fa155cb0383a0fbdfae911759c38f2f628afe45f731553dbcbc941a641c2f0a"} Feb 19 23:53:13 crc kubenswrapper[4771]: I0219 23:53:13.202446 4771 scope.go:117] "RemoveContainer" containerID="8609fbe81e891a0243af6c64a25e46867f8e57dfe6da9de2d3abd74f11fd1293" Feb 19 23:53:14 crc kubenswrapper[4771]: I0219 23:53:14.218001 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11"} Feb 19 23:54:18 crc kubenswrapper[4771]: I0219 23:54:18.025976 4771 generic.go:334] "Generic (PLEG): container finished" podID="39a05c5e-7535-49be-888f-60ca7c4ec532" containerID="ef113fa16eeb922dcacd31aabe5cd567cb92e1eae866ea3bee67db4e3fe2c836" exitCode=0 Feb 19 23:54:18 crc kubenswrapper[4771]: I0219 23:54:18.026070 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" event={"ID":"39a05c5e-7535-49be-888f-60ca7c4ec532","Type":"ContainerDied","Data":"ef113fa16eeb922dcacd31aabe5cd567cb92e1eae866ea3bee67db4e3fe2c836"} Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.635173 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.788208 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-ssh-key-openstack-cell1\") pod \"39a05c5e-7535-49be-888f-60ca7c4ec532\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.788303 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffq6k\" (UniqueName: \"kubernetes.io/projected/39a05c5e-7535-49be-888f-60ca7c4ec532-kube-api-access-ffq6k\") pod \"39a05c5e-7535-49be-888f-60ca7c4ec532\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.788459 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-agent-neutron-config-0\") pod \"39a05c5e-7535-49be-888f-60ca7c4ec532\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.788532 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-inventory\") pod \"39a05c5e-7535-49be-888f-60ca7c4ec532\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.788598 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-combined-ca-bundle\") pod \"39a05c5e-7535-49be-888f-60ca7c4ec532\" (UID: \"39a05c5e-7535-49be-888f-60ca7c4ec532\") " Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 
23:54:19.794467 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a05c5e-7535-49be-888f-60ca7c4ec532-kube-api-access-ffq6k" (OuterVolumeSpecName: "kube-api-access-ffq6k") pod "39a05c5e-7535-49be-888f-60ca7c4ec532" (UID: "39a05c5e-7535-49be-888f-60ca7c4ec532"). InnerVolumeSpecName "kube-api-access-ffq6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.802932 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "39a05c5e-7535-49be-888f-60ca7c4ec532" (UID: "39a05c5e-7535-49be-888f-60ca7c4ec532"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.828074 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "39a05c5e-7535-49be-888f-60ca7c4ec532" (UID: "39a05c5e-7535-49be-888f-60ca7c4ec532"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.828909 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-inventory" (OuterVolumeSpecName: "inventory") pod "39a05c5e-7535-49be-888f-60ca7c4ec532" (UID: "39a05c5e-7535-49be-888f-60ca7c4ec532"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.829000 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "39a05c5e-7535-49be-888f-60ca7c4ec532" (UID: "39a05c5e-7535-49be-888f-60ca7c4ec532"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.892004 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.892059 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffq6k\" (UniqueName: \"kubernetes.io/projected/39a05c5e-7535-49be-888f-60ca7c4ec532-kube-api-access-ffq6k\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.892073 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.892089 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:19 crc kubenswrapper[4771]: I0219 23:54:19.892104 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a05c5e-7535-49be-888f-60ca7c4ec532-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 
23:54:20.052378 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" event={"ID":"39a05c5e-7535-49be-888f-60ca7c4ec532","Type":"ContainerDied","Data":"ffd3e2a2429c077d4b4c57e554554ec8dddfd51eaed50008cebfc201e350bc11"} Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.052758 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffd3e2a2429c077d4b4c57e554554ec8dddfd51eaed50008cebfc201e350bc11" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.052469 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-t97tm" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.162229 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-tszv4"] Feb 19 23:54:20 crc kubenswrapper[4771]: E0219 23:54:20.162733 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a05c5e-7535-49be-888f-60ca7c4ec532" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.162752 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a05c5e-7535-49be-888f-60ca7c4ec532" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.163000 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a05c5e-7535-49be-888f-60ca7c4ec532" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.163851 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.165692 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.166645 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.166767 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.167089 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.167571 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.198101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.198157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.198193 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm46d\" (UniqueName: \"kubernetes.io/projected/8d83e7d4-689a-4333-9566-348cae7948f2-kube-api-access-xm46d\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.198218 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.198326 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.206136 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-tszv4"] Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.299861 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm46d\" (UniqueName: \"kubernetes.io/projected/8d83e7d4-689a-4333-9566-348cae7948f2-kube-api-access-xm46d\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.299919 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.300503 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.300676 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.300819 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.304883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" 
(UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.305111 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.306956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.311496 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.319612 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm46d\" (UniqueName: \"kubernetes.io/projected/8d83e7d4-689a-4333-9566-348cae7948f2-kube-api-access-xm46d\") pod \"neutron-dhcp-openstack-openstack-cell1-tszv4\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" Feb 19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.503440 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx" Feb 
19 23:54:20 crc kubenswrapper[4771]: I0219 23:54:20.511059 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4"
Feb 19 23:54:21 crc kubenswrapper[4771]: I0219 23:54:21.123425 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-tszv4"]
Feb 19 23:54:21 crc kubenswrapper[4771]: I0219 23:54:21.560094 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 23:54:22 crc kubenswrapper[4771]: I0219 23:54:22.075633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" event={"ID":"8d83e7d4-689a-4333-9566-348cae7948f2","Type":"ContainerStarted","Data":"b2b626a1c08b504e182887110e376d2c4694b7109bd61a9f006c2883862144a6"}
Feb 19 23:54:22 crc kubenswrapper[4771]: I0219 23:54:22.076051 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" event={"ID":"8d83e7d4-689a-4333-9566-348cae7948f2","Type":"ContainerStarted","Data":"d2d02379922be2caf139f0af93dbef635c61b791bbef8529c4e7941dde8bdfec"}
Feb 19 23:54:22 crc kubenswrapper[4771]: I0219 23:54:22.117503 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" podStartSLOduration=1.703348885 podStartE2EDuration="2.117483055s" podCreationTimestamp="2026-02-19 23:54:20 +0000 UTC" firstStartedPulling="2026-02-19 23:54:21.143851352 +0000 UTC m=+8761.415293822" lastFinishedPulling="2026-02-19 23:54:21.557985522 +0000 UTC m=+8761.829427992" observedRunningTime="2026-02-19 23:54:22.100287195 +0000 UTC m=+8762.371729735" watchObservedRunningTime="2026-02-19 23:54:22.117483055 +0000 UTC m=+8762.388925535"
Feb 19 23:55:34 crc kubenswrapper[4771]: I0219 23:55:34.002985 4771 generic.go:334] "Generic (PLEG): container finished" podID="8d83e7d4-689a-4333-9566-348cae7948f2" containerID="b2b626a1c08b504e182887110e376d2c4694b7109bd61a9f006c2883862144a6" exitCode=0
Feb 19 23:55:34 crc kubenswrapper[4771]: I0219 23:55:34.003093 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" event={"ID":"8d83e7d4-689a-4333-9566-348cae7948f2","Type":"ContainerDied","Data":"b2b626a1c08b504e182887110e376d2c4694b7109bd61a9f006c2883862144a6"}
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.613443 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4"
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.685983 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm46d\" (UniqueName: \"kubernetes.io/projected/8d83e7d4-689a-4333-9566-348cae7948f2-kube-api-access-xm46d\") pod \"8d83e7d4-689a-4333-9566-348cae7948f2\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") "
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.686078 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-inventory\") pod \"8d83e7d4-689a-4333-9566-348cae7948f2\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") "
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.686477 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-agent-neutron-config-0\") pod \"8d83e7d4-689a-4333-9566-348cae7948f2\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") "
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.686533 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-combined-ca-bundle\") pod \"8d83e7d4-689a-4333-9566-348cae7948f2\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") "
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.686609 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-ssh-key-openstack-cell1\") pod \"8d83e7d4-689a-4333-9566-348cae7948f2\" (UID: \"8d83e7d4-689a-4333-9566-348cae7948f2\") "
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.695918 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "8d83e7d4-689a-4333-9566-348cae7948f2" (UID: "8d83e7d4-689a-4333-9566-348cae7948f2"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.698702 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d83e7d4-689a-4333-9566-348cae7948f2-kube-api-access-xm46d" (OuterVolumeSpecName: "kube-api-access-xm46d") pod "8d83e7d4-689a-4333-9566-348cae7948f2" (UID: "8d83e7d4-689a-4333-9566-348cae7948f2"). InnerVolumeSpecName "kube-api-access-xm46d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.732938 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-inventory" (OuterVolumeSpecName: "inventory") pod "8d83e7d4-689a-4333-9566-348cae7948f2" (UID: "8d83e7d4-689a-4333-9566-348cae7948f2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.733756 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8d83e7d4-689a-4333-9566-348cae7948f2" (UID: "8d83e7d4-689a-4333-9566-348cae7948f2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.751652 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "8d83e7d4-689a-4333-9566-348cae7948f2" (UID: "8d83e7d4-689a-4333-9566-348cae7948f2"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.789291 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm46d\" (UniqueName: \"kubernetes.io/projected/8d83e7d4-689a-4333-9566-348cae7948f2-kube-api-access-xm46d\") on node \"crc\" DevicePath \"\""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.789333 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.789350 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.789363 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:55:35 crc kubenswrapper[4771]: I0219 23:55:35.789376 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8d83e7d4-689a-4333-9566-348cae7948f2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 19 23:55:36 crc kubenswrapper[4771]: I0219 23:55:36.032541 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4" event={"ID":"8d83e7d4-689a-4333-9566-348cae7948f2","Type":"ContainerDied","Data":"d2d02379922be2caf139f0af93dbef635c61b791bbef8529c4e7941dde8bdfec"}
Feb 19 23:55:36 crc kubenswrapper[4771]: I0219 23:55:36.032578 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2d02379922be2caf139f0af93dbef635c61b791bbef8529c4e7941dde8bdfec"
Feb 19 23:55:36 crc kubenswrapper[4771]: I0219 23:55:36.032637 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-tszv4"
Feb 19 23:55:42 crc kubenswrapper[4771]: I0219 23:55:42.956665 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:55:42 crc kubenswrapper[4771]: I0219 23:55:42.957394 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.089084 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.089970 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="cb4fdebe-aba4-4847-ba12-c20a793ca719" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74" gracePeriod=30
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.654633 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.655342 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="2c591c46-e4c4-4402-a868-a4d4dde101b3" containerName="nova-cell1-conductor-conductor" containerID="cri-o://9e024790acb94f007987f7b27d3b53e1297a5172b31221baada6e80d27f6232f" gracePeriod=30
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.785269 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.785790 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="069fb38a-cbd9-4bb9-8441-682d97292af1" containerName="nova-scheduler-scheduler" containerID="cri-o://ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed" gracePeriod=30
Feb 19 23:56:06 crc kubenswrapper[4771]: E0219 23:56:06.806756 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.807372 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.807586 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-log" containerID="cri-o://6a5c974672114f685f86c13e4161538433eb963951f5922fc576ce2a839ec846" gracePeriod=30
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.807693 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-api" containerID="cri-o://dad4a04c947f88686d8a1958e783ebbcd808b0638945c37446ccb842f6bfda10" gracePeriod=30
Feb 19 23:56:06 crc kubenswrapper[4771]: E0219 23:56:06.816684 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 19 23:56:06 crc kubenswrapper[4771]: E0219 23:56:06.821557 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 19 23:56:06 crc kubenswrapper[4771]: E0219 23:56:06.821626 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="cb4fdebe-aba4-4847-ba12-c20a793ca719" containerName="nova-cell0-conductor-conductor"
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.834743 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.834963 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-log" containerID="cri-o://48e8dc4d58568b5589e042f4e32c0eb8d77975bd450146d13402de9361ed4871" gracePeriod=30
Feb 19 23:56:06 crc kubenswrapper[4771]: I0219 23:56:06.835110 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-metadata" containerID="cri-o://ad2946ab4ef001da23936de729acc2e1f8d7a36e735134c565d78c56c9146957" gracePeriod=30
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.159947 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"]
Feb 19 23:56:07 crc kubenswrapper[4771]: E0219 23:56:07.161666 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d83e7d4-689a-4333-9566-348cae7948f2" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.161777 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d83e7d4-689a-4333-9566-348cae7948f2" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.162085 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d83e7d4-689a-4333-9566-348cae7948f2" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.163099 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.170238 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.171832 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.171911 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-8pjmx"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.172009 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.173319 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.173574 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.174577 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.179237 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"]
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228557 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228739 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/078da17d-fe59-470f-9ab1-e265491c5997-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228772 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228834 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228872 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228902 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228935 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.228975 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4w79\" (UniqueName: \"kubernetes.io/projected/078da17d-fe59-470f-9ab1-e265491c5997-kube-api-access-r4w79\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.229004 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.229099 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330138 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330222 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/078da17d-fe59-470f-9ab1-e265491c5997-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330368 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330401 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330431 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330460 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330516 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330548 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330586 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4w79\" (UniqueName: \"kubernetes.io/projected/078da17d-fe59-470f-9ab1-e265491c5997-kube-api-access-r4w79\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.330615 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.332531 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/078da17d-fe59-470f-9ab1-e265491c5997-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.336270 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.336776 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.336787 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.337222 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.338041 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.338472 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.339392 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.340509 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.341712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.366746 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4w79\" (UniqueName: \"kubernetes.io/projected/078da17d-fe59-470f-9ab1-e265491c5997-kube-api-access-r4w79\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.462772 4771 generic.go:334] "Generic (PLEG): container finished" podID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerID="6a5c974672114f685f86c13e4161538433eb963951f5922fc576ce2a839ec846" exitCode=143
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.462871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b665f9c9-6923-456e-8e7e-01e55da9d3af","Type":"ContainerDied","Data":"6a5c974672114f685f86c13e4161538433eb963951f5922fc576ce2a839ec846"}
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.466389 4771 generic.go:334] "Generic (PLEG): container finished" podID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerID="48e8dc4d58568b5589e042f4e32c0eb8d77975bd450146d13402de9361ed4871" exitCode=143
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.466476 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81","Type":"ContainerDied","Data":"48e8dc4d58568b5589e042f4e32c0eb8d77975bd450146d13402de9361ed4871"}
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.469160 4771 generic.go:334] "Generic (PLEG): container finished" podID="2c591c46-e4c4-4402-a868-a4d4dde101b3" containerID="9e024790acb94f007987f7b27d3b53e1297a5172b31221baada6e80d27f6232f" exitCode=0
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.469213 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c591c46-e4c4-4402-a868-a4d4dde101b3","Type":"ContainerDied","Data":"9e024790acb94f007987f7b27d3b53e1297a5172b31221baada6e80d27f6232f"}
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.482621 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"
Feb 19 23:56:07 crc kubenswrapper[4771]: E0219 23:56:07.741831 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 23:56:07 crc kubenswrapper[4771]: E0219 23:56:07.743276 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.745170 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 23:56:07 crc kubenswrapper[4771]: E0219 23:56:07.745193 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 23:56:07 crc kubenswrapper[4771]: E0219 23:56:07.745235 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="069fb38a-cbd9-4bb9-8441-682d97292af1" containerName="nova-scheduler-scheduler"
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.845502 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr5m6\" (UniqueName: \"kubernetes.io/projected/2c591c46-e4c4-4402-a868-a4d4dde101b3-kube-api-access-tr5m6\") pod \"2c591c46-e4c4-4402-a868-a4d4dde101b3\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") "
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.845624 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-combined-ca-bundle\") pod \"2c591c46-e4c4-4402-a868-a4d4dde101b3\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") "
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.845752 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-config-data\") pod \"2c591c46-e4c4-4402-a868-a4d4dde101b3\" (UID: \"2c591c46-e4c4-4402-a868-a4d4dde101b3\") "
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.851308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c591c46-e4c4-4402-a868-a4d4dde101b3-kube-api-access-tr5m6" (OuterVolumeSpecName: "kube-api-access-tr5m6") pod "2c591c46-e4c4-4402-a868-a4d4dde101b3" (UID: "2c591c46-e4c4-4402-a868-a4d4dde101b3"). InnerVolumeSpecName "kube-api-access-tr5m6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.880968 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl"]
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.885349 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c591c46-e4c4-4402-a868-a4d4dde101b3" (UID: "2c591c46-e4c4-4402-a868-a4d4dde101b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.889103 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-config-data" (OuterVolumeSpecName: "config-data") pod "2c591c46-e4c4-4402-a868-a4d4dde101b3" (UID: "2c591c46-e4c4-4402-a868-a4d4dde101b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.948279 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.948318 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c591c46-e4c4-4402-a868-a4d4dde101b3-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:07 crc kubenswrapper[4771]: I0219 23:56:07.948331 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr5m6\" (UniqueName: \"kubernetes.io/projected/2c591c46-e4c4-4402-a868-a4d4dde101b3-kube-api-access-tr5m6\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.477890 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl" event={"ID":"078da17d-fe59-470f-9ab1-e265491c5997","Type":"ContainerStarted","Data":"15549d082b2ab8e5bef73db8d56c3cedd8eb46f6b9db239ae51ace6dd2dfe3cf"}
Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.479737 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2c591c46-e4c4-4402-a868-a4d4dde101b3","Type":"ContainerDied","Data":"a75e073b3a072396ba96f5f0f832cc1fb30a3bccc431da10e23e686b29d3e7fb"}
Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.479762 4771 scope.go:117] "RemoveContainer" containerID="9e024790acb94f007987f7b27d3b53e1297a5172b31221baada6e80d27f6232f"
Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.479814 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.510400 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.522007 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.535961 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:56:08 crc kubenswrapper[4771]: E0219 23:56:08.536476 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c591c46-e4c4-4402-a868-a4d4dde101b3" containerName="nova-cell1-conductor-conductor" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.536494 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c591c46-e4c4-4402-a868-a4d4dde101b3" containerName="nova-cell1-conductor-conductor" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.536668 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c591c46-e4c4-4402-a868-a4d4dde101b3" containerName="nova-cell1-conductor-conductor" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.537472 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.539576 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.546550 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.563651 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3056f313-9503-4088-afaa-ea7e28203a49-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.563875 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3056f313-9503-4088-afaa-ea7e28203a49-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.563999 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7q2w\" (UniqueName: \"kubernetes.io/projected/3056f313-9503-4088-afaa-ea7e28203a49-kube-api-access-w7q2w\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.665747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3056f313-9503-4088-afaa-ea7e28203a49-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc 
kubenswrapper[4771]: I0219 23:56:08.665886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3056f313-9503-4088-afaa-ea7e28203a49-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.665960 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7q2w\" (UniqueName: \"kubernetes.io/projected/3056f313-9503-4088-afaa-ea7e28203a49-kube-api-access-w7q2w\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.670898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3056f313-9503-4088-afaa-ea7e28203a49-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.673055 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3056f313-9503-4088-afaa-ea7e28203a49-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.683626 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7q2w\" (UniqueName: \"kubernetes.io/projected/3056f313-9503-4088-afaa-ea7e28203a49-kube-api-access-w7q2w\") pod \"nova-cell1-conductor-0\" (UID: \"3056f313-9503-4088-afaa-ea7e28203a49\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:08 crc kubenswrapper[4771]: I0219 23:56:08.856621 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:09 crc kubenswrapper[4771]: I0219 23:56:09.381845 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:56:09 crc kubenswrapper[4771]: I0219 23:56:09.491630 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3056f313-9503-4088-afaa-ea7e28203a49","Type":"ContainerStarted","Data":"c9abb65d0eff14ca19bab18ec999b043a0f439cf67f540fc7e3990cb680c51c5"} Feb 19 23:56:09 crc kubenswrapper[4771]: I0219 23:56:09.493208 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl" event={"ID":"078da17d-fe59-470f-9ab1-e265491c5997","Type":"ContainerStarted","Data":"8ee3e743d56e6e45e6945c3fa34f0e480eac2ea58abf26b3ed0010f914053b78"} Feb 19 23:56:09 crc kubenswrapper[4771]: I0219 23:56:09.516847 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl" podStartSLOduration=2.137911295 podStartE2EDuration="2.516830201s" podCreationTimestamp="2026-02-19 23:56:07 +0000 UTC" firstStartedPulling="2026-02-19 23:56:07.887219642 +0000 UTC m=+8868.158662102" lastFinishedPulling="2026-02-19 23:56:08.266138528 +0000 UTC m=+8868.537581008" observedRunningTime="2026-02-19 23:56:09.514475137 +0000 UTC m=+8869.785917617" watchObservedRunningTime="2026-02-19 23:56:09.516830201 +0000 UTC m=+8869.788272671" Feb 19 23:56:09 crc kubenswrapper[4771]: I0219 23:56:09.987059 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.103:8775/\": read tcp 10.217.0.2:42350->10.217.1.103:8775: read: connection reset by peer" Feb 19 23:56:09 crc kubenswrapper[4771]: I0219 23:56:09.987111 4771 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.103:8775/\": read tcp 10.217.0.2:42352->10.217.1.103:8775: read: connection reset by peer" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.479085 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c591c46-e4c4-4402-a868-a4d4dde101b3" path="/var/lib/kubelet/pods/2c591c46-e4c4-4402-a868-a4d4dde101b3/volumes" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.512191 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3056f313-9503-4088-afaa-ea7e28203a49","Type":"ContainerStarted","Data":"95d83f11acb85b3469a24222c850304f8d0e0bb03dd842ed823e0112e2e632d5"} Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.512429 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.533143 4771 generic.go:334] "Generic (PLEG): container finished" podID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerID="dad4a04c947f88686d8a1958e783ebbcd808b0638945c37446ccb842f6bfda10" exitCode=0 Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.533256 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b665f9c9-6923-456e-8e7e-01e55da9d3af","Type":"ContainerDied","Data":"dad4a04c947f88686d8a1958e783ebbcd808b0638945c37446ccb842f6bfda10"} Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.533281 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b665f9c9-6923-456e-8e7e-01e55da9d3af","Type":"ContainerDied","Data":"69dd320a71f64bd79198e5ffb2eb3f857f73532cc4923eb2cdc9819fc501d46a"} Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.533293 4771 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="69dd320a71f64bd79198e5ffb2eb3f857f73532cc4923eb2cdc9819fc501d46a" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.535401 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.535385416 podStartE2EDuration="2.535385416s" podCreationTimestamp="2026-02-19 23:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:56:10.525701077 +0000 UTC m=+8870.797143557" watchObservedRunningTime="2026-02-19 23:56:10.535385416 +0000 UTC m=+8870.806827876" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.545417 4771 generic.go:334] "Generic (PLEG): container finished" podID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerID="ad2946ab4ef001da23936de729acc2e1f8d7a36e735134c565d78c56c9146957" exitCode=0 Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.546317 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81","Type":"ContainerDied","Data":"ad2946ab4ef001da23936de729acc2e1f8d7a36e735134c565d78c56c9146957"} Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.577762 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.587969 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.612869 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl9xz\" (UniqueName: \"kubernetes.io/projected/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-kube-api-access-dl9xz\") pod \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.612964 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-config-data\") pod \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613014 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-logs\") pod \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613069 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-nova-metadata-tls-certs\") pod \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613089 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-public-tls-certs\") pod \"b665f9c9-6923-456e-8e7e-01e55da9d3af\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613140 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-config-data\") pod \"b665f9c9-6923-456e-8e7e-01e55da9d3af\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-combined-ca-bundle\") pod \"b665f9c9-6923-456e-8e7e-01e55da9d3af\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613244 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-combined-ca-bundle\") pod \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\" (UID: \"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613289 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd67m\" (UniqueName: \"kubernetes.io/projected/b665f9c9-6923-456e-8e7e-01e55da9d3af-kube-api-access-kd67m\") pod \"b665f9c9-6923-456e-8e7e-01e55da9d3af\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613305 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b665f9c9-6923-456e-8e7e-01e55da9d3af-logs\") pod \"b665f9c9-6923-456e-8e7e-01e55da9d3af\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613340 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-internal-tls-certs\") pod \"b665f9c9-6923-456e-8e7e-01e55da9d3af\" (UID: \"b665f9c9-6923-456e-8e7e-01e55da9d3af\") " Feb 19 23:56:10 crc 
kubenswrapper[4771]: I0219 23:56:10.613488 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-logs" (OuterVolumeSpecName: "logs") pod "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" (UID: "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.613988 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.614875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b665f9c9-6923-456e-8e7e-01e55da9d3af-logs" (OuterVolumeSpecName: "logs") pod "b665f9c9-6923-456e-8e7e-01e55da9d3af" (UID: "b665f9c9-6923-456e-8e7e-01e55da9d3af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.623139 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-kube-api-access-dl9xz" (OuterVolumeSpecName: "kube-api-access-dl9xz") pod "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" (UID: "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81"). InnerVolumeSpecName "kube-api-access-dl9xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.641821 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b665f9c9-6923-456e-8e7e-01e55da9d3af-kube-api-access-kd67m" (OuterVolumeSpecName: "kube-api-access-kd67m") pod "b665f9c9-6923-456e-8e7e-01e55da9d3af" (UID: "b665f9c9-6923-456e-8e7e-01e55da9d3af"). InnerVolumeSpecName "kube-api-access-kd67m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.686074 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b665f9c9-6923-456e-8e7e-01e55da9d3af" (UID: "b665f9c9-6923-456e-8e7e-01e55da9d3af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.690261 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-config-data" (OuterVolumeSpecName: "config-data") pod "b665f9c9-6923-456e-8e7e-01e55da9d3af" (UID: "b665f9c9-6923-456e-8e7e-01e55da9d3af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.716355 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd67m\" (UniqueName: \"kubernetes.io/projected/b665f9c9-6923-456e-8e7e-01e55da9d3af-kube-api-access-kd67m\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.716385 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b665f9c9-6923-456e-8e7e-01e55da9d3af-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.716394 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl9xz\" (UniqueName: \"kubernetes.io/projected/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-kube-api-access-dl9xz\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.716404 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-config-data\") on node \"crc\" DevicePath 
\"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.716413 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.722842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-config-data" (OuterVolumeSpecName: "config-data") pod "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" (UID: "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.724907 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" (UID: "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.735219 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b665f9c9-6923-456e-8e7e-01e55da9d3af" (UID: "b665f9c9-6923-456e-8e7e-01e55da9d3af"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.771501 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" (UID: "d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.773772 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b665f9c9-6923-456e-8e7e-01e55da9d3af" (UID: "b665f9c9-6923-456e-8e7e-01e55da9d3af"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.819438 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.819474 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.819484 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.819493 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:10 crc kubenswrapper[4771]: I0219 23:56:10.819504 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b665f9c9-6923-456e-8e7e-01e55da9d3af-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.144649 4771 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.229948 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-config-data\") pod \"cb4fdebe-aba4-4847-ba12-c20a793ca719\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.230108 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttt98\" (UniqueName: \"kubernetes.io/projected/cb4fdebe-aba4-4847-ba12-c20a793ca719-kube-api-access-ttt98\") pod \"cb4fdebe-aba4-4847-ba12-c20a793ca719\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.230307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-combined-ca-bundle\") pod \"cb4fdebe-aba4-4847-ba12-c20a793ca719\" (UID: \"cb4fdebe-aba4-4847-ba12-c20a793ca719\") " Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.252221 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4fdebe-aba4-4847-ba12-c20a793ca719-kube-api-access-ttt98" (OuterVolumeSpecName: "kube-api-access-ttt98") pod "cb4fdebe-aba4-4847-ba12-c20a793ca719" (UID: "cb4fdebe-aba4-4847-ba12-c20a793ca719"). InnerVolumeSpecName "kube-api-access-ttt98". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.259768 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-config-data" (OuterVolumeSpecName: "config-data") pod "cb4fdebe-aba4-4847-ba12-c20a793ca719" (UID: "cb4fdebe-aba4-4847-ba12-c20a793ca719"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.266279 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb4fdebe-aba4-4847-ba12-c20a793ca719" (UID: "cb4fdebe-aba4-4847-ba12-c20a793ca719"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.332256 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.332284 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4fdebe-aba4-4847-ba12-c20a793ca719-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.332294 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttt98\" (UniqueName: \"kubernetes.io/projected/cb4fdebe-aba4-4847-ba12-c20a793ca719-kube-api-access-ttt98\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.554675 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81","Type":"ContainerDied","Data":"ea78ba87e6c4e734ea3dcaa663905d09b4454b40fc80062194ae696d7b848117"} Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.554723 4771 scope.go:117] "RemoveContainer" containerID="ad2946ab4ef001da23936de729acc2e1f8d7a36e735134c565d78c56c9146957" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.554855 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.564207 4771 generic.go:334] "Generic (PLEG): container finished" podID="cb4fdebe-aba4-4847-ba12-c20a793ca719" containerID="a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74" exitCode=0 Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.564270 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cb4fdebe-aba4-4847-ba12-c20a793ca719","Type":"ContainerDied","Data":"a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74"} Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.564300 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.564303 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.564312 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cb4fdebe-aba4-4847-ba12-c20a793ca719","Type":"ContainerDied","Data":"2f855b1f474a3afda26bb4b3c15130befb5929b6908ab3f9a55768143388e8ec"} Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.594140 4771 scope.go:117] "RemoveContainer" containerID="48e8dc4d58568b5589e042f4e32c0eb8d77975bd450146d13402de9361ed4871" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.622082 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.626344 4771 scope.go:117] "RemoveContainer" containerID="a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.634767 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: 
I0219 23:56:11.665314 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: E0219 23:56:11.665753 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-log" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.665771 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-log" Feb 19 23:56:11 crc kubenswrapper[4771]: E0219 23:56:11.665787 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4fdebe-aba4-4847-ba12-c20a793ca719" containerName="nova-cell0-conductor-conductor" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.665794 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4fdebe-aba4-4847-ba12-c20a793ca719" containerName="nova-cell0-conductor-conductor" Feb 19 23:56:11 crc kubenswrapper[4771]: E0219 23:56:11.665808 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-metadata" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.665813 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-metadata" Feb 19 23:56:11 crc kubenswrapper[4771]: E0219 23:56:11.665826 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-log" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.665831 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-log" Feb 19 23:56:11 crc kubenswrapper[4771]: E0219 23:56:11.665850 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-api" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 
23:56:11.665855 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-api" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.666042 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-metadata" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.666053 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4fdebe-aba4-4847-ba12-c20a793ca719" containerName="nova-cell0-conductor-conductor" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.666064 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-log" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.666073 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" containerName="nova-api-api" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.666084 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" containerName="nova-metadata-log" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.666735 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.684429 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.684496 4771 scope.go:117] "RemoveContainer" containerID="a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.684575 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: E0219 23:56:11.685304 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74\": container with ID starting with a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74 not found: ID does not exist" containerID="a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.685355 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74"} err="failed to get container status \"a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74\": rpc error: code = NotFound desc = could not find container \"a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74\": container with ID starting with a19d343ed877063935b4f4300b15c5476e13eb763cfc0aedbaf1273078261e74 not found: ID does not exist" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.702600 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.728331 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 
23:56:11.761632 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.774996 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.776907 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.780673 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.780704 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.791955 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.804472 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.806397 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.808337 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.809839 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.810005 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.813388 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.846918 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.847116 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.847336 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6bb\" (UniqueName: \"kubernetes.io/projected/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-kube-api-access-4d6bb\") pod \"nova-cell0-conductor-0\" (UID: \"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.855914 4771 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.949368 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.949491 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71b1081b-1c98-4691-980a-7035c4996dc3-logs\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.949899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.949951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950043 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6bb\" (UniqueName: \"kubernetes.io/projected/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-kube-api-access-4d6bb\") pod \"nova-cell0-conductor-0\" (UID: \"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950070 
4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-config-data\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950103 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx4tw\" (UniqueName: \"kubernetes.io/projected/af59e577-5041-4237-9405-d2bc65fa79f2-kube-api-access-vx4tw\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950330 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950450 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-config-data\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950621 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950695 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl5h7\" (UniqueName: \"kubernetes.io/projected/71b1081b-1c98-4691-980a-7035c4996dc3-kube-api-access-hl5h7\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.950717 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af59e577-5041-4237-9405-d2bc65fa79f2-logs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.957711 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.967370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6bb\" (UniqueName: \"kubernetes.io/projected/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-kube-api-access-4d6bb\") pod \"nova-cell0-conductor-0\" (UID: 
\"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:11 crc kubenswrapper[4771]: I0219 23:56:11.967551 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d41e848-b21f-4f38-96a2-dc7ae49ffb0b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.044365 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052577 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052632 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-config-data\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052718 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af59e577-5041-4237-9405-d2bc65fa79f2-logs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " 
pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052734 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl5h7\" (UniqueName: \"kubernetes.io/projected/71b1081b-1c98-4691-980a-7035c4996dc3-kube-api-access-hl5h7\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052767 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052787 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71b1081b-1c98-4691-980a-7035c4996dc3-logs\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052822 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052854 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-config-data\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052873 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.052921 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx4tw\" (UniqueName: \"kubernetes.io/projected/af59e577-5041-4237-9405-d2bc65fa79f2-kube-api-access-vx4tw\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.053740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71b1081b-1c98-4691-980a-7035c4996dc3-logs\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.054113 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af59e577-5041-4237-9405-d2bc65fa79f2-logs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.056374 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-public-tls-certs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.056857 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.056994 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.057009 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-config-data\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.059387 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b1081b-1c98-4691-980a-7035c4996dc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.062668 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.063337 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af59e577-5041-4237-9405-d2bc65fa79f2-config-data\") pod \"nova-api-0\" (UID: \"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.071709 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx4tw\" (UniqueName: \"kubernetes.io/projected/af59e577-5041-4237-9405-d2bc65fa79f2-kube-api-access-vx4tw\") pod \"nova-api-0\" (UID: 
\"af59e577-5041-4237-9405-d2bc65fa79f2\") " pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.074536 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl5h7\" (UniqueName: \"kubernetes.io/projected/71b1081b-1c98-4691-980a-7035c4996dc3-kube-api-access-hl5h7\") pod \"nova-metadata-0\" (UID: \"71b1081b-1c98-4691-980a-7035c4996dc3\") " pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.092538 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.149104 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.452762 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b665f9c9-6923-456e-8e7e-01e55da9d3af" path="/var/lib/kubelet/pods/b665f9c9-6923-456e-8e7e-01e55da9d3af/volumes" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.454510 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4fdebe-aba4-4847-ba12-c20a793ca719" path="/var/lib/kubelet/pods/cb4fdebe-aba4-4847-ba12-c20a793ca719/volumes" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.455403 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81" path="/var/lib/kubelet/pods/d2c0051f-548d-4a65-b4c1-1c7ccaa3ae81/volumes" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.580566 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:56:12 crc kubenswrapper[4771]: W0219 23:56:12.685861 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b1081b_1c98_4691_980a_7035c4996dc3.slice/crio-3721089bda518d54d5003cbeea5dcc10c8b4b64e4b1d01b24f87e3462f9f35e9 
WatchSource:0}: Error finding container 3721089bda518d54d5003cbeea5dcc10c8b4b64e4b1d01b24f87e3462f9f35e9: Status 404 returned error can't find the container with id 3721089bda518d54d5003cbeea5dcc10c8b4b64e4b1d01b24f87e3462f9f35e9 Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.689673 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:56:12 crc kubenswrapper[4771]: E0219 23:56:12.713245 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:56:12 crc kubenswrapper[4771]: E0219 23:56:12.714636 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:56:12 crc kubenswrapper[4771]: E0219 23:56:12.715903 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:56:12 crc kubenswrapper[4771]: E0219 23:56:12.715957 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="069fb38a-cbd9-4bb9-8441-682d97292af1" containerName="nova-scheduler-scheduler" Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.733816 
4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:56:12 crc kubenswrapper[4771]: W0219 23:56:12.734328 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf59e577_5041_4237_9405_d2bc65fa79f2.slice/crio-524c0c8d03fdcf3ac7409b5797f8f160153b20dcae2ff6af06e9f22952e1844d WatchSource:0}: Error finding container 524c0c8d03fdcf3ac7409b5797f8f160153b20dcae2ff6af06e9f22952e1844d: Status 404 returned error can't find the container with id 524c0c8d03fdcf3ac7409b5797f8f160153b20dcae2ff6af06e9f22952e1844d Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.957035 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:56:12 crc kubenswrapper[4771]: I0219 23:56:12.957105 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.599822 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b","Type":"ContainerStarted","Data":"b1a237f42f81744f4b699bacfe3d0e30f7808cac4ea277f59fc40c374c4fc028"} Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.600116 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6d41e848-b21f-4f38-96a2-dc7ae49ffb0b","Type":"ContainerStarted","Data":"f44304008d0fec0ca08bef724f890ff91235449d735cac56bc4486dc55f33afd"} Feb 19 
23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.600156 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.601997 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af59e577-5041-4237-9405-d2bc65fa79f2","Type":"ContainerStarted","Data":"3125a61249d11a031e5bd84cf79559bc050786b60f00240c3a03a3435f2765a1"} Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.602043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af59e577-5041-4237-9405-d2bc65fa79f2","Type":"ContainerStarted","Data":"cb90b9999b5bb7cf5540ef570369285d51c61a0fc00de9df71be26b7f38871d6"} Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.602054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"af59e577-5041-4237-9405-d2bc65fa79f2","Type":"ContainerStarted","Data":"524c0c8d03fdcf3ac7409b5797f8f160153b20dcae2ff6af06e9f22952e1844d"} Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.604036 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71b1081b-1c98-4691-980a-7035c4996dc3","Type":"ContainerStarted","Data":"5374035f265bab540a624a49bf339352bce9f4b52266426f735e1435c4fd66e7"} Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.604061 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71b1081b-1c98-4691-980a-7035c4996dc3","Type":"ContainerStarted","Data":"f51c0bfd98d4b257a4d894a2d4c859e7cfbe8acea1bf7cc36e18913c992e0036"} Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.604069 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71b1081b-1c98-4691-980a-7035c4996dc3","Type":"ContainerStarted","Data":"3721089bda518d54d5003cbeea5dcc10c8b4b64e4b1d01b24f87e3462f9f35e9"} Feb 19 23:56:13 crc 
kubenswrapper[4771]: I0219 23:56:13.620935 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.620918011 podStartE2EDuration="2.620918011s" podCreationTimestamp="2026-02-19 23:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:56:13.611659883 +0000 UTC m=+8873.883102363" watchObservedRunningTime="2026-02-19 23:56:13.620918011 +0000 UTC m=+8873.892360481"
Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.644658 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.644640967 podStartE2EDuration="2.644640967s" podCreationTimestamp="2026-02-19 23:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:56:13.62834294 +0000 UTC m=+8873.899785430" watchObservedRunningTime="2026-02-19 23:56:13.644640967 +0000 UTC m=+8873.916083437"
Feb 19 23:56:13 crc kubenswrapper[4771]: I0219 23:56:13.654012 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.653992847 podStartE2EDuration="2.653992847s" podCreationTimestamp="2026-02-19 23:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:56:13.646426424 +0000 UTC m=+8873.917868904" watchObservedRunningTime="2026-02-19 23:56:13.653992847 +0000 UTC m=+8873.925435317"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.093100 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.093800 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.093828 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.213677 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.305370 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-combined-ca-bundle\") pod \"069fb38a-cbd9-4bb9-8441-682d97292af1\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") "
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.305463 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-config-data\") pod \"069fb38a-cbd9-4bb9-8441-682d97292af1\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") "
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.305504 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlrsd\" (UniqueName: \"kubernetes.io/projected/069fb38a-cbd9-4bb9-8441-682d97292af1-kube-api-access-vlrsd\") pod \"069fb38a-cbd9-4bb9-8441-682d97292af1\" (UID: \"069fb38a-cbd9-4bb9-8441-682d97292af1\") "
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.312688 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069fb38a-cbd9-4bb9-8441-682d97292af1-kube-api-access-vlrsd" (OuterVolumeSpecName: "kube-api-access-vlrsd") pod "069fb38a-cbd9-4bb9-8441-682d97292af1" (UID: "069fb38a-cbd9-4bb9-8441-682d97292af1"). InnerVolumeSpecName "kube-api-access-vlrsd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.344900 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "069fb38a-cbd9-4bb9-8441-682d97292af1" (UID: "069fb38a-cbd9-4bb9-8441-682d97292af1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.357421 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-config-data" (OuterVolumeSpecName: "config-data") pod "069fb38a-cbd9-4bb9-8441-682d97292af1" (UID: "069fb38a-cbd9-4bb9-8441-682d97292af1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.408408 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.408444 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069fb38a-cbd9-4bb9-8441-682d97292af1-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.408454 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlrsd\" (UniqueName: \"kubernetes.io/projected/069fb38a-cbd9-4bb9-8441-682d97292af1-kube-api-access-vlrsd\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.665241 4771 generic.go:334] "Generic (PLEG): container finished" podID="069fb38a-cbd9-4bb9-8441-682d97292af1" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed" exitCode=0
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.665367 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.665402 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"069fb38a-cbd9-4bb9-8441-682d97292af1","Type":"ContainerDied","Data":"ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed"}
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.667740 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"069fb38a-cbd9-4bb9-8441-682d97292af1","Type":"ContainerDied","Data":"33eed73d7be7a0e3e3e7d8a1c0c32eacfe66ccc7860fb13bbe489a57908588ef"}
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.667819 4771 scope.go:117] "RemoveContainer" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.721617 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.729055 4771 scope.go:117] "RemoveContainer" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed"
Feb 19 23:56:17 crc kubenswrapper[4771]: E0219 23:56:17.731258 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed\": container with ID starting with ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed not found: ID does not exist" containerID="ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.731306 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed"} err="failed to get container status \"ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed\": rpc error: code = NotFound desc = could not find container \"ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed\": container with ID starting with ba88e296a97471a92dc4dc3af05eeccd9e8d493366ee61771c1d7eb92168f6ed not found: ID does not exist"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.743825 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.752501 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:56:17 crc kubenswrapper[4771]: E0219 23:56:17.753098 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069fb38a-cbd9-4bb9-8441-682d97292af1" containerName="nova-scheduler-scheduler"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.753118 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="069fb38a-cbd9-4bb9-8441-682d97292af1" containerName="nova-scheduler-scheduler"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.753399 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="069fb38a-cbd9-4bb9-8441-682d97292af1" containerName="nova-scheduler-scheduler"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.754270 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.758065 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.761587 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.820232 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e25fc39c-df45-4537-b973-ec200e5b7a00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.820291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e25fc39c-df45-4537-b973-ec200e5b7a00-config-data\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.820393 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txx4z\" (UniqueName: \"kubernetes.io/projected/e25fc39c-df45-4537-b973-ec200e5b7a00-kube-api-access-txx4z\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.922978 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txx4z\" (UniqueName: \"kubernetes.io/projected/e25fc39c-df45-4537-b973-ec200e5b7a00-kube-api-access-txx4z\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.923509 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e25fc39c-df45-4537-b973-ec200e5b7a00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.923646 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e25fc39c-df45-4537-b973-ec200e5b7a00-config-data\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.930693 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e25fc39c-df45-4537-b973-ec200e5b7a00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.931063 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e25fc39c-df45-4537-b973-ec200e5b7a00-config-data\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:17 crc kubenswrapper[4771]: I0219 23:56:17.957758 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txx4z\" (UniqueName: \"kubernetes.io/projected/e25fc39c-df45-4537-b973-ec200e5b7a00-kube-api-access-txx4z\") pod \"nova-scheduler-0\" (UID: \"e25fc39c-df45-4537-b973-ec200e5b7a00\") " pod="openstack/nova-scheduler-0"
Feb 19 23:56:18 crc kubenswrapper[4771]: I0219 23:56:18.133393 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 23:56:18 crc kubenswrapper[4771]: I0219 23:56:18.447483 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="069fb38a-cbd9-4bb9-8441-682d97292af1" path="/var/lib/kubelet/pods/069fb38a-cbd9-4bb9-8441-682d97292af1/volumes"
Feb 19 23:56:18 crc kubenswrapper[4771]: I0219 23:56:18.620075 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:56:18 crc kubenswrapper[4771]: W0219 23:56:18.623666 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode25fc39c_df45_4537_b973_ec200e5b7a00.slice/crio-fed4415b723f9eff6c164768cace18ff647c01812ef1c2b924f7316d67d5785c WatchSource:0}: Error finding container fed4415b723f9eff6c164768cace18ff647c01812ef1c2b924f7316d67d5785c: Status 404 returned error can't find the container with id fed4415b723f9eff6c164768cace18ff647c01812ef1c2b924f7316d67d5785c
Feb 19 23:56:18 crc kubenswrapper[4771]: I0219 23:56:18.678922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e25fc39c-df45-4537-b973-ec200e5b7a00","Type":"ContainerStarted","Data":"fed4415b723f9eff6c164768cace18ff647c01812ef1c2b924f7316d67d5785c"}
Feb 19 23:56:18 crc kubenswrapper[4771]: I0219 23:56:18.903120 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 19 23:56:19 crc kubenswrapper[4771]: I0219 23:56:19.705639 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e25fc39c-df45-4537-b973-ec200e5b7a00","Type":"ContainerStarted","Data":"89b5edb7fc90cc8958eba74b3bcffbebbc4c0982f5049efbb48c7a612b312315"}
Feb 19 23:56:19 crc kubenswrapper[4771]: I0219 23:56:19.728271 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.728243997 podStartE2EDuration="2.728243997s" podCreationTimestamp="2026-02-19 23:56:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:56:19.725166774 +0000 UTC m=+8879.996609254" watchObservedRunningTime="2026-02-19 23:56:19.728243997 +0000 UTC m=+8879.999686497"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.093522 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.095518 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.150183 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.150531 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.604244 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gzqlz"]
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.608911 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.638082 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gzqlz"]
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.652856 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4jzd\" (UniqueName: \"kubernetes.io/projected/52b36fd0-a585-4757-88e1-01bda6410c0f-kube-api-access-r4jzd\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.653094 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-catalog-content\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.653249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-utilities\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.755651 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-catalog-content\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.755834 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-utilities\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.755967 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jzd\" (UniqueName: \"kubernetes.io/projected/52b36fd0-a585-4757-88e1-01bda6410c0f-kube-api-access-r4jzd\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.756377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-catalog-content\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.756406 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-utilities\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.775268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jzd\" (UniqueName: \"kubernetes.io/projected/52b36fd0-a585-4757-88e1-01bda6410c0f-kube-api-access-r4jzd\") pod \"certified-operators-gzqlz\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") " pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:22 crc kubenswrapper[4771]: I0219 23:56:22.944038 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:23 crc kubenswrapper[4771]: I0219 23:56:23.112511 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71b1081b-1c98-4691-980a-7035c4996dc3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 23:56:23 crc kubenswrapper[4771]: I0219 23:56:23.112835 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71b1081b-1c98-4691-980a-7035c4996dc3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 23:56:23 crc kubenswrapper[4771]: I0219 23:56:23.134129 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 23:56:23 crc kubenswrapper[4771]: I0219 23:56:23.176814 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="af59e577-5041-4237-9405-d2bc65fa79f2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 23:56:23 crc kubenswrapper[4771]: I0219 23:56:23.176927 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="af59e577-5041-4237-9405-d2bc65fa79f2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 23:56:23 crc kubenswrapper[4771]: W0219 23:56:23.531429 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b36fd0_a585_4757_88e1_01bda6410c0f.slice/crio-5a4e1ed1d21cf25bd996d2c67659c45453e9a0def4e9639bc97dcebdb3a091bf WatchSource:0}: Error finding container 5a4e1ed1d21cf25bd996d2c67659c45453e9a0def4e9639bc97dcebdb3a091bf: Status 404 returned error can't find the container with id 5a4e1ed1d21cf25bd996d2c67659c45453e9a0def4e9639bc97dcebdb3a091bf
Feb 19 23:56:23 crc kubenswrapper[4771]: I0219 23:56:23.531549 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gzqlz"]
Feb 19 23:56:23 crc kubenswrapper[4771]: I0219 23:56:23.756631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gzqlz" event={"ID":"52b36fd0-a585-4757-88e1-01bda6410c0f","Type":"ContainerStarted","Data":"5a4e1ed1d21cf25bd996d2c67659c45453e9a0def4e9639bc97dcebdb3a091bf"}
Feb 19 23:56:24 crc kubenswrapper[4771]: I0219 23:56:24.773603 4771 generic.go:334] "Generic (PLEG): container finished" podID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerID="a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e" exitCode=0
Feb 19 23:56:24 crc kubenswrapper[4771]: I0219 23:56:24.773826 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gzqlz" event={"ID":"52b36fd0-a585-4757-88e1-01bda6410c0f","Type":"ContainerDied","Data":"a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e"}
Feb 19 23:56:25 crc kubenswrapper[4771]: I0219 23:56:25.792296 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gzqlz" event={"ID":"52b36fd0-a585-4757-88e1-01bda6410c0f","Type":"ContainerStarted","Data":"8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c"}
Feb 19 23:56:27 crc kubenswrapper[4771]: I0219 23:56:27.817725 4771 generic.go:334] "Generic (PLEG): container finished" podID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerID="8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c" exitCode=0
Feb 19 23:56:27 crc kubenswrapper[4771]: I0219 23:56:27.817854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gzqlz" event={"ID":"52b36fd0-a585-4757-88e1-01bda6410c0f","Type":"ContainerDied","Data":"8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c"}
Feb 19 23:56:27 crc kubenswrapper[4771]: I0219 23:56:27.957481 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mr8r2"]
Feb 19 23:56:27 crc kubenswrapper[4771]: I0219 23:56:27.961391 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.001779 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr8r2"]
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.007207 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-catalog-content\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.007299 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-utilities\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.007382 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqxps\" (UniqueName: \"kubernetes.io/projected/bfc8f91e-9ac4-4582-9296-101ac7e7c052-kube-api-access-jqxps\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.109614 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-catalog-content\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.109714 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-utilities\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.109803 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxps\" (UniqueName: \"kubernetes.io/projected/bfc8f91e-9ac4-4582-9296-101ac7e7c052-kube-api-access-jqxps\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.110107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-catalog-content\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.110567 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-utilities\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.135408 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.151572 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxps\" (UniqueName: \"kubernetes.io/projected/bfc8f91e-9ac4-4582-9296-101ac7e7c052-kube-api-access-jqxps\") pod \"redhat-marketplace-mr8r2\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.185369 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.287514 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr8r2"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.633274 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr8r2"]
Feb 19 23:56:28 crc kubenswrapper[4771]: W0219 23:56:28.634476 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfc8f91e_9ac4_4582_9296_101ac7e7c052.slice/crio-7bac783546a21ed6b24d9ecbd49af89fca34144966ba82232c6ecace19eb6c41 WatchSource:0}: Error finding container 7bac783546a21ed6b24d9ecbd49af89fca34144966ba82232c6ecace19eb6c41: Status 404 returned error can't find the container with id 7bac783546a21ed6b24d9ecbd49af89fca34144966ba82232c6ecace19eb6c41
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.827882 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr8r2" event={"ID":"bfc8f91e-9ac4-4582-9296-101ac7e7c052","Type":"ContainerStarted","Data":"7bac783546a21ed6b24d9ecbd49af89fca34144966ba82232c6ecace19eb6c41"}
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.834290 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gzqlz" event={"ID":"52b36fd0-a585-4757-88e1-01bda6410c0f","Type":"ContainerStarted","Data":"62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6"}
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.861633 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gzqlz" podStartSLOduration=3.403744078 podStartE2EDuration="6.861610124s" podCreationTimestamp="2026-02-19 23:56:22 +0000 UTC" firstStartedPulling="2026-02-19 23:56:24.776696897 +0000 UTC m=+8885.048139377" lastFinishedPulling="2026-02-19 23:56:28.234562943 +0000 UTC m=+8888.506005423" observedRunningTime="2026-02-19 23:56:28.851458082 +0000 UTC m=+8889.122900552" watchObservedRunningTime="2026-02-19 23:56:28.861610124 +0000 UTC m=+8889.133052604"
Feb 19 23:56:28 crc kubenswrapper[4771]: I0219 23:56:28.867945 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 23:56:29 crc kubenswrapper[4771]: I0219 23:56:29.847457 4771 generic.go:334] "Generic (PLEG): container finished" podID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerID="ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f" exitCode=0
Feb 19 23:56:29 crc kubenswrapper[4771]: I0219 23:56:29.847584 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr8r2" event={"ID":"bfc8f91e-9ac4-4582-9296-101ac7e7c052","Type":"ContainerDied","Data":"ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f"}
Feb 19 23:56:31 crc kubenswrapper[4771]: I0219 23:56:31.877098 4771 generic.go:334] "Generic (PLEG): container finished" podID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerID="615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772" exitCode=0
Feb 19 23:56:31 crc kubenswrapper[4771]: I0219 23:56:31.877170 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr8r2" event={"ID":"bfc8f91e-9ac4-4582-9296-101ac7e7c052","Type":"ContainerDied","Data":"615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772"}
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.105198 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.105635 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.110916 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.113406 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.160983 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.161504 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.174904 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.187599 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.887548 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr8r2" event={"ID":"bfc8f91e-9ac4-4582-9296-101ac7e7c052","Type":"ContainerStarted","Data":"64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f"}
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.888011 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.902757 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.929536 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mr8r2" podStartSLOduration=3.479855868 podStartE2EDuration="5.929509016s" podCreationTimestamp="2026-02-19 23:56:27 +0000 UTC" firstStartedPulling="2026-02-19 23:56:29.850780692 +0000 UTC m=+8890.122223162" lastFinishedPulling="2026-02-19 23:56:32.30043384 +0000 UTC m=+8892.571876310" observedRunningTime="2026-02-19 23:56:32.909132561 +0000 UTC m=+8893.180575111" watchObservedRunningTime="2026-02-19 23:56:32.929509016 +0000 UTC m=+8893.200951516"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.945216 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:32 crc kubenswrapper[4771]: I0219 23:56:32.945569 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:33 crc kubenswrapper[4771]: I0219 23:56:33.023790 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:33 crc kubenswrapper[4771]: I0219 23:56:33.973499 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:35 crc kubenswrapper[4771]: I0219 23:56:35.141956 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gzqlz"]
Feb 19 23:56:36 crc kubenswrapper[4771]: I0219 23:56:36.947124 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gzqlz" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerName="registry-server" containerID="cri-o://62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6" gracePeriod=2
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.489342 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gzqlz"
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.536129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4jzd\" (UniqueName: \"kubernetes.io/projected/52b36fd0-a585-4757-88e1-01bda6410c0f-kube-api-access-r4jzd\") pod \"52b36fd0-a585-4757-88e1-01bda6410c0f\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") "
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.536368 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-catalog-content\") pod \"52b36fd0-a585-4757-88e1-01bda6410c0f\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") "
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.536432 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-utilities\") pod \"52b36fd0-a585-4757-88e1-01bda6410c0f\" (UID: \"52b36fd0-a585-4757-88e1-01bda6410c0f\") "
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.537747 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-utilities" (OuterVolumeSpecName: "utilities") pod "52b36fd0-a585-4757-88e1-01bda6410c0f" (UID: "52b36fd0-a585-4757-88e1-01bda6410c0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.548957 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b36fd0-a585-4757-88e1-01bda6410c0f-kube-api-access-r4jzd" (OuterVolumeSpecName: "kube-api-access-r4jzd") pod "52b36fd0-a585-4757-88e1-01bda6410c0f" (UID: "52b36fd0-a585-4757-88e1-01bda6410c0f"). InnerVolumeSpecName "kube-api-access-r4jzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.604001 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52b36fd0-a585-4757-88e1-01bda6410c0f" (UID: "52b36fd0-a585-4757-88e1-01bda6410c0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.639109 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.639143 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b36fd0-a585-4757-88e1-01bda6410c0f-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.639155 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4jzd\" (UniqueName: \"kubernetes.io/projected/52b36fd0-a585-4757-88e1-01bda6410c0f-kube-api-access-r4jzd\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.965862 4771 generic.go:334] "Generic (PLEG): container finished" podID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerID="62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6" exitCode=0
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.965977 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gzqlz" event={"ID":"52b36fd0-a585-4757-88e1-01bda6410c0f","Type":"ContainerDied","Data":"62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6"}
Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.966049 4771 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openshift-marketplace/certified-operators-gzqlz" event={"ID":"52b36fd0-a585-4757-88e1-01bda6410c0f","Type":"ContainerDied","Data":"5a4e1ed1d21cf25bd996d2c67659c45453e9a0def4e9639bc97dcebdb3a091bf"} Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.966069 4771 scope.go:117] "RemoveContainer" containerID="62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6" Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.966163 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gzqlz" Feb 19 23:56:37 crc kubenswrapper[4771]: I0219 23:56:37.990706 4771 scope.go:117] "RemoveContainer" containerID="8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.029110 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gzqlz"] Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.041665 4771 scope.go:117] "RemoveContainer" containerID="a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.042640 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gzqlz"] Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.068757 4771 scope.go:117] "RemoveContainer" containerID="62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6" Feb 19 23:56:38 crc kubenswrapper[4771]: E0219 23:56:38.069270 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6\": container with ID starting with 62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6 not found: ID does not exist" containerID="62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 
23:56:38.069328 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6"} err="failed to get container status \"62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6\": rpc error: code = NotFound desc = could not find container \"62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6\": container with ID starting with 62b8bd1d0a4d79dcbb2fa3ecc9b7fceec3c23f63924901d47d527e1913d7e4a6 not found: ID does not exist" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.069363 4771 scope.go:117] "RemoveContainer" containerID="8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c" Feb 19 23:56:38 crc kubenswrapper[4771]: E0219 23:56:38.069781 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c\": container with ID starting with 8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c not found: ID does not exist" containerID="8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.069876 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c"} err="failed to get container status \"8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c\": rpc error: code = NotFound desc = could not find container \"8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c\": container with ID starting with 8bc4ae5e22bb5ed40368f750f4cfe568f30c67f3a598b9e5d797a2f2b8bbed1c not found: ID does not exist" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.069932 4771 scope.go:117] "RemoveContainer" containerID="a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e" Feb 19 23:56:38 crc 
kubenswrapper[4771]: E0219 23:56:38.070350 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e\": container with ID starting with a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e not found: ID does not exist" containerID="a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.070399 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e"} err="failed to get container status \"a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e\": rpc error: code = NotFound desc = could not find container \"a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e\": container with ID starting with a78b6e4849605229f42c90fb08545cc943a5ea649e3b67c52a4ed0b89ba8fc2e not found: ID does not exist" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.288067 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mr8r2" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.288119 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mr8r2" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.362943 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mr8r2" Feb 19 23:56:38 crc kubenswrapper[4771]: I0219 23:56:38.459972 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" path="/var/lib/kubelet/pods/52b36fd0-a585-4757-88e1-01bda6410c0f/volumes" Feb 19 23:56:39 crc kubenswrapper[4771]: I0219 23:56:39.058000 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-mr8r2" Feb 19 23:56:40 crc kubenswrapper[4771]: I0219 23:56:40.747729 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr8r2"] Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.014395 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mr8r2" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerName="registry-server" containerID="cri-o://64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f" gracePeriod=2 Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.638723 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr8r2" Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.745303 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-utilities\") pod \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.745478 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-catalog-content\") pod \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.745624 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqxps\" (UniqueName: \"kubernetes.io/projected/bfc8f91e-9ac4-4582-9296-101ac7e7c052-kube-api-access-jqxps\") pod \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\" (UID: \"bfc8f91e-9ac4-4582-9296-101ac7e7c052\") " Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.747118 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-utilities" (OuterVolumeSpecName: "utilities") pod "bfc8f91e-9ac4-4582-9296-101ac7e7c052" (UID: "bfc8f91e-9ac4-4582-9296-101ac7e7c052"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.766071 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc8f91e-9ac4-4582-9296-101ac7e7c052-kube-api-access-jqxps" (OuterVolumeSpecName: "kube-api-access-jqxps") pod "bfc8f91e-9ac4-4582-9296-101ac7e7c052" (UID: "bfc8f91e-9ac4-4582-9296-101ac7e7c052"). InnerVolumeSpecName "kube-api-access-jqxps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.774874 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfc8f91e-9ac4-4582-9296-101ac7e7c052" (UID: "bfc8f91e-9ac4-4582-9296-101ac7e7c052"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.848638 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.848676 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfc8f91e-9ac4-4582-9296-101ac7e7c052-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:41 crc kubenswrapper[4771]: I0219 23:56:41.848692 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqxps\" (UniqueName: \"kubernetes.io/projected/bfc8f91e-9ac4-4582-9296-101ac7e7c052-kube-api-access-jqxps\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.028743 4771 generic.go:334] "Generic (PLEG): container finished" podID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerID="64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f" exitCode=0 Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.028789 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr8r2" event={"ID":"bfc8f91e-9ac4-4582-9296-101ac7e7c052","Type":"ContainerDied","Data":"64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f"} Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.028820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr8r2" event={"ID":"bfc8f91e-9ac4-4582-9296-101ac7e7c052","Type":"ContainerDied","Data":"7bac783546a21ed6b24d9ecbd49af89fca34144966ba82232c6ecace19eb6c41"} Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.028842 4771 scope.go:117] "RemoveContainer" containerID="64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 
23:56:42.028901 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr8r2" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.061125 4771 scope.go:117] "RemoveContainer" containerID="615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.082539 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr8r2"] Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.095481 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr8r2"] Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.109358 4771 scope.go:117] "RemoveContainer" containerID="ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.156459 4771 scope.go:117] "RemoveContainer" containerID="64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f" Feb 19 23:56:42 crc kubenswrapper[4771]: E0219 23:56:42.157311 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f\": container with ID starting with 64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f not found: ID does not exist" containerID="64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.157343 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f"} err="failed to get container status \"64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f\": rpc error: code = NotFound desc = could not find container \"64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f\": container with ID starting with 
64d768df0246cc22b3471b83080f8d8c23748698690460b3f940e64a00af6f6f not found: ID does not exist" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.157363 4771 scope.go:117] "RemoveContainer" containerID="615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772" Feb 19 23:56:42 crc kubenswrapper[4771]: E0219 23:56:42.157790 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772\": container with ID starting with 615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772 not found: ID does not exist" containerID="615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.157906 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772"} err="failed to get container status \"615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772\": rpc error: code = NotFound desc = could not find container \"615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772\": container with ID starting with 615d63fa12494be7ba707db7f496c88fd0efca516fb3c4a9a33c7cc364d7f772 not found: ID does not exist" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.157962 4771 scope.go:117] "RemoveContainer" containerID="ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f" Feb 19 23:56:42 crc kubenswrapper[4771]: E0219 23:56:42.158317 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f\": container with ID starting with ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f not found: ID does not exist" containerID="ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f" Feb 19 23:56:42 crc 
kubenswrapper[4771]: I0219 23:56:42.158345 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f"} err="failed to get container status \"ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f\": rpc error: code = NotFound desc = could not find container \"ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f\": container with ID starting with ce58d75ad284799dc81733b41ca8b517861e4e2285d9902040c2945615cc5f7f not found: ID does not exist" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.454476 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" path="/var/lib/kubelet/pods/bfc8f91e-9ac4-4582-9296-101ac7e7c052/volumes" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.956918 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.957048 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.957141 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.960898 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:56:42 crc kubenswrapper[4771]: I0219 23:56:42.961069 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" gracePeriod=600 Feb 19 23:56:43 crc kubenswrapper[4771]: E0219 23:56:43.090642 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:56:44 crc kubenswrapper[4771]: I0219 23:56:44.058404 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" exitCode=0 Feb 19 23:56:44 crc kubenswrapper[4771]: I0219 23:56:44.058507 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11"} Feb 19 23:56:44 crc kubenswrapper[4771]: I0219 23:56:44.058879 4771 scope.go:117] "RemoveContainer" containerID="7fa155cb0383a0fbdfae911759c38f2f628afe45f731553dbcbc941a641c2f0a" Feb 19 23:56:44 crc kubenswrapper[4771]: I0219 23:56:44.059814 4771 
scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:56:44 crc kubenswrapper[4771]: E0219 23:56:44.060308 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:56:46 crc kubenswrapper[4771]: I0219 23:56:46.931097 4771 scope.go:117] "RemoveContainer" containerID="dad4a04c947f88686d8a1958e783ebbcd808b0638945c37446ccb842f6bfda10" Feb 19 23:56:46 crc kubenswrapper[4771]: I0219 23:56:46.974144 4771 scope.go:117] "RemoveContainer" containerID="6a5c974672114f685f86c13e4161538433eb963951f5922fc576ce2a839ec846" Feb 19 23:56:55 crc kubenswrapper[4771]: I0219 23:56:55.437499 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:56:55 crc kubenswrapper[4771]: E0219 23:56:55.438635 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:57:06 crc kubenswrapper[4771]: I0219 23:57:06.437582 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:57:06 crc kubenswrapper[4771]: E0219 23:57:06.438506 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:57:19 crc kubenswrapper[4771]: I0219 23:57:19.438315 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:57:19 crc kubenswrapper[4771]: E0219 23:57:19.439352 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:57:34 crc kubenswrapper[4771]: I0219 23:57:34.439486 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:57:34 crc kubenswrapper[4771]: E0219 23:57:34.440760 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:57:45 crc kubenswrapper[4771]: I0219 23:57:45.438861 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:57:45 crc kubenswrapper[4771]: E0219 23:57:45.439588 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.516866 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qdg5s"] Feb 19 23:57:52 crc kubenswrapper[4771]: E0219 23:57:52.518054 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerName="registry-server" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.518076 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerName="registry-server" Feb 19 23:57:52 crc kubenswrapper[4771]: E0219 23:57:52.518104 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerName="registry-server" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.518117 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerName="registry-server" Feb 19 23:57:52 crc kubenswrapper[4771]: E0219 23:57:52.518156 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerName="extract-utilities" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.518169 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerName="extract-utilities" Feb 19 23:57:52 crc kubenswrapper[4771]: E0219 23:57:52.518197 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerName="extract-content" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.518211 4771 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerName="extract-content" Feb 19 23:57:52 crc kubenswrapper[4771]: E0219 23:57:52.518251 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerName="extract-content" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.518265 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerName="extract-content" Feb 19 23:57:52 crc kubenswrapper[4771]: E0219 23:57:52.518311 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerName="extract-utilities" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.518323 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerName="extract-utilities" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.518670 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc8f91e-9ac4-4582-9296-101ac7e7c052" containerName="registry-server" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.518715 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b36fd0-a585-4757-88e1-01bda6410c0f" containerName="registry-server" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.525217 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.538807 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qdg5s"] Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.627258 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-catalog-content\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.627770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jtbs\" (UniqueName: \"kubernetes.io/projected/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-kube-api-access-5jtbs\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.628004 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-utilities\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.730522 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jtbs\" (UniqueName: \"kubernetes.io/projected/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-kube-api-access-5jtbs\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.730698 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-utilities\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.730754 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-catalog-content\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.731435 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-utilities\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.731597 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-catalog-content\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.758799 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jtbs\" (UniqueName: \"kubernetes.io/projected/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-kube-api-access-5jtbs\") pod \"redhat-operators-qdg5s\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:52 crc kubenswrapper[4771]: I0219 23:57:52.877053 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:57:53 crc kubenswrapper[4771]: I0219 23:57:53.391930 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qdg5s"] Feb 19 23:57:53 crc kubenswrapper[4771]: I0219 23:57:53.858703 4771 generic.go:334] "Generic (PLEG): container finished" podID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerID="9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250" exitCode=0 Feb 19 23:57:53 crc kubenswrapper[4771]: I0219 23:57:53.858986 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdg5s" event={"ID":"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e","Type":"ContainerDied","Data":"9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250"} Feb 19 23:57:53 crc kubenswrapper[4771]: I0219 23:57:53.859009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdg5s" event={"ID":"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e","Type":"ContainerStarted","Data":"68f5660c1d5ae9bda7db7fe0d3a3b2a1dba7e6e2c5130d527268ef5714d2cbad"} Feb 19 23:57:53 crc kubenswrapper[4771]: I0219 23:57:53.861009 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:57:55 crc kubenswrapper[4771]: I0219 23:57:55.887415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdg5s" event={"ID":"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e","Type":"ContainerStarted","Data":"df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd"} Feb 19 23:57:59 crc kubenswrapper[4771]: I0219 23:57:59.437645 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:57:59 crc kubenswrapper[4771]: E0219 23:57:59.438554 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:57:59 crc kubenswrapper[4771]: I0219 23:57:59.940801 4771 generic.go:334] "Generic (PLEG): container finished" podID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerID="df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd" exitCode=0 Feb 19 23:57:59 crc kubenswrapper[4771]: I0219 23:57:59.940853 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdg5s" event={"ID":"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e","Type":"ContainerDied","Data":"df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd"} Feb 19 23:58:01 crc kubenswrapper[4771]: I0219 23:58:01.991132 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdg5s" event={"ID":"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e","Type":"ContainerStarted","Data":"d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12"} Feb 19 23:58:02 crc kubenswrapper[4771]: I0219 23:58:02.023780 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qdg5s" podStartSLOduration=2.896092464 podStartE2EDuration="10.023727681s" podCreationTimestamp="2026-02-19 23:57:52 +0000 UTC" firstStartedPulling="2026-02-19 23:57:53.86082037 +0000 UTC m=+8974.132262840" lastFinishedPulling="2026-02-19 23:58:00.988455587 +0000 UTC m=+8981.259898057" observedRunningTime="2026-02-19 23:58:02.012449658 +0000 UTC m=+8982.283892158" watchObservedRunningTime="2026-02-19 23:58:02.023727681 +0000 UTC m=+8982.295170181" Feb 19 23:58:02 crc kubenswrapper[4771]: I0219 23:58:02.877918 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:58:02 crc kubenswrapper[4771]: I0219 23:58:02.878231 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:58:03 crc kubenswrapper[4771]: I0219 23:58:03.967230 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qdg5s" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="registry-server" probeResult="failure" output=< Feb 19 23:58:03 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 19 23:58:03 crc kubenswrapper[4771]: > Feb 19 23:58:11 crc kubenswrapper[4771]: I0219 23:58:11.437928 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:58:11 crc kubenswrapper[4771]: E0219 23:58:11.440279 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:58:12 crc kubenswrapper[4771]: I0219 23:58:12.948282 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:58:13 crc kubenswrapper[4771]: I0219 23:58:13.019925 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:58:13 crc kubenswrapper[4771]: I0219 23:58:13.207345 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qdg5s"] Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.123725 4771 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-qdg5s" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="registry-server" containerID="cri-o://d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12" gracePeriod=2 Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.709759 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.795400 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-utilities\") pod \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.795499 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jtbs\" (UniqueName: \"kubernetes.io/projected/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-kube-api-access-5jtbs\") pod \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.795617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-catalog-content\") pod \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\" (UID: \"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e\") " Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.796617 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-utilities" (OuterVolumeSpecName: "utilities") pod "2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" (UID: "2d878ffc-64ac-4997-9a5e-02c1eecdcc5e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.804773 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-kube-api-access-5jtbs" (OuterVolumeSpecName: "kube-api-access-5jtbs") pod "2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" (UID: "2d878ffc-64ac-4997-9a5e-02c1eecdcc5e"). InnerVolumeSpecName "kube-api-access-5jtbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.908871 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.908915 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jtbs\" (UniqueName: \"kubernetes.io/projected/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-kube-api-access-5jtbs\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:14 crc kubenswrapper[4771]: I0219 23:58:14.920449 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" (UID: "2d878ffc-64ac-4997-9a5e-02c1eecdcc5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.010471 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.136937 4771 generic.go:334] "Generic (PLEG): container finished" podID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerID="d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12" exitCode=0 Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.136997 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdg5s" event={"ID":"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e","Type":"ContainerDied","Data":"d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12"} Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.137051 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdg5s" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.137068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdg5s" event={"ID":"2d878ffc-64ac-4997-9a5e-02c1eecdcc5e","Type":"ContainerDied","Data":"68f5660c1d5ae9bda7db7fe0d3a3b2a1dba7e6e2c5130d527268ef5714d2cbad"} Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.137097 4771 scope.go:117] "RemoveContainer" containerID="d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.162281 4771 scope.go:117] "RemoveContainer" containerID="df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.192107 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qdg5s"] Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.198267 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qdg5s"] Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.213971 4771 scope.go:117] "RemoveContainer" containerID="9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.268651 4771 scope.go:117] "RemoveContainer" containerID="d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12" Feb 19 23:58:15 crc kubenswrapper[4771]: E0219 23:58:15.269132 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12\": container with ID starting with d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12 not found: ID does not exist" containerID="d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.269172 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12"} err="failed to get container status \"d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12\": rpc error: code = NotFound desc = could not find container \"d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12\": container with ID starting with d1f25159ff4e8194ccf1db8629f0885f499c9a5b051ba4f4e6892aeb8b52ab12 not found: ID does not exist" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.269197 4771 scope.go:117] "RemoveContainer" containerID="df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd" Feb 19 23:58:15 crc kubenswrapper[4771]: E0219 23:58:15.269554 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd\": container with ID starting with df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd not found: ID does not exist" containerID="df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.269615 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd"} err="failed to get container status \"df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd\": rpc error: code = NotFound desc = could not find container \"df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd\": container with ID starting with df048ae3c75f7bff58e75d7279cda4cd42387869c782cb0801ad08d17940e9fd not found: ID does not exist" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.269647 4771 scope.go:117] "RemoveContainer" containerID="9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250" Feb 19 23:58:15 crc kubenswrapper[4771]: E0219 
23:58:15.270154 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250\": container with ID starting with 9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250 not found: ID does not exist" containerID="9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250" Feb 19 23:58:15 crc kubenswrapper[4771]: I0219 23:58:15.270212 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250"} err="failed to get container status \"9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250\": rpc error: code = NotFound desc = could not find container \"9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250\": container with ID starting with 9f967b474b9a832c9207c7d3dcf9942ff92f6d7dbe35245aee0374e83e894250 not found: ID does not exist" Feb 19 23:58:16 crc kubenswrapper[4771]: I0219 23:58:16.450415 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" path="/var/lib/kubelet/pods/2d878ffc-64ac-4997-9a5e-02c1eecdcc5e/volumes" Feb 19 23:58:26 crc kubenswrapper[4771]: I0219 23:58:26.437582 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:58:26 crc kubenswrapper[4771]: E0219 23:58:26.438368 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:58:41 crc kubenswrapper[4771]: I0219 23:58:41.437214 
4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:58:41 crc kubenswrapper[4771]: E0219 23:58:41.438145 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:58:53 crc kubenswrapper[4771]: I0219 23:58:53.437899 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:58:53 crc kubenswrapper[4771]: E0219 23:58:53.438716 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:59:07 crc kubenswrapper[4771]: I0219 23:59:07.436919 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:59:07 crc kubenswrapper[4771]: E0219 23:59:07.437843 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:59:22 crc kubenswrapper[4771]: I0219 
23:59:22.438123 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:59:22 crc kubenswrapper[4771]: E0219 23:59:22.439582 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:59:33 crc kubenswrapper[4771]: I0219 23:59:33.437851 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:59:33 crc kubenswrapper[4771]: E0219 23:59:33.438545 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 19 23:59:38 crc kubenswrapper[4771]: I0219 23:59:38.243147 4771 generic.go:334] "Generic (PLEG): container finished" podID="078da17d-fe59-470f-9ab1-e265491c5997" containerID="8ee3e743d56e6e45e6945c3fa34f0e480eac2ea58abf26b3ed0010f914053b78" exitCode=0 Feb 19 23:59:38 crc kubenswrapper[4771]: I0219 23:59:38.243234 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl" event={"ID":"078da17d-fe59-470f-9ab1-e265491c5997","Type":"ContainerDied","Data":"8ee3e743d56e6e45e6945c3fa34f0e480eac2ea58abf26b3ed0010f914053b78"} Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.765574 4771 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770096 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770146 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-combined-ca-bundle\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770199 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-ssh-key-openstack-cell1\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770231 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-0\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-3\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 
19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770301 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/078da17d-fe59-470f-9ab1-e265491c5997-nova-cells-global-config-0\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770318 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4w79\" (UniqueName: \"kubernetes.io/projected/078da17d-fe59-470f-9ab1-e265491c5997-kube-api-access-r4w79\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770340 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-0\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770371 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-2\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770392 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-inventory\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.770416 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-1\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.775317 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.776465 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/078da17d-fe59-470f-9ab1-e265491c5997-kube-api-access-r4w79" (OuterVolumeSpecName: "kube-api-access-r4w79") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "kube-api-access-r4w79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.829539 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.832452 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/078da17d-fe59-470f-9ab1-e265491c5997-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.838601 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.841837 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.851222 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-inventory" (OuterVolumeSpecName: "inventory") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.858243 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.860289 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872742 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/078da17d-fe59-470f-9ab1-e265491c5997-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872766 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4w79\" (UniqueName: \"kubernetes.io/projected/078da17d-fe59-470f-9ab1-e265491c5997-kube-api-access-r4w79\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872774 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872782 4771 reconciler_common.go:293] "Volume detached for volume 
\"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872810 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872820 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872830 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872845 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.872889 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:39 crc kubenswrapper[4771]: E0219 23:59:39.877330 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1 podName:078da17d-fe59-470f-9ab1-e265491c5997 nodeName:}" failed. No retries permitted until 2026-02-19 23:59:40.377304525 +0000 UTC m=+9080.648746985 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "nova-migration-ssh-key-1" (UniqueName: "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997") : error deleting /var/lib/kubelet/pods/078da17d-fe59-470f-9ab1-e265491c5997/volume-subpaths: remove /var/lib/kubelet/pods/078da17d-fe59-470f-9ab1-e265491c5997/volume-subpaths: no such file or directory Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.880197 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:39 crc kubenswrapper[4771]: I0219 23:59:39.974578 4771 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:40 crc kubenswrapper[4771]: I0219 23:59:40.272471 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl" event={"ID":"078da17d-fe59-470f-9ab1-e265491c5997","Type":"ContainerDied","Data":"15549d082b2ab8e5bef73db8d56c3cedd8eb46f6b9db239ae51ace6dd2dfe3cf"} Feb 19 23:59:40 crc kubenswrapper[4771]: I0219 23:59:40.272854 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15549d082b2ab8e5bef73db8d56c3cedd8eb46f6b9db239ae51ace6dd2dfe3cf" Feb 19 23:59:40 crc kubenswrapper[4771]: I0219 23:59:40.272945 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl" Feb 19 23:59:40 crc kubenswrapper[4771]: I0219 23:59:40.383609 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1\") pod \"078da17d-fe59-470f-9ab1-e265491c5997\" (UID: \"078da17d-fe59-470f-9ab1-e265491c5997\") " Feb 19 23:59:40 crc kubenswrapper[4771]: I0219 23:59:40.387257 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "078da17d-fe59-470f-9ab1-e265491c5997" (UID: "078da17d-fe59-470f-9ab1-e265491c5997"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:59:40 crc kubenswrapper[4771]: I0219 23:59:40.486091 4771 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/078da17d-fe59-470f-9ab1-e265491c5997-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:59:48 crc kubenswrapper[4771]: I0219 23:59:48.437161 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 19 23:59:48 crc kubenswrapper[4771]: E0219 23:59:48.438977 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.152850 4771 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell0-db-purge-29525760-xhvxn"] Feb 20 00:00:00 crc kubenswrapper[4771]: E0220 00:00:00.153868 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.153884 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4771]: E0220 00:00:00.153927 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="extract-content" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.153935 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="extract-content" Feb 20 00:00:00 crc kubenswrapper[4771]: E0220 00:00:00.153952 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078da17d-fe59-470f-9ab1-e265491c5997" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.153961 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="078da17d-fe59-470f-9ab1-e265491c5997" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 20 00:00:00 crc kubenswrapper[4771]: E0220 00:00:00.153994 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="extract-utilities" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.154003 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="extract-utilities" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.154290 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="078da17d-fe59-470f-9ab1-e265491c5997" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 
20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.154307 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d878ffc-64ac-4997-9a5e-02c1eecdcc5e" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.155232 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.158313 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.169061 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-purge-29525760-wg5bq"] Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.170537 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.172517 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.195601 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29525760-wg5bq"] Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.195672 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29525760-xhvxn"] Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.254159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.254223 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-config-data\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.254402 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-scripts\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.254486 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4qh5\" (UniqueName: \"kubernetes.io/projected/92423829-83ed-4681-8053-35aa047de5dd-kube-api-access-v4qh5\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.286147 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27"] Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.287571 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.289972 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29525760-xljdh"] Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.291418 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.291994 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.292228 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.292840 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.293242 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.306502 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29525760-xljdh"] Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.316946 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27"] Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.355755 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.355799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-config-data\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " 
pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.355875 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-scripts\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.355923 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-config-data\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.355955 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-scripts\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.356003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4qh5\" (UniqueName: \"kubernetes.io/projected/92423829-83ed-4681-8053-35aa047de5dd-kube-api-access-v4qh5\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.356083 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmg2p\" (UniqueName: \"kubernetes.io/projected/b7ea1045-2fa9-4fde-b5c8-73704c77d222-kube-api-access-lmg2p\") pod \"nova-cell1-db-purge-29525760-wg5bq\" 
(UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.356115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.361648 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.364352 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-config-data\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.366924 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-scripts\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.387225 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4qh5\" (UniqueName: \"kubernetes.io/projected/92423829-83ed-4681-8053-35aa047de5dd-kube-api-access-v4qh5\") pod \"nova-cell0-db-purge-29525760-xhvxn\" (UID: 
\"92423829-83ed-4681-8053-35aa047de5dd\") " pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.449279 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 20 00:00:00 crc kubenswrapper[4771]: E0220 00:00:00.449655 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.457599 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/356e6991-3883-4124-a24b-7d662e50b107-serviceca\") pod \"image-pruner-29525760-xljdh\" (UID: \"356e6991-3883-4124-a24b-7d662e50b107\") " pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.457684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmg2p\" (UniqueName: \"kubernetes.io/projected/b7ea1045-2fa9-4fde-b5c8-73704c77d222-kube-api-access-lmg2p\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.457782 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" 
Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.457971 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75cc9bca-2c37-491f-8611-eb3a0c698f1b-secret-volume\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.458067 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75cc9bca-2c37-491f-8611-eb3a0c698f1b-config-volume\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.459874 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxgqb\" (UniqueName: \"kubernetes.io/projected/75cc9bca-2c37-491f-8611-eb3a0c698f1b-kube-api-access-nxgqb\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.459997 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-scripts\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.460157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-config-data\") pod 
\"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.460230 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk8bn\" (UniqueName: \"kubernetes.io/projected/356e6991-3883-4124-a24b-7d662e50b107-kube-api-access-hk8bn\") pod \"image-pruner-29525760-xljdh\" (UID: \"356e6991-3883-4124-a24b-7d662e50b107\") " pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.463073 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.463176 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-scripts\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.463585 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-config-data\") pod \"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.475373 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmg2p\" (UniqueName: \"kubernetes.io/projected/b7ea1045-2fa9-4fde-b5c8-73704c77d222-kube-api-access-lmg2p\") pod 
\"nova-cell1-db-purge-29525760-wg5bq\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.562450 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk8bn\" (UniqueName: \"kubernetes.io/projected/356e6991-3883-4124-a24b-7d662e50b107-kube-api-access-hk8bn\") pod \"image-pruner-29525760-xljdh\" (UID: \"356e6991-3883-4124-a24b-7d662e50b107\") " pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.562544 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/356e6991-3883-4124-a24b-7d662e50b107-serviceca\") pod \"image-pruner-29525760-xljdh\" (UID: \"356e6991-3883-4124-a24b-7d662e50b107\") " pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.562675 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75cc9bca-2c37-491f-8611-eb3a0c698f1b-secret-volume\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.562710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75cc9bca-2c37-491f-8611-eb3a0c698f1b-config-volume\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.562743 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxgqb\" (UniqueName: 
\"kubernetes.io/projected/75cc9bca-2c37-491f-8611-eb3a0c698f1b-kube-api-access-nxgqb\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.564881 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75cc9bca-2c37-491f-8611-eb3a0c698f1b-config-volume\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.568135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/356e6991-3883-4124-a24b-7d662e50b107-serviceca\") pod \"image-pruner-29525760-xljdh\" (UID: \"356e6991-3883-4124-a24b-7d662e50b107\") " pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.569665 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75cc9bca-2c37-491f-8611-eb3a0c698f1b-secret-volume\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.580463 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk8bn\" (UniqueName: \"kubernetes.io/projected/356e6991-3883-4124-a24b-7d662e50b107-kube-api-access-hk8bn\") pod \"image-pruner-29525760-xljdh\" (UID: \"356e6991-3883-4124-a24b-7d662e50b107\") " pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.584598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-nxgqb\" (UniqueName: \"kubernetes.io/projected/75cc9bca-2c37-491f-8611-eb3a0c698f1b-kube-api-access-nxgqb\") pod \"collect-profiles-29525760-cjn27\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.691849 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.707111 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.769626 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:00 crc kubenswrapper[4771]: I0220 00:00:00.790586 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.192221 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29525760-xhvxn"] Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.276395 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29525760-wg5bq"] Feb 20 00:00:01 crc kubenswrapper[4771]: W0220 00:00:01.278251 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7ea1045_2fa9_4fde_b5c8_73704c77d222.slice/crio-54191f0864570beb8e8254a792075e4d25a34c2579d331cf1508500071de1c57 WatchSource:0}: Error finding container 54191f0864570beb8e8254a792075e4d25a34c2579d331cf1508500071de1c57: Status 404 returned error can't find the container with id 54191f0864570beb8e8254a792075e4d25a34c2579d331cf1508500071de1c57 Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.404010 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29525760-xljdh"] Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.418066 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27"] Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.542255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-xljdh" event={"ID":"356e6991-3883-4124-a24b-7d662e50b107","Type":"ContainerStarted","Data":"9cb456410181fcb6344975bc7fc8b4ca219b684b61114a1f6f2df7c8d971ec43"} Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.543506 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" event={"ID":"92423829-83ed-4681-8053-35aa047de5dd","Type":"ContainerStarted","Data":"756e10521444dd8a89eca431bfd497dcf18594d2bd8f29658d9bebab5c1a6b4f"} Feb 20 
00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.543531 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" event={"ID":"92423829-83ed-4681-8053-35aa047de5dd","Type":"ContainerStarted","Data":"1547c33c0777a284ca28561141287cf993b0711dafd07059e4e7ec707b152f82"} Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.544304 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" event={"ID":"75cc9bca-2c37-491f-8611-eb3a0c698f1b","Type":"ContainerStarted","Data":"50b4aa847540eb5dc96e66498eb1d57af08a01c0a7ec9ffff9a0784a0ca89568"} Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.545447 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" event={"ID":"b7ea1045-2fa9-4fde-b5c8-73704c77d222","Type":"ContainerStarted","Data":"7da30c0923ab8c454418b686a49e278d56194bc946b7e724eadd48b5bd0244ed"} Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.545470 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" event={"ID":"b7ea1045-2fa9-4fde-b5c8-73704c77d222","Type":"ContainerStarted","Data":"54191f0864570beb8e8254a792075e4d25a34c2579d331cf1508500071de1c57"} Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.576349 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" podStartSLOduration=1.5763266599999999 podStartE2EDuration="1.57632666s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.55913046 +0000 UTC m=+9101.830572920" watchObservedRunningTime="2026-02-20 00:00:01.57632666 +0000 UTC m=+9101.847769140" Feb 20 00:00:01 crc kubenswrapper[4771]: I0220 00:00:01.593686 4771 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" podStartSLOduration=1.593664655 podStartE2EDuration="1.593664655s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.573779263 +0000 UTC m=+9101.845221733" watchObservedRunningTime="2026-02-20 00:00:01.593664655 +0000 UTC m=+9101.865107135" Feb 20 00:00:02 crc kubenswrapper[4771]: I0220 00:00:02.558556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-xljdh" event={"ID":"356e6991-3883-4124-a24b-7d662e50b107","Type":"ContainerStarted","Data":"2f23eb4d400f5ce9c3513827fcd2bc953fbff0a52c63d5cf64c13a878e6e4525"} Feb 20 00:00:02 crc kubenswrapper[4771]: I0220 00:00:02.563061 4771 generic.go:334] "Generic (PLEG): container finished" podID="75cc9bca-2c37-491f-8611-eb3a0c698f1b" containerID="6e705c3b6f4a2000b4831480bc4cf4e89ecee7081957019b61fbd3690cae4ff3" exitCode=0 Feb 20 00:00:02 crc kubenswrapper[4771]: I0220 00:00:02.564355 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" event={"ID":"75cc9bca-2c37-491f-8611-eb3a0c698f1b","Type":"ContainerDied","Data":"6e705c3b6f4a2000b4831480bc4cf4e89ecee7081957019b61fbd3690cae4ff3"} Feb 20 00:00:02 crc kubenswrapper[4771]: I0220 00:00:02.576756 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29525760-xljdh" podStartSLOduration=2.5767394 podStartE2EDuration="2.5767394s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:02.575849316 +0000 UTC m=+9102.847291806" watchObservedRunningTime="2026-02-20 00:00:02.5767394 +0000 UTC m=+9102.848181880" Feb 20 00:00:03 crc 
kubenswrapper[4771]: I0220 00:00:03.576509 4771 generic.go:334] "Generic (PLEG): container finished" podID="356e6991-3883-4124-a24b-7d662e50b107" containerID="2f23eb4d400f5ce9c3513827fcd2bc953fbff0a52c63d5cf64c13a878e6e4525" exitCode=0 Feb 20 00:00:03 crc kubenswrapper[4771]: I0220 00:00:03.578087 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-xljdh" event={"ID":"356e6991-3883-4124-a24b-7d662e50b107","Type":"ContainerDied","Data":"2f23eb4d400f5ce9c3513827fcd2bc953fbff0a52c63d5cf64c13a878e6e4525"} Feb 20 00:00:03 crc kubenswrapper[4771]: I0220 00:00:03.944093 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.054381 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75cc9bca-2c37-491f-8611-eb3a0c698f1b-secret-volume\") pod \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.054429 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75cc9bca-2c37-491f-8611-eb3a0c698f1b-config-volume\") pod \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.054471 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxgqb\" (UniqueName: \"kubernetes.io/projected/75cc9bca-2c37-491f-8611-eb3a0c698f1b-kube-api-access-nxgqb\") pod \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\" (UID: \"75cc9bca-2c37-491f-8611-eb3a0c698f1b\") " Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.055327 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/75cc9bca-2c37-491f-8611-eb3a0c698f1b-config-volume" (OuterVolumeSpecName: "config-volume") pod "75cc9bca-2c37-491f-8611-eb3a0c698f1b" (UID: "75cc9bca-2c37-491f-8611-eb3a0c698f1b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.072207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75cc9bca-2c37-491f-8611-eb3a0c698f1b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75cc9bca-2c37-491f-8611-eb3a0c698f1b" (UID: "75cc9bca-2c37-491f-8611-eb3a0c698f1b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.072253 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75cc9bca-2c37-491f-8611-eb3a0c698f1b-kube-api-access-nxgqb" (OuterVolumeSpecName: "kube-api-access-nxgqb") pod "75cc9bca-2c37-491f-8611-eb3a0c698f1b" (UID: "75cc9bca-2c37-491f-8611-eb3a0c698f1b"). InnerVolumeSpecName "kube-api-access-nxgqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.156770 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75cc9bca-2c37-491f-8611-eb3a0c698f1b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.157172 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75cc9bca-2c37-491f-8611-eb3a0c698f1b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.157263 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxgqb\" (UniqueName: \"kubernetes.io/projected/75cc9bca-2c37-491f-8611-eb3a0c698f1b-kube-api-access-nxgqb\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.591842 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" event={"ID":"75cc9bca-2c37-491f-8611-eb3a0c698f1b","Type":"ContainerDied","Data":"50b4aa847540eb5dc96e66498eb1d57af08a01c0a7ec9ffff9a0784a0ca89568"} Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.592831 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50b4aa847540eb5dc96e66498eb1d57af08a01c0a7ec9ffff9a0784a0ca89568" Feb 20 00:00:04 crc kubenswrapper[4771]: I0220 00:00:04.591904 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-cjn27" Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.045162 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq"] Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.059883 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-rc6cq"] Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.090811 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.179435 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk8bn\" (UniqueName: \"kubernetes.io/projected/356e6991-3883-4124-a24b-7d662e50b107-kube-api-access-hk8bn\") pod \"356e6991-3883-4124-a24b-7d662e50b107\" (UID: \"356e6991-3883-4124-a24b-7d662e50b107\") " Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.180315 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/356e6991-3883-4124-a24b-7d662e50b107-serviceca\") pod \"356e6991-3883-4124-a24b-7d662e50b107\" (UID: \"356e6991-3883-4124-a24b-7d662e50b107\") " Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.181846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/356e6991-3883-4124-a24b-7d662e50b107-serviceca" (OuterVolumeSpecName: "serviceca") pod "356e6991-3883-4124-a24b-7d662e50b107" (UID: "356e6991-3883-4124-a24b-7d662e50b107"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.187355 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356e6991-3883-4124-a24b-7d662e50b107-kube-api-access-hk8bn" (OuterVolumeSpecName: "kube-api-access-hk8bn") pod "356e6991-3883-4124-a24b-7d662e50b107" (UID: "356e6991-3883-4124-a24b-7d662e50b107"). InnerVolumeSpecName "kube-api-access-hk8bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.283704 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk8bn\" (UniqueName: \"kubernetes.io/projected/356e6991-3883-4124-a24b-7d662e50b107-kube-api-access-hk8bn\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.284009 4771 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/356e6991-3883-4124-a24b-7d662e50b107-serviceca\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.622196 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-xljdh" event={"ID":"356e6991-3883-4124-a24b-7d662e50b107","Type":"ContainerDied","Data":"9cb456410181fcb6344975bc7fc8b4ca219b684b61114a1f6f2df7c8d971ec43"} Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.622243 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb456410181fcb6344975bc7fc8b4ca219b684b61114a1f6f2df7c8d971ec43" Feb 20 00:00:05 crc kubenswrapper[4771]: I0220 00:00:05.623914 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-xljdh" Feb 20 00:00:06 crc kubenswrapper[4771]: I0220 00:00:06.460721 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba" path="/var/lib/kubelet/pods/00a4fd33-3aaf-4ae3-9fe1-8a020bfef7ba/volumes" Feb 20 00:00:06 crc kubenswrapper[4771]: I0220 00:00:06.661677 4771 generic.go:334] "Generic (PLEG): container finished" podID="92423829-83ed-4681-8053-35aa047de5dd" containerID="756e10521444dd8a89eca431bfd497dcf18594d2bd8f29658d9bebab5c1a6b4f" exitCode=0 Feb 20 00:00:06 crc kubenswrapper[4771]: I0220 00:00:06.661735 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" event={"ID":"92423829-83ed-4681-8053-35aa047de5dd","Type":"ContainerDied","Data":"756e10521444dd8a89eca431bfd497dcf18594d2bd8f29658d9bebab5c1a6b4f"} Feb 20 00:00:07 crc kubenswrapper[4771]: I0220 00:00:07.678198 4771 generic.go:334] "Generic (PLEG): container finished" podID="b7ea1045-2fa9-4fde-b5c8-73704c77d222" containerID="7da30c0923ab8c454418b686a49e278d56194bc946b7e724eadd48b5bd0244ed" exitCode=0 Feb 20 00:00:07 crc kubenswrapper[4771]: I0220 00:00:07.678323 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" event={"ID":"b7ea1045-2fa9-4fde-b5c8-73704c77d222","Type":"ContainerDied","Data":"7da30c0923ab8c454418b686a49e278d56194bc946b7e724eadd48b5bd0244ed"} Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.121480 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.262004 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-scripts\") pod \"92423829-83ed-4681-8053-35aa047de5dd\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.262108 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-combined-ca-bundle\") pod \"92423829-83ed-4681-8053-35aa047de5dd\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.262426 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4qh5\" (UniqueName: \"kubernetes.io/projected/92423829-83ed-4681-8053-35aa047de5dd-kube-api-access-v4qh5\") pod \"92423829-83ed-4681-8053-35aa047de5dd\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.262494 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-config-data\") pod \"92423829-83ed-4681-8053-35aa047de5dd\" (UID: \"92423829-83ed-4681-8053-35aa047de5dd\") " Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.270588 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92423829-83ed-4681-8053-35aa047de5dd-kube-api-access-v4qh5" (OuterVolumeSpecName: "kube-api-access-v4qh5") pod "92423829-83ed-4681-8053-35aa047de5dd" (UID: "92423829-83ed-4681-8053-35aa047de5dd"). InnerVolumeSpecName "kube-api-access-v4qh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.277841 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-scripts" (OuterVolumeSpecName: "scripts") pod "92423829-83ed-4681-8053-35aa047de5dd" (UID: "92423829-83ed-4681-8053-35aa047de5dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.313183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-config-data" (OuterVolumeSpecName: "config-data") pod "92423829-83ed-4681-8053-35aa047de5dd" (UID: "92423829-83ed-4681-8053-35aa047de5dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.315324 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92423829-83ed-4681-8053-35aa047de5dd" (UID: "92423829-83ed-4681-8053-35aa047de5dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.364843 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4qh5\" (UniqueName: \"kubernetes.io/projected/92423829-83ed-4681-8053-35aa047de5dd-kube-api-access-v4qh5\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.365204 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.365213 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.365227 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92423829-83ed-4681-8053-35aa047de5dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.710438 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.711716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-xhvxn" event={"ID":"92423829-83ed-4681-8053-35aa047de5dd","Type":"ContainerDied","Data":"1547c33c0777a284ca28561141287cf993b0711dafd07059e4e7ec707b152f82"} Feb 20 00:00:08 crc kubenswrapper[4771]: I0220 00:00:08.711756 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1547c33c0777a284ca28561141287cf993b0711dafd07059e4e7ec707b152f82" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.079550 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.189592 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmg2p\" (UniqueName: \"kubernetes.io/projected/b7ea1045-2fa9-4fde-b5c8-73704c77d222-kube-api-access-lmg2p\") pod \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.189721 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-config-data\") pod \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.189861 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-combined-ca-bundle\") pod \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.189976 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-scripts\") pod \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\" (UID: \"b7ea1045-2fa9-4fde-b5c8-73704c77d222\") " Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.198219 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-scripts" (OuterVolumeSpecName: "scripts") pod "b7ea1045-2fa9-4fde-b5c8-73704c77d222" (UID: "b7ea1045-2fa9-4fde-b5c8-73704c77d222"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.200046 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ea1045-2fa9-4fde-b5c8-73704c77d222-kube-api-access-lmg2p" (OuterVolumeSpecName: "kube-api-access-lmg2p") pod "b7ea1045-2fa9-4fde-b5c8-73704c77d222" (UID: "b7ea1045-2fa9-4fde-b5c8-73704c77d222"). InnerVolumeSpecName "kube-api-access-lmg2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.226749 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7ea1045-2fa9-4fde-b5c8-73704c77d222" (UID: "b7ea1045-2fa9-4fde-b5c8-73704c77d222"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.264045 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-config-data" (OuterVolumeSpecName: "config-data") pod "b7ea1045-2fa9-4fde-b5c8-73704c77d222" (UID: "b7ea1045-2fa9-4fde-b5c8-73704c77d222"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.292983 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.293039 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.293065 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmg2p\" (UniqueName: \"kubernetes.io/projected/b7ea1045-2fa9-4fde-b5c8-73704c77d222-kube-api-access-lmg2p\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.293078 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ea1045-2fa9-4fde-b5c8-73704c77d222-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.724303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" event={"ID":"b7ea1045-2fa9-4fde-b5c8-73704c77d222","Type":"ContainerDied","Data":"54191f0864570beb8e8254a792075e4d25a34c2579d331cf1508500071de1c57"} Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.724371 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54191f0864570beb8e8254a792075e4d25a34c2579d331cf1508500071de1c57" Feb 20 00:00:09 crc kubenswrapper[4771]: I0220 00:00:09.724470 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-wg5bq" Feb 20 00:00:12 crc kubenswrapper[4771]: I0220 00:00:12.438335 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 20 00:00:12 crc kubenswrapper[4771]: E0220 00:00:12.440835 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:00:14 crc kubenswrapper[4771]: E0220 00:00:14.099523 4771 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.248:43696->38.102.83.248:36635: read tcp 38.102.83.248:43696->38.102.83.248:36635: read: connection reset by peer Feb 20 00:00:26 crc kubenswrapper[4771]: I0220 00:00:26.437954 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 20 00:00:26 crc kubenswrapper[4771]: E0220 00:00:26.438792 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:00:40 crc kubenswrapper[4771]: I0220 00:00:40.446628 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 20 00:00:40 crc kubenswrapper[4771]: E0220 00:00:40.447869 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:00:47 crc kubenswrapper[4771]: I0220 00:00:47.282276 4771 scope.go:117] "RemoveContainer" containerID="abaca83184754c69711f5832c292163c8d6f2070dd9563df084f01ba1189de58" Feb 20 00:00:53 crc kubenswrapper[4771]: I0220 00:00:53.437587 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 20 00:00:53 crc kubenswrapper[4771]: E0220 00:00:53.438721 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.166214 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-purge-29525761-t9hpm"] Feb 20 00:01:00 crc kubenswrapper[4771]: E0220 00:01:00.167767 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92423829-83ed-4681-8053-35aa047de5dd" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.167800 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="92423829-83ed-4681-8053-35aa047de5dd" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4771]: E0220 00:01:00.167835 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cc9bca-2c37-491f-8611-eb3a0c698f1b" containerName="collect-profiles" Feb 20 00:01:00 crc 
kubenswrapper[4771]: I0220 00:01:00.167848 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cc9bca-2c37-491f-8611-eb3a0c698f1b" containerName="collect-profiles" Feb 20 00:01:00 crc kubenswrapper[4771]: E0220 00:01:00.167893 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356e6991-3883-4124-a24b-7d662e50b107" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.167906 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="356e6991-3883-4124-a24b-7d662e50b107" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4771]: E0220 00:01:00.167925 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ea1045-2fa9-4fde-b5c8-73704c77d222" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.167937 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ea1045-2fa9-4fde-b5c8-73704c77d222" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.168323 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="92423829-83ed-4681-8053-35aa047de5dd" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.168356 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7ea1045-2fa9-4fde-b5c8-73704c77d222" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.168378 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="356e6991-3883-4124-a24b-7d662e50b107" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.168398 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="75cc9bca-2c37-491f-8611-eb3a0c698f1b" containerName="collect-profiles" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.169863 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.180743 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-purge-29525761-n7dz6"] Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.183552 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.193935 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-purge-29525761-tmxpz"] Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.197438 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.200874 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.207512 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525761-d2rl4"] Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.210255 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.225160 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-purge-29525761-t9hpm"] Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.232156 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29525761-tmxpz"] Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.254295 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29525761-n7dz6"] Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.265183 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525761-d2rl4"] Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.294463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-combined-ca-bundle\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.294511 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnpkf\" (UniqueName: \"kubernetes.io/projected/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-kube-api-access-xnpkf\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.294577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-db-purge-config-data\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 
00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.294603 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-config-data\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.294664 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-combined-ca-bundle\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.294684 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-db-purge-config-data\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.294704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbjj9\" (UniqueName: \"kubernetes.io/projected/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-kube-api-access-pbjj9\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.294750 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-config-data\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: 
\"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.396596 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-combined-ca-bundle\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.396658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-db-purge-config-data\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.396693 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbgt4\" (UniqueName: \"kubernetes.io/projected/635e94aa-984b-4340-b733-898071f7a59c-kube-api-access-zbgt4\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.396724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbjj9\" (UniqueName: \"kubernetes.io/projected/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-kube-api-access-pbjj9\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.396863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf2wz\" (UniqueName: 
\"kubernetes.io/projected/869aec74-18fd-4c1d-a474-8408353d1208-kube-api-access-wf2wz\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.396944 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-config-data\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-combined-ca-bundle\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397147 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-db-purge-config-data\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397179 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnpkf\" (UniqueName: \"kubernetes.io/projected/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-kube-api-access-xnpkf\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397265 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-config-data\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397424 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-db-purge-config-data\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397486 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-config-data\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-fernet-keys\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397562 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-config-data\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397627 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-combined-ca-bundle\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.397687 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-combined-ca-bundle\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.401950 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-combined-ca-bundle\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.402848 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-db-purge-config-data\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.403760 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-config-data\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.403945 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-config-data\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.404054 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-db-purge-config-data\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.404127 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-combined-ca-bundle\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.415535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnpkf\" (UniqueName: \"kubernetes.io/projected/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-kube-api-access-xnpkf\") pod \"cinder-db-purge-29525761-n7dz6\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") " pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.416342 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbjj9\" (UniqueName: \"kubernetes.io/projected/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-kube-api-access-pbjj9\") pod \"heat-db-purge-29525761-t9hpm\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.499879 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf2wz\" (UniqueName: 
\"kubernetes.io/projected/869aec74-18fd-4c1d-a474-8408353d1208-kube-api-access-wf2wz\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.500430 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-db-purge-config-data\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.500591 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-config-data\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.500750 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-fernet-keys\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.500836 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-config-data\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.500968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-combined-ca-bundle\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.501153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-combined-ca-bundle\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.501337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbgt4\" (UniqueName: \"kubernetes.io/projected/635e94aa-984b-4340-b733-898071f7a59c-kube-api-access-zbgt4\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.500481 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.504822 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-combined-ca-bundle\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.505239 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-combined-ca-bundle\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.505381 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-config-data\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.505538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-config-data\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.506595 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-fernet-keys\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc 
kubenswrapper[4771]: I0220 00:01:00.506749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-db-purge-config-data\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.529650 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbgt4\" (UniqueName: \"kubernetes.io/projected/635e94aa-984b-4340-b733-898071f7a59c-kube-api-access-zbgt4\") pod \"keystone-cron-29525761-d2rl4\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.530417 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf2wz\" (UniqueName: \"kubernetes.io/projected/869aec74-18fd-4c1d-a474-8408353d1208-kube-api-access-wf2wz\") pod \"glance-db-purge-29525761-tmxpz\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.539965 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29525761-n7dz6" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.557518 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.575395 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:00 crc kubenswrapper[4771]: I0220 00:01:00.984723 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-purge-29525761-t9hpm"] Feb 20 00:01:01 crc kubenswrapper[4771]: W0220 00:01:01.071899 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cf7e327_cc23_4ebf_a435_b22c21a1d4ec.slice/crio-930c6207e7595153d3a5c4bd47a7c1ec300c2e9ebb4d85d0754f99b43d4a2db1 WatchSource:0}: Error finding container 930c6207e7595153d3a5c4bd47a7c1ec300c2e9ebb4d85d0754f99b43d4a2db1: Status 404 returned error can't find the container with id 930c6207e7595153d3a5c4bd47a7c1ec300c2e9ebb4d85d0754f99b43d4a2db1 Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.074070 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29525761-n7dz6"] Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.153770 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29525761-tmxpz"] Feb 20 00:01:01 crc kubenswrapper[4771]: W0220 00:01:01.160747 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod635e94aa_984b_4340_b733_898071f7a59c.slice/crio-196a104ec66e0d01b68a72cb88a72105d0f9e04b5ffbebf6085294f6f7ee8119 WatchSource:0}: Error finding container 196a104ec66e0d01b68a72cb88a72105d0f9e04b5ffbebf6085294f6f7ee8119: Status 404 returned error can't find the container with id 196a104ec66e0d01b68a72cb88a72105d0f9e04b5ffbebf6085294f6f7ee8119 Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.166692 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525761-d2rl4"] Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.381908 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-tmxpz" 
event={"ID":"869aec74-18fd-4c1d-a474-8408353d1208","Type":"ContainerStarted","Data":"7a8b4b0365edb72e796bcd5023c968aad04245cffba3e300a650e69cf4c65381"} Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.383161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d2rl4" event={"ID":"635e94aa-984b-4340-b733-898071f7a59c","Type":"ContainerStarted","Data":"50e3fe1f67fac8fff1e9fc0bda871ece55ff3653ca7718ebfb801bc33c12fa78"} Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.383195 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d2rl4" event={"ID":"635e94aa-984b-4340-b733-898071f7a59c","Type":"ContainerStarted","Data":"196a104ec66e0d01b68a72cb88a72105d0f9e04b5ffbebf6085294f6f7ee8119"} Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.386531 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-n7dz6" event={"ID":"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec","Type":"ContainerStarted","Data":"930c6207e7595153d3a5c4bd47a7c1ec300c2e9ebb4d85d0754f99b43d4a2db1"} Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.388256 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-t9hpm" event={"ID":"7e3d4f2d-0556-4a9d-8f84-e62136a71eac","Type":"ContainerStarted","Data":"870d72bbd93672def877cb591c39ee40620ac831d70a3f56bf42ed80f2549839"} Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.388296 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-t9hpm" event={"ID":"7e3d4f2d-0556-4a9d-8f84-e62136a71eac","Type":"ContainerStarted","Data":"8dff5e47c9441c39b3d419bc10254cfef33bc90f847f04461979947952a98a84"} Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.400440 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525761-d2rl4" podStartSLOduration=1.400419038 podStartE2EDuration="1.400419038s" 
podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:01.398566928 +0000 UTC m=+9161.670009418" watchObservedRunningTime="2026-02-20 00:01:01.400419038 +0000 UTC m=+9161.671861508" Feb 20 00:01:01 crc kubenswrapper[4771]: I0220 00:01:01.429177 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-purge-29525761-t9hpm" podStartSLOduration=1.429160607 podStartE2EDuration="1.429160607s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:01.422331444 +0000 UTC m=+9161.693773934" watchObservedRunningTime="2026-02-20 00:01:01.429160607 +0000 UTC m=+9161.700603077" Feb 20 00:01:02 crc kubenswrapper[4771]: I0220 00:01:02.398842 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-n7dz6" event={"ID":"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec","Type":"ContainerStarted","Data":"728ac3c348faf0fa71cdc5b9b189f64d6696fa0e96267327ab4cd2075646af50"} Feb 20 00:01:02 crc kubenswrapper[4771]: I0220 00:01:02.401503 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-tmxpz" event={"ID":"869aec74-18fd-4c1d-a474-8408353d1208","Type":"ContainerStarted","Data":"e93433a8e42b26c3bc9ec2a2d10a1a663c04d6cd447d4b775d492fd3e342ee99"} Feb 20 00:01:02 crc kubenswrapper[4771]: I0220 00:01:02.422880 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-purge-29525761-n7dz6" podStartSLOduration=2.422857627 podStartE2EDuration="2.422857627s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:02.421366207 +0000 UTC 
m=+9162.692808707" watchObservedRunningTime="2026-02-20 00:01:02.422857627 +0000 UTC m=+9162.694300107" Feb 20 00:01:02 crc kubenswrapper[4771]: I0220 00:01:02.440007 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-purge-29525761-tmxpz" podStartSLOduration=2.439992116 podStartE2EDuration="2.439992116s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:02.439297148 +0000 UTC m=+9162.710739638" watchObservedRunningTime="2026-02-20 00:01:02.439992116 +0000 UTC m=+9162.711434576" Feb 20 00:01:04 crc kubenswrapper[4771]: I0220 00:01:04.422894 4771 generic.go:334] "Generic (PLEG): container finished" podID="869aec74-18fd-4c1d-a474-8408353d1208" containerID="e93433a8e42b26c3bc9ec2a2d10a1a663c04d6cd447d4b775d492fd3e342ee99" exitCode=0 Feb 20 00:01:04 crc kubenswrapper[4771]: I0220 00:01:04.422994 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-tmxpz" event={"ID":"869aec74-18fd-4c1d-a474-8408353d1208","Type":"ContainerDied","Data":"e93433a8e42b26c3bc9ec2a2d10a1a663c04d6cd447d4b775d492fd3e342ee99"} Feb 20 00:01:04 crc kubenswrapper[4771]: I0220 00:01:04.425595 4771 generic.go:334] "Generic (PLEG): container finished" podID="635e94aa-984b-4340-b733-898071f7a59c" containerID="50e3fe1f67fac8fff1e9fc0bda871ece55ff3653ca7718ebfb801bc33c12fa78" exitCode=0 Feb 20 00:01:04 crc kubenswrapper[4771]: I0220 00:01:04.425679 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d2rl4" event={"ID":"635e94aa-984b-4340-b733-898071f7a59c","Type":"ContainerDied","Data":"50e3fe1f67fac8fff1e9fc0bda871ece55ff3653ca7718ebfb801bc33c12fa78"} Feb 20 00:01:04 crc kubenswrapper[4771]: I0220 00:01:04.429006 4771 generic.go:334] "Generic (PLEG): container finished" podID="7e3d4f2d-0556-4a9d-8f84-e62136a71eac" 
containerID="870d72bbd93672def877cb591c39ee40620ac831d70a3f56bf42ed80f2549839" exitCode=0 Feb 20 00:01:04 crc kubenswrapper[4771]: I0220 00:01:04.429063 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-t9hpm" event={"ID":"7e3d4f2d-0556-4a9d-8f84-e62136a71eac","Type":"ContainerDied","Data":"870d72bbd93672def877cb591c39ee40620ac831d70a3f56bf42ed80f2549839"} Feb 20 00:01:04 crc kubenswrapper[4771]: I0220 00:01:04.438859 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 20 00:01:04 crc kubenswrapper[4771]: E0220 00:01:04.439334 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.050881 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d2rl4" Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.058374 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-purge-29525761-t9hpm" Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.063261 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-purge-29525761-tmxpz" Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.127849 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbjj9\" (UniqueName: \"kubernetes.io/projected/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-kube-api-access-pbjj9\") pod \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-combined-ca-bundle\") pod \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128227 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-combined-ca-bundle\") pod \"869aec74-18fd-4c1d-a474-8408353d1208\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128253 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-config-data\") pod \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128287 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-db-purge-config-data\") pod \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\" (UID: \"7e3d4f2d-0556-4a9d-8f84-e62136a71eac\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128314 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-wf2wz\" (UniqueName: \"kubernetes.io/projected/869aec74-18fd-4c1d-a474-8408353d1208-kube-api-access-wf2wz\") pod \"869aec74-18fd-4c1d-a474-8408353d1208\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128356 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-config-data\") pod \"635e94aa-984b-4340-b733-898071f7a59c\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128436 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbgt4\" (UniqueName: \"kubernetes.io/projected/635e94aa-984b-4340-b733-898071f7a59c-kube-api-access-zbgt4\") pod \"635e94aa-984b-4340-b733-898071f7a59c\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128486 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-fernet-keys\") pod \"635e94aa-984b-4340-b733-898071f7a59c\" (UID: \"635e94aa-984b-4340-b733-898071f7a59c\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128530 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-config-data\") pod \"869aec74-18fd-4c1d-a474-8408353d1208\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128573 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-combined-ca-bundle\") pod \"635e94aa-984b-4340-b733-898071f7a59c\" (UID: 
\"635e94aa-984b-4340-b733-898071f7a59c\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.128611 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-db-purge-config-data\") pod \"869aec74-18fd-4c1d-a474-8408353d1208\" (UID: \"869aec74-18fd-4c1d-a474-8408353d1208\") " Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.134580 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-kube-api-access-pbjj9" (OuterVolumeSpecName: "kube-api-access-pbjj9") pod "7e3d4f2d-0556-4a9d-8f84-e62136a71eac" (UID: "7e3d4f2d-0556-4a9d-8f84-e62136a71eac"). InnerVolumeSpecName "kube-api-access-pbjj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.134692 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "869aec74-18fd-4c1d-a474-8408353d1208" (UID: "869aec74-18fd-4c1d-a474-8408353d1208"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.136869 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869aec74-18fd-4c1d-a474-8408353d1208-kube-api-access-wf2wz" (OuterVolumeSpecName: "kube-api-access-wf2wz") pod "869aec74-18fd-4c1d-a474-8408353d1208" (UID: "869aec74-18fd-4c1d-a474-8408353d1208"). InnerVolumeSpecName "kube-api-access-wf2wz". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.139450 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "635e94aa-984b-4340-b733-898071f7a59c" (UID: "635e94aa-984b-4340-b733-898071f7a59c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.144834 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "7e3d4f2d-0556-4a9d-8f84-e62136a71eac" (UID: "7e3d4f2d-0556-4a9d-8f84-e62136a71eac"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.148251 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635e94aa-984b-4340-b733-898071f7a59c-kube-api-access-zbgt4" (OuterVolumeSpecName: "kube-api-access-zbgt4") pod "635e94aa-984b-4340-b733-898071f7a59c" (UID: "635e94aa-984b-4340-b733-898071f7a59c"). InnerVolumeSpecName "kube-api-access-zbgt4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.189165 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-config-data" (OuterVolumeSpecName: "config-data") pod "869aec74-18fd-4c1d-a474-8408353d1208" (UID: "869aec74-18fd-4c1d-a474-8408353d1208"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.203760 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-config-data" (OuterVolumeSpecName: "config-data") pod "7e3d4f2d-0556-4a9d-8f84-e62136a71eac" (UID: "7e3d4f2d-0556-4a9d-8f84-e62136a71eac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.219830 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "635e94aa-984b-4340-b733-898071f7a59c" (UID: "635e94aa-984b-4340-b733-898071f7a59c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.224608 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "869aec74-18fd-4c1d-a474-8408353d1208" (UID: "869aec74-18fd-4c1d-a474-8408353d1208"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.225792 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-config-data" (OuterVolumeSpecName: "config-data") pod "635e94aa-984b-4340-b733-898071f7a59c" (UID: "635e94aa-984b-4340-b733-898071f7a59c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.228804 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e3d4f2d-0556-4a9d-8f84-e62136a71eac" (UID: "7e3d4f2d-0556-4a9d-8f84-e62136a71eac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232725 4771 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-db-purge-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232760 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbjj9\" (UniqueName: \"kubernetes.io/projected/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-kube-api-access-pbjj9\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232772 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232781 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232791 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232800 4771 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7e3d4f2d-0556-4a9d-8f84-e62136a71eac-db-purge-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232808 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf2wz\" (UniqueName: \"kubernetes.io/projected/869aec74-18fd-4c1d-a474-8408353d1208-kube-api-access-wf2wz\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232817 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232826 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbgt4\" (UniqueName: \"kubernetes.io/projected/635e94aa-984b-4340-b733-898071f7a59c-kube-api-access-zbgt4\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232834 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232842 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/869aec74-18fd-4c1d-a474-8408353d1208-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.232850 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/635e94aa-984b-4340-b733-898071f7a59c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.452744 4771 generic.go:334] "Generic (PLEG): container finished" podID="7cf7e327-cc23-4ebf-a435-b22c21a1d4ec" containerID="728ac3c348faf0fa71cdc5b9b189f64d6696fa0e96267327ab4cd2075646af50" exitCode=0
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.455339 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-purge-29525761-t9hpm"
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.456577 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-n7dz6" event={"ID":"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec","Type":"ContainerDied","Data":"728ac3c348faf0fa71cdc5b9b189f64d6696fa0e96267327ab4cd2075646af50"}
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.456635 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-t9hpm" event={"ID":"7e3d4f2d-0556-4a9d-8f84-e62136a71eac","Type":"ContainerDied","Data":"8dff5e47c9441c39b3d419bc10254cfef33bc90f847f04461979947952a98a84"}
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.456657 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dff5e47c9441c39b3d419bc10254cfef33bc90f847f04461979947952a98a84"
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.458942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-tmxpz" event={"ID":"869aec74-18fd-4c1d-a474-8408353d1208","Type":"ContainerDied","Data":"7a8b4b0365edb72e796bcd5023c968aad04245cffba3e300a650e69cf4c65381"}
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.458968 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a8b4b0365edb72e796bcd5023c968aad04245cffba3e300a650e69cf4c65381"
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.459116 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-tmxpz"
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.474642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d2rl4" event={"ID":"635e94aa-984b-4340-b733-898071f7a59c","Type":"ContainerDied","Data":"196a104ec66e0d01b68a72cb88a72105d0f9e04b5ffbebf6085294f6f7ee8119"}
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.474691 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="196a104ec66e0d01b68a72cb88a72105d0f9e04b5ffbebf6085294f6f7ee8119"
Feb 20 00:01:06 crc kubenswrapper[4771]: I0220 00:01:06.474756 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d2rl4"
Feb 20 00:01:07 crc kubenswrapper[4771]: I0220 00:01:07.859095 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29525761-n7dz6"
Feb 20 00:01:07 crc kubenswrapper[4771]: I0220 00:01:07.974151 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnpkf\" (UniqueName: \"kubernetes.io/projected/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-kube-api-access-xnpkf\") pod \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") "
Feb 20 00:01:07 crc kubenswrapper[4771]: I0220 00:01:07.974524 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-config-data\") pod \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") "
Feb 20 00:01:07 crc kubenswrapper[4771]: I0220 00:01:07.975534 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-combined-ca-bundle\") pod \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") "
Feb 20 00:01:07 crc kubenswrapper[4771]: I0220 00:01:07.975725 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-db-purge-config-data\") pod \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\" (UID: \"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec\") "
Feb 20 00:01:07 crc kubenswrapper[4771]: I0220 00:01:07.983647 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "7cf7e327-cc23-4ebf-a435-b22c21a1d4ec" (UID: "7cf7e327-cc23-4ebf-a435-b22c21a1d4ec"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:07 crc kubenswrapper[4771]: I0220 00:01:07.984197 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-kube-api-access-xnpkf" (OuterVolumeSpecName: "kube-api-access-xnpkf") pod "7cf7e327-cc23-4ebf-a435-b22c21a1d4ec" (UID: "7cf7e327-cc23-4ebf-a435-b22c21a1d4ec"). InnerVolumeSpecName "kube-api-access-xnpkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.004014 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-config-data" (OuterVolumeSpecName: "config-data") pod "7cf7e327-cc23-4ebf-a435-b22c21a1d4ec" (UID: "7cf7e327-cc23-4ebf-a435-b22c21a1d4ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.030320 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cf7e327-cc23-4ebf-a435-b22c21a1d4ec" (UID: "7cf7e327-cc23-4ebf-a435-b22c21a1d4ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.078427 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnpkf\" (UniqueName: \"kubernetes.io/projected/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-kube-api-access-xnpkf\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.078462 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.078474 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.078483 4771 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/7cf7e327-cc23-4ebf-a435-b22c21a1d4ec-db-purge-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.492631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-n7dz6" event={"ID":"7cf7e327-cc23-4ebf-a435-b22c21a1d4ec","Type":"ContainerDied","Data":"930c6207e7595153d3a5c4bd47a7c1ec300c2e9ebb4d85d0754f99b43d4a2db1"}
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.492677 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="930c6207e7595153d3a5c4bd47a7c1ec300c2e9ebb4d85d0754f99b43d4a2db1"
Feb 20 00:01:08 crc kubenswrapper[4771]: I0220 00:01:08.492749 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29525761-n7dz6"
Feb 20 00:01:18 crc kubenswrapper[4771]: I0220 00:01:18.438004 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11"
Feb 20 00:01:18 crc kubenswrapper[4771]: E0220 00:01:18.438803 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:01:29 crc kubenswrapper[4771]: I0220 00:01:29.437319 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11"
Feb 20 00:01:29 crc kubenswrapper[4771]: E0220 00:01:29.438273 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:01:44 crc kubenswrapper[4771]: I0220 00:01:44.445301 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11"
Feb 20 00:01:44 crc kubenswrapper[4771]: I0220 00:01:44.976653 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"4c90920aab93ae317c9d5b4e3c1351eb635c27259dd0f33f09bda1bf6a83023f"}
Feb 20 00:01:56 crc kubenswrapper[4771]: I0220 00:01:56.265141 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="88c6e800-2233-40b4-aab1-b4a7ac0fbb13" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Feb 20 00:01:59 crc kubenswrapper[4771]: I0220 00:01:59.699227 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" podUID="1f38fe26-3d54-4e26-bd41-5bf84e7e98fb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:01:59 crc kubenswrapper[4771]: I0220 00:01:59.699552 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" podUID="1f38fe26-3d54-4e26-bd41-5bf84e7e98fb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:02:01 crc kubenswrapper[4771]: I0220 00:02:01.260965 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="88c6e800-2233-40b4-aab1-b4a7ac0fbb13" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Feb 20 00:02:06 crc kubenswrapper[4771]: I0220 00:02:06.265439 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="88c6e800-2233-40b4-aab1-b4a7ac0fbb13" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Feb 20 00:02:06 crc kubenswrapper[4771]: I0220 00:02:06.265751 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="88c6e800-2233-40b4-aab1-b4a7ac0fbb13" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out"
Feb 20 00:02:06 crc kubenswrapper[4771]: I0220 00:02:06.265850 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Feb 20 00:02:06 crc kubenswrapper[4771]: I0220 00:02:06.266767 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"79a1777560da1332564b4b7579762350d46812ab3ead629a1ea3cb83abed0427"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted"
Feb 20 00:02:06 crc kubenswrapper[4771]: I0220 00:02:06.266884 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88c6e800-2233-40b4-aab1-b4a7ac0fbb13" containerName="ceilometer-central-agent" containerID="cri-o://79a1777560da1332564b4b7579762350d46812ab3ead629a1ea3cb83abed0427" gracePeriod=30
Feb 20 00:02:09 crc kubenswrapper[4771]: I0220 00:02:09.658241 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" podUID="1f38fe26-3d54-4e26-bd41-5bf84e7e98fb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:02:13 crc kubenswrapper[4771]: I0220 00:02:13.278690 4771 patch_prober.go:28] interesting pod/console-operator-58897d9998-5tqd2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 20 00:02:13 crc kubenswrapper[4771]: I0220 00:02:13.278746 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" podUID="aff4fff0-509c-411a-b055-595fc81f61c3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:02:13 crc kubenswrapper[4771]: I0220 00:02:13.278890 4771 patch_prober.go:28] interesting pod/console-operator-58897d9998-5tqd2 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 20 00:02:13 crc kubenswrapper[4771]: I0220 00:02:13.278914 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" podUID="aff4fff0-509c-411a-b055-595fc81f61c3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:02:13 crc kubenswrapper[4771]: I0220 00:02:13.803328 4771 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qddxl container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 20 00:02:13 crc kubenswrapper[4771]: I0220 00:02:13.803626 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qddxl" podUID="44818bf7-6be3-4f7f-97c2-0920e255bbba" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:02:23 crc kubenswrapper[4771]: I0220 00:02:23.278003 4771 patch_prober.go:28] interesting pod/console-operator-58897d9998-5tqd2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 20 00:02:23 crc kubenswrapper[4771]: I0220 00:02:23.278619 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" podUID="aff4fff0-509c-411a-b055-595fc81f61c3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:02:23 crc kubenswrapper[4771]: I0220 00:02:23.278066 4771 patch_prober.go:28] interesting pod/console-operator-58897d9998-5tqd2 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 20 00:02:23 crc kubenswrapper[4771]: I0220 00:02:23.278673 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-5tqd2" podUID="aff4fff0-509c-411a-b055-595fc81f61c3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:02:27 crc kubenswrapper[4771]: I0220 00:02:27.556227 4771 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-286qc container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 20 00:02:27 crc kubenswrapper[4771]: I0220 00:02:27.556762 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-286qc" podUID="6fa7e479-a1ec-4aca-8172-2ce281048f4a" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 20 00:02:30 crc kubenswrapper[4771]: E0220 00:02:30.942832 4771 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="22.506s"
Feb 20 00:02:31 crc kubenswrapper[4771]: I0220 00:02:31.829135 4771 generic.go:334] "Generic (PLEG): container finished" podID="88c6e800-2233-40b4-aab1-b4a7ac0fbb13" containerID="79a1777560da1332564b4b7579762350d46812ab3ead629a1ea3cb83abed0427" exitCode=0
Feb 20 00:02:31 crc kubenswrapper[4771]: I0220 00:02:31.829253 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88c6e800-2233-40b4-aab1-b4a7ac0fbb13","Type":"ContainerDied","Data":"79a1777560da1332564b4b7579762350d46812ab3ead629a1ea3cb83abed0427"}
Feb 20 00:02:31 crc kubenswrapper[4771]: I0220 00:02:31.832211 4771 generic.go:334] "Generic (PLEG): container finished" podID="1f38fe26-3d54-4e26-bd41-5bf84e7e98fb" containerID="5db1e69b7d080e28a907eacd49ae2a3e7bdc6d78542eae340069e5865134b7af" exitCode=1
Feb 20 00:02:31 crc kubenswrapper[4771]: I0220 00:02:31.832250 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" event={"ID":"1f38fe26-3d54-4e26-bd41-5bf84e7e98fb","Type":"ContainerDied","Data":"5db1e69b7d080e28a907eacd49ae2a3e7bdc6d78542eae340069e5865134b7af"}
Feb 20 00:02:31 crc kubenswrapper[4771]: I0220 00:02:31.833002 4771 scope.go:117] "RemoveContainer" containerID="5db1e69b7d080e28a907eacd49ae2a3e7bdc6d78542eae340069e5865134b7af"
Feb 20 00:02:32 crc kubenswrapper[4771]: I0220 00:02:32.845242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk" event={"ID":"1f38fe26-3d54-4e26-bd41-5bf84e7e98fb","Type":"ContainerStarted","Data":"2e305cbfec2bd3f6682a3f3ffd65492317c857961e00057f1fade33470bf70e2"}
Feb 20 00:02:32 crc kubenswrapper[4771]: I0220 00:02:32.845942 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk"
Feb 20 00:02:33 crc kubenswrapper[4771]: I0220 00:02:33.862670 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88c6e800-2233-40b4-aab1-b4a7ac0fbb13","Type":"ContainerStarted","Data":"7058702267abb36b08908831642371c5efed443bd65679d5795f1675f56458c1"}
Feb 20 00:02:38 crc kubenswrapper[4771]: I0220 00:02:38.617479 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-62hbk"
Feb 20 00:02:59 crc kubenswrapper[4771]: I0220 00:02:59.537780 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 20 00:02:59 crc kubenswrapper[4771]: I0220 00:02:59.540443 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="cefa8dc6-536d-442a-a4ea-b003328c2d78" containerName="adoption" containerID="cri-o://522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea" gracePeriod=30
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.267939 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.405643 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k64nv\" (UniqueName: \"kubernetes.io/projected/cefa8dc6-536d-442a-a4ea-b003328c2d78-kube-api-access-k64nv\") pod \"cefa8dc6-536d-442a-a4ea-b003328c2d78\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") "
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.408119 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a\") pod \"cefa8dc6-536d-442a-a4ea-b003328c2d78\" (UID: \"cefa8dc6-536d-442a-a4ea-b003328c2d78\") "
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.415374 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefa8dc6-536d-442a-a4ea-b003328c2d78-kube-api-access-k64nv" (OuterVolumeSpecName: "kube-api-access-k64nv") pod "cefa8dc6-536d-442a-a4ea-b003328c2d78" (UID: "cefa8dc6-536d-442a-a4ea-b003328c2d78"). InnerVolumeSpecName "kube-api-access-k64nv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.432974 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a" (OuterVolumeSpecName: "mariadb-data") pod "cefa8dc6-536d-442a-a4ea-b003328c2d78" (UID: "cefa8dc6-536d-442a-a4ea-b003328c2d78"). InnerVolumeSpecName "pvc-8774242e-6773-4472-85a3-4f778be5386a". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.512513 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k64nv\" (UniqueName: \"kubernetes.io/projected/cefa8dc6-536d-442a-a4ea-b003328c2d78-kube-api-access-k64nv\") on node \"crc\" DevicePath \"\""
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.512585 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8774242e-6773-4472-85a3-4f778be5386a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a\") on node \"crc\" "
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.516099 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.517393 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"cefa8dc6-536d-442a-a4ea-b003328c2d78","Type":"ContainerDied","Data":"522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea"}
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.517489 4771 scope.go:117] "RemoveContainer" containerID="522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea"
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.518404 4771 generic.go:334] "Generic (PLEG): container finished" podID="cefa8dc6-536d-442a-a4ea-b003328c2d78" containerID="522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea" exitCode=137
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.518500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"cefa8dc6-536d-442a-a4ea-b003328c2d78","Type":"ContainerDied","Data":"31f5396eff18e48b74aeddb6722a55e01fa8db6526de54ebebacaee4ebb3984c"}
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.553361 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.566778 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.574201 4771 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.574371 4771 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8774242e-6773-4472-85a3-4f778be5386a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a") on node "crc"
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.580620 4771 scope.go:117] "RemoveContainer" containerID="522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea"
Feb 20 00:03:30 crc kubenswrapper[4771]: E0220 00:03:30.581178 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea\": container with ID starting with 522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea not found: ID does not exist" containerID="522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea"
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.581213 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea"} err="failed to get container status \"522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea\": rpc error: code = NotFound desc = could not find container \"522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea\": container with ID starting with 522e08065e2383c7324f7d2b5715059e44fbe55dfbf41c2d6262ff3f1c05ceea not found: ID does not exist"
Feb 20 00:03:30 crc kubenswrapper[4771]: I0220 00:03:30.614644 4771 reconciler_common.go:293] "Volume detached for volume \"pvc-8774242e-6773-4472-85a3-4f778be5386a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8774242e-6773-4472-85a3-4f778be5386a\") on node \"crc\" DevicePath \"\""
Feb 20 00:03:31 crc kubenswrapper[4771]: I0220 00:03:31.162066 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Feb 20 00:03:31 crc kubenswrapper[4771]: I0220 00:03:31.162614 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="0855ef56-7675-40fe-81bb-26a511a7d0ff" containerName="adoption" containerID="cri-o://aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333" gracePeriod=30
Feb 20 00:03:32 crc kubenswrapper[4771]: I0220 00:03:32.468388 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefa8dc6-536d-442a-a4ea-b003328c2d78" path="/var/lib/kubelet/pods/cefa8dc6-536d-442a-a4ea-b003328c2d78/volumes"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.783307 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mfvb2"]
Feb 20 00:03:46 crc kubenswrapper[4771]: E0220 00:03:46.786363 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf7e327-cc23-4ebf-a435-b22c21a1d4ec" containerName="cinder-db-purge"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.786407 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf7e327-cc23-4ebf-a435-b22c21a1d4ec" containerName="cinder-db-purge"
Feb 20 00:03:46 crc kubenswrapper[4771]: E0220 00:03:46.786470 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635e94aa-984b-4340-b733-898071f7a59c" containerName="keystone-cron"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.786657 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="635e94aa-984b-4340-b733-898071f7a59c" containerName="keystone-cron"
Feb 20 00:03:46 crc kubenswrapper[4771]: E0220 00:03:46.786713 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869aec74-18fd-4c1d-a474-8408353d1208" containerName="glance-dbpurge"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.786733 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="869aec74-18fd-4c1d-a474-8408353d1208" containerName="glance-dbpurge"
Feb 20 00:03:46 crc kubenswrapper[4771]: E0220 00:03:46.786818 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefa8dc6-536d-442a-a4ea-b003328c2d78" containerName="adoption"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.786836 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefa8dc6-536d-442a-a4ea-b003328c2d78" containerName="adoption"
Feb 20 00:03:46 crc kubenswrapper[4771]: E0220 00:03:46.786909 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3d4f2d-0556-4a9d-8f84-e62136a71eac" containerName="heat-dbpurge"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.786937 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3d4f2d-0556-4a9d-8f84-e62136a71eac" containerName="heat-dbpurge"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.790572 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefa8dc6-536d-442a-a4ea-b003328c2d78" containerName="adoption"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.790631 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="635e94aa-984b-4340-b733-898071f7a59c" containerName="keystone-cron"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.790671 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3d4f2d-0556-4a9d-8f84-e62136a71eac" containerName="heat-dbpurge"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.790697 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf7e327-cc23-4ebf-a435-b22c21a1d4ec" containerName="cinder-db-purge"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.790766 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="869aec74-18fd-4c1d-a474-8408353d1208" containerName="glance-dbpurge"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.799761 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mfvb2"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.815012 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mfvb2"]
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.821470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxbhc\" (UniqueName: \"kubernetes.io/projected/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-kube-api-access-pxbhc\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.821687 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-utilities\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.821729 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-catalog-content\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2"
Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.925188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxbhc\" (UniqueName:
\"kubernetes.io/projected/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-kube-api-access-pxbhc\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.925534 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-utilities\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.925563 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-catalog-content\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.925988 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-utilities\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.927449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-catalog-content\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:46 crc kubenswrapper[4771]: I0220 00:03:46.947450 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxbhc\" (UniqueName: 
\"kubernetes.io/projected/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-kube-api-access-pxbhc\") pod \"community-operators-mfvb2\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:47 crc kubenswrapper[4771]: I0220 00:03:47.129606 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:47 crc kubenswrapper[4771]: I0220 00:03:47.735480 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mfvb2"] Feb 20 00:03:48 crc kubenswrapper[4771]: I0220 00:03:48.746865 4771 generic.go:334] "Generic (PLEG): container finished" podID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerID="5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416" exitCode=0 Feb 20 00:03:48 crc kubenswrapper[4771]: I0220 00:03:48.746957 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfvb2" event={"ID":"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542","Type":"ContainerDied","Data":"5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416"} Feb 20 00:03:48 crc kubenswrapper[4771]: I0220 00:03:48.747268 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfvb2" event={"ID":"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542","Type":"ContainerStarted","Data":"f0aeefcaa96785868362480db6299bbfebec97869b35767b549990785e0c0d1a"} Feb 20 00:03:48 crc kubenswrapper[4771]: I0220 00:03:48.750605 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 00:03:49 crc kubenswrapper[4771]: I0220 00:03:49.771648 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfvb2" event={"ID":"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542","Type":"ContainerStarted","Data":"8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b"} Feb 20 00:03:51 
crc kubenswrapper[4771]: I0220 00:03:51.807087 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfvb2" event={"ID":"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542","Type":"ContainerDied","Data":"8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b"} Feb 20 00:03:51 crc kubenswrapper[4771]: I0220 00:03:51.807004 4771 generic.go:334] "Generic (PLEG): container finished" podID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerID="8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b" exitCode=0 Feb 20 00:03:52 crc kubenswrapper[4771]: I0220 00:03:52.821963 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfvb2" event={"ID":"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542","Type":"ContainerStarted","Data":"88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492"} Feb 20 00:03:52 crc kubenswrapper[4771]: I0220 00:03:52.853593 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mfvb2" podStartSLOduration=3.327112589 podStartE2EDuration="6.85357267s" podCreationTimestamp="2026-02-20 00:03:46 +0000 UTC" firstStartedPulling="2026-02-20 00:03:48.750121227 +0000 UTC m=+9329.021563727" lastFinishedPulling="2026-02-20 00:03:52.276581348 +0000 UTC m=+9332.548023808" observedRunningTime="2026-02-20 00:03:52.848671059 +0000 UTC m=+9333.120113550" watchObservedRunningTime="2026-02-20 00:03:52.85357267 +0000 UTC m=+9333.125015160" Feb 20 00:03:57 crc kubenswrapper[4771]: I0220 00:03:57.129906 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:57 crc kubenswrapper[4771]: I0220 00:03:57.130208 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:57 crc kubenswrapper[4771]: I0220 00:03:57.208818 4771 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:57 crc kubenswrapper[4771]: I0220 00:03:57.994013 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:03:58 crc kubenswrapper[4771]: I0220 00:03:58.055261 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mfvb2"] Feb 20 00:03:59 crc kubenswrapper[4771]: I0220 00:03:59.943778 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mfvb2" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerName="registry-server" containerID="cri-o://88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492" gracePeriod=2 Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.542344 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.674798 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-utilities\") pod \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.674983 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-catalog-content\") pod \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.675696 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-utilities" (OuterVolumeSpecName: "utilities") pod 
"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" (UID: "4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.675983 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxbhc\" (UniqueName: \"kubernetes.io/projected/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-kube-api-access-pxbhc\") pod \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\" (UID: \"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542\") " Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.677081 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.724923 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" (UID: "4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.765461 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-kube-api-access-pxbhc" (OuterVolumeSpecName: "kube-api-access-pxbhc") pod "4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" (UID: "4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542"). InnerVolumeSpecName "kube-api-access-pxbhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.779464 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.779506 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxbhc\" (UniqueName: \"kubernetes.io/projected/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542-kube-api-access-pxbhc\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.957219 4771 generic.go:334] "Generic (PLEG): container finished" podID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerID="88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492" exitCode=0 Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.957418 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfvb2" event={"ID":"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542","Type":"ContainerDied","Data":"88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492"} Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.957512 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mfvb2" Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.957533 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfvb2" event={"ID":"4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542","Type":"ContainerDied","Data":"f0aeefcaa96785868362480db6299bbfebec97869b35767b549990785e0c0d1a"} Feb 20 00:04:00 crc kubenswrapper[4771]: I0220 00:04:00.957577 4771 scope.go:117] "RemoveContainer" containerID="88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.000315 4771 scope.go:117] "RemoveContainer" containerID="8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.008493 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mfvb2"] Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.024061 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mfvb2"] Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.030758 4771 scope.go:117] "RemoveContainer" containerID="5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.080458 4771 scope.go:117] "RemoveContainer" containerID="88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492" Feb 20 00:04:01 crc kubenswrapper[4771]: E0220 00:04:01.080965 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492\": container with ID starting with 88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492 not found: ID does not exist" containerID="88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.081009 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492"} err="failed to get container status \"88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492\": rpc error: code = NotFound desc = could not find container \"88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492\": container with ID starting with 88e66a5bea90b920aca43f317714372f022d221a5bcf04d33b869a8e55c53492 not found: ID does not exist" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.081052 4771 scope.go:117] "RemoveContainer" containerID="8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b" Feb 20 00:04:01 crc kubenswrapper[4771]: E0220 00:04:01.081450 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b\": container with ID starting with 8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b not found: ID does not exist" containerID="8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.081480 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b"} err="failed to get container status \"8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b\": rpc error: code = NotFound desc = could not find container \"8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b\": container with ID starting with 8e7e8996427590c17f5f4436ee231ede336d24e4012400bf67f8e67e0e3eb48b not found: ID does not exist" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.081499 4771 scope.go:117] "RemoveContainer" containerID="5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416" Feb 20 00:04:01 crc kubenswrapper[4771]: E0220 
00:04:01.081943 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416\": container with ID starting with 5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416 not found: ID does not exist" containerID="5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.082012 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416"} err="failed to get container status \"5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416\": rpc error: code = NotFound desc = could not find container \"5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416\": container with ID starting with 5d263ddd002ec8c2df506ece2afde316f4a2b47bb70e3949324bf0b5799e1416 not found: ID does not exist" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.709006 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.800517 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0855ef56-7675-40fe-81bb-26a511a7d0ff-ovn-data-cert\") pod \"0855ef56-7675-40fe-81bb-26a511a7d0ff\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.800650 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg769\" (UniqueName: \"kubernetes.io/projected/0855ef56-7675-40fe-81bb-26a511a7d0ff-kube-api-access-tg769\") pod \"0855ef56-7675-40fe-81bb-26a511a7d0ff\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.801468 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\") pod \"0855ef56-7675-40fe-81bb-26a511a7d0ff\" (UID: \"0855ef56-7675-40fe-81bb-26a511a7d0ff\") " Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.809269 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0855ef56-7675-40fe-81bb-26a511a7d0ff-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "0855ef56-7675-40fe-81bb-26a511a7d0ff" (UID: "0855ef56-7675-40fe-81bb-26a511a7d0ff"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.818396 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0855ef56-7675-40fe-81bb-26a511a7d0ff-kube-api-access-tg769" (OuterVolumeSpecName: "kube-api-access-tg769") pod "0855ef56-7675-40fe-81bb-26a511a7d0ff" (UID: "0855ef56-7675-40fe-81bb-26a511a7d0ff"). InnerVolumeSpecName "kube-api-access-tg769". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.828778 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca" (OuterVolumeSpecName: "ovn-data") pod "0855ef56-7675-40fe-81bb-26a511a7d0ff" (UID: "0855ef56-7675-40fe-81bb-26a511a7d0ff"). InnerVolumeSpecName "pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.905160 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\") on node \"crc\" " Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.905203 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0855ef56-7675-40fe-81bb-26a511a7d0ff-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.905220 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg769\" (UniqueName: \"kubernetes.io/projected/0855ef56-7675-40fe-81bb-26a511a7d0ff-kube-api-access-tg769\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.948825 4771 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.949095 4771 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca") on node "crc" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.987586 4771 generic.go:334] "Generic (PLEG): container finished" podID="0855ef56-7675-40fe-81bb-26a511a7d0ff" containerID="aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333" exitCode=137 Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.987638 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"0855ef56-7675-40fe-81bb-26a511a7d0ff","Type":"ContainerDied","Data":"aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333"} Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.987667 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"0855ef56-7675-40fe-81bb-26a511a7d0ff","Type":"ContainerDied","Data":"d1563782ac8daa1ecc2c21a36635d14618205e3cdafa0243b80c822b5ca5327d"} Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.987688 4771 scope.go:117] "RemoveContainer" containerID="aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333" Feb 20 00:04:01 crc kubenswrapper[4771]: I0220 00:04:01.987809 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 00:04:02 crc kubenswrapper[4771]: I0220 00:04:02.006909 4771 reconciler_common.go:293] "Volume detached for volume \"pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01df19f9-b56c-4fb9-bc06-5247ad3981ca\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:02 crc kubenswrapper[4771]: I0220 00:04:02.060654 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 00:04:02 crc kubenswrapper[4771]: I0220 00:04:02.070213 4771 scope.go:117] "RemoveContainer" containerID="aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333" Feb 20 00:04:02 crc kubenswrapper[4771]: E0220 00:04:02.073436 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333\": container with ID starting with aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333 not found: ID does not exist" containerID="aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333" Feb 20 00:04:02 crc kubenswrapper[4771]: I0220 00:04:02.073504 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333"} err="failed to get container status \"aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333\": rpc error: code = NotFound desc = could not find container \"aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333\": container with ID starting with aab92aa7366067bc627a556db98d4d10c1e6be2d8d11773c05e758e70bb5b333 not found: ID does not exist" Feb 20 00:04:02 crc kubenswrapper[4771]: I0220 00:04:02.091834 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 00:04:02 crc kubenswrapper[4771]: I0220 00:04:02.463216 4771 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="0855ef56-7675-40fe-81bb-26a511a7d0ff" path="/var/lib/kubelet/pods/0855ef56-7675-40fe-81bb-26a511a7d0ff/volumes" Feb 20 00:04:02 crc kubenswrapper[4771]: I0220 00:04:02.471219 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" path="/var/lib/kubelet/pods/4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542/volumes" Feb 20 00:04:12 crc kubenswrapper[4771]: I0220 00:04:12.956623 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:04:12 crc kubenswrapper[4771]: I0220 00:04:12.957269 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:04:42 crc kubenswrapper[4771]: I0220 00:04:42.956858 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:04:42 crc kubenswrapper[4771]: I0220 00:04:42.957575 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.480302 4771 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-must-gather-fxgqk/must-gather-58sl7"] Feb 20 00:05:07 crc kubenswrapper[4771]: E0220 00:05:07.481277 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerName="extract-utilities" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.481290 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerName="extract-utilities" Feb 20 00:05:07 crc kubenswrapper[4771]: E0220 00:05:07.481299 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerName="registry-server" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.481304 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerName="registry-server" Feb 20 00:05:07 crc kubenswrapper[4771]: E0220 00:05:07.481331 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerName="extract-content" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.481338 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerName="extract-content" Feb 20 00:05:07 crc kubenswrapper[4771]: E0220 00:05:07.481364 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0855ef56-7675-40fe-81bb-26a511a7d0ff" containerName="adoption" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.481371 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0855ef56-7675-40fe-81bb-26a511a7d0ff" containerName="adoption" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.481578 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1e6ea1-9c39-4f8c-86c8-20a0cafe9542" containerName="registry-server" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.481608 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0855ef56-7675-40fe-81bb-26a511a7d0ff" containerName="adoption" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.482657 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.501549 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fxgqk"/"openshift-service-ca.crt" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.501652 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fxgqk"/"default-dockercfg-z84kb" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.501573 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fxgqk"/"kube-root-ca.crt" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.513733 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fxgqk/must-gather-58sl7"] Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.618721 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-must-gather-output\") pod \"must-gather-58sl7\" (UID: \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\") " pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.618800 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbjk\" (UniqueName: \"kubernetes.io/projected/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-kube-api-access-dvbjk\") pod \"must-gather-58sl7\" (UID: \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\") " pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.721378 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-must-gather-output\") pod \"must-gather-58sl7\" (UID: \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\") " pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.721538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbjk\" (UniqueName: \"kubernetes.io/projected/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-kube-api-access-dvbjk\") pod \"must-gather-58sl7\" (UID: \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\") " pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.721735 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-must-gather-output\") pod \"must-gather-58sl7\" (UID: \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\") " pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.741349 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbjk\" (UniqueName: \"kubernetes.io/projected/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-kube-api-access-dvbjk\") pod \"must-gather-58sl7\" (UID: \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\") " pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:05:07 crc kubenswrapper[4771]: I0220 00:05:07.826776 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:05:08 crc kubenswrapper[4771]: I0220 00:05:08.285527 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fxgqk/must-gather-58sl7"] Feb 20 00:05:08 crc kubenswrapper[4771]: I0220 00:05:08.839736 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/must-gather-58sl7" event={"ID":"ace391bd-2fb6-497e-bb5c-ea6135b23e6c","Type":"ContainerStarted","Data":"2ae416dc23ea2da462c5f90fd9f4e55bb562ece08a6c45515b4761021b31efa7"} Feb 20 00:05:12 crc kubenswrapper[4771]: I0220 00:05:12.957269 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:05:12 crc kubenswrapper[4771]: I0220 00:05:12.957860 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:05:12 crc kubenswrapper[4771]: I0220 00:05:12.957907 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 20 00:05:12 crc kubenswrapper[4771]: I0220 00:05:12.958750 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c90920aab93ae317c9d5b4e3c1351eb635c27259dd0f33f09bda1bf6a83023f"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 00:05:12 crc kubenswrapper[4771]: 
I0220 00:05:12.958800 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://4c90920aab93ae317c9d5b4e3c1351eb635c27259dd0f33f09bda1bf6a83023f" gracePeriod=600 Feb 20 00:05:13 crc kubenswrapper[4771]: I0220 00:05:13.913946 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="4c90920aab93ae317c9d5b4e3c1351eb635c27259dd0f33f09bda1bf6a83023f" exitCode=0 Feb 20 00:05:13 crc kubenswrapper[4771]: I0220 00:05:13.914089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"4c90920aab93ae317c9d5b4e3c1351eb635c27259dd0f33f09bda1bf6a83023f"} Feb 20 00:05:13 crc kubenswrapper[4771]: I0220 00:05:13.914328 4771 scope.go:117] "RemoveContainer" containerID="929a9cdb890fb06d5512f4e28bfda41c4203c1819b187601f20f6f2ffedddf11" Feb 20 00:05:15 crc kubenswrapper[4771]: I0220 00:05:15.936525 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/must-gather-58sl7" event={"ID":"ace391bd-2fb6-497e-bb5c-ea6135b23e6c","Type":"ContainerStarted","Data":"8bfb0e0e715388ea61f0794c279367c9a6d205f3de29872a0b2177fc91486b8d"} Feb 20 00:05:15 crc kubenswrapper[4771]: I0220 00:05:15.937102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/must-gather-58sl7" event={"ID":"ace391bd-2fb6-497e-bb5c-ea6135b23e6c","Type":"ContainerStarted","Data":"02cc2dc54b11c2872490f129d3b04cc2a35078329679a10105106dfe55256f40"} Feb 20 00:05:15 crc kubenswrapper[4771]: I0220 00:05:15.942846 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862"} Feb 20 00:05:15 crc kubenswrapper[4771]: I0220 00:05:15.968485 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fxgqk/must-gather-58sl7" podStartSLOduration=2.094054241 podStartE2EDuration="8.968453523s" podCreationTimestamp="2026-02-20 00:05:07 +0000 UTC" firstStartedPulling="2026-02-20 00:05:08.295450709 +0000 UTC m=+9408.566893179" lastFinishedPulling="2026-02-20 00:05:15.169849951 +0000 UTC m=+9415.441292461" observedRunningTime="2026-02-20 00:05:15.957918995 +0000 UTC m=+9416.229361485" watchObservedRunningTime="2026-02-20 00:05:15.968453523 +0000 UTC m=+9416.239896003" Feb 20 00:05:19 crc kubenswrapper[4771]: I0220 00:05:19.511888 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-29pdl"] Feb 20 00:05:19 crc kubenswrapper[4771]: I0220 00:05:19.513915 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:05:19 crc kubenswrapper[4771]: I0220 00:05:19.604140 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-host\") pod \"crc-debug-29pdl\" (UID: \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\") " pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:05:19 crc kubenswrapper[4771]: I0220 00:05:19.604198 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n44g\" (UniqueName: \"kubernetes.io/projected/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-kube-api-access-9n44g\") pod \"crc-debug-29pdl\" (UID: \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\") " pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:05:19 crc kubenswrapper[4771]: I0220 00:05:19.716062 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-host\") pod \"crc-debug-29pdl\" (UID: \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\") " pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:05:19 crc kubenswrapper[4771]: I0220 00:05:19.716123 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n44g\" (UniqueName: \"kubernetes.io/projected/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-kube-api-access-9n44g\") pod \"crc-debug-29pdl\" (UID: \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\") " pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:05:19 crc kubenswrapper[4771]: I0220 00:05:19.716262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-host\") pod \"crc-debug-29pdl\" (UID: \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\") " pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:05:19 crc 
kubenswrapper[4771]: I0220 00:05:19.753469 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n44g\" (UniqueName: \"kubernetes.io/projected/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-kube-api-access-9n44g\") pod \"crc-debug-29pdl\" (UID: \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\") " pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:05:19 crc kubenswrapper[4771]: I0220 00:05:19.832497 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:05:20 crc kubenswrapper[4771]: I0220 00:05:20.003095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/crc-debug-29pdl" event={"ID":"0a5ffb50-60d8-4451-a79f-1d42bdaa6304","Type":"ContainerStarted","Data":"32815c63da0bee91a89db3313cc29964bb5713703b2fc3ada7c54a02a66d5647"} Feb 20 00:05:31 crc kubenswrapper[4771]: I0220 00:05:31.158661 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/crc-debug-29pdl" event={"ID":"0a5ffb50-60d8-4451-a79f-1d42bdaa6304","Type":"ContainerStarted","Data":"9d096888b8e7f88f380d781460cbf0babe23144e1e82d7b88e6b5215c6c8cadb"} Feb 20 00:05:31 crc kubenswrapper[4771]: I0220 00:05:31.196114 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fxgqk/crc-debug-29pdl" podStartSLOduration=1.3373767779999999 podStartE2EDuration="12.196092441s" podCreationTimestamp="2026-02-20 00:05:19 +0000 UTC" firstStartedPulling="2026-02-20 00:05:19.8913382 +0000 UTC m=+9420.162780660" lastFinishedPulling="2026-02-20 00:05:30.750053863 +0000 UTC m=+9431.021496323" observedRunningTime="2026-02-20 00:05:31.184040352 +0000 UTC m=+9431.455482822" watchObservedRunningTime="2026-02-20 00:05:31.196092441 +0000 UTC m=+9431.467534911" Feb 20 00:06:20 crc kubenswrapper[4771]: I0220 00:06:20.694789 4771 generic.go:334] "Generic (PLEG): container finished" podID="0a5ffb50-60d8-4451-a79f-1d42bdaa6304" 
containerID="9d096888b8e7f88f380d781460cbf0babe23144e1e82d7b88e6b5215c6c8cadb" exitCode=0 Feb 20 00:06:20 crc kubenswrapper[4771]: I0220 00:06:20.695336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/crc-debug-29pdl" event={"ID":"0a5ffb50-60d8-4451-a79f-1d42bdaa6304","Type":"ContainerDied","Data":"9d096888b8e7f88f380d781460cbf0babe23144e1e82d7b88e6b5215c6c8cadb"} Feb 20 00:06:21 crc kubenswrapper[4771]: I0220 00:06:21.847987 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:06:21 crc kubenswrapper[4771]: I0220 00:06:21.892447 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-29pdl"] Feb 20 00:06:21 crc kubenswrapper[4771]: I0220 00:06:21.914634 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-29pdl"] Feb 20 00:06:21 crc kubenswrapper[4771]: I0220 00:06:21.957764 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n44g\" (UniqueName: \"kubernetes.io/projected/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-kube-api-access-9n44g\") pod \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\" (UID: \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\") " Feb 20 00:06:21 crc kubenswrapper[4771]: I0220 00:06:21.958125 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-host\") pod \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\" (UID: \"0a5ffb50-60d8-4451-a79f-1d42bdaa6304\") " Feb 20 00:06:21 crc kubenswrapper[4771]: I0220 00:06:21.958211 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-host" (OuterVolumeSpecName: "host") pod "0a5ffb50-60d8-4451-a79f-1d42bdaa6304" (UID: "0a5ffb50-60d8-4451-a79f-1d42bdaa6304"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 00:06:21 crc kubenswrapper[4771]: I0220 00:06:21.958769 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-host\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:21 crc kubenswrapper[4771]: I0220 00:06:21.965882 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-kube-api-access-9n44g" (OuterVolumeSpecName: "kube-api-access-9n44g") pod "0a5ffb50-60d8-4451-a79f-1d42bdaa6304" (UID: "0a5ffb50-60d8-4451-a79f-1d42bdaa6304"). InnerVolumeSpecName "kube-api-access-9n44g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:22 crc kubenswrapper[4771]: I0220 00:06:22.060329 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n44g\" (UniqueName: \"kubernetes.io/projected/0a5ffb50-60d8-4451-a79f-1d42bdaa6304-kube-api-access-9n44g\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:22 crc kubenswrapper[4771]: I0220 00:06:22.449263 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a5ffb50-60d8-4451-a79f-1d42bdaa6304" path="/var/lib/kubelet/pods/0a5ffb50-60d8-4451-a79f-1d42bdaa6304/volumes" Feb 20 00:06:22 crc kubenswrapper[4771]: I0220 00:06:22.721526 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-29pdl" Feb 20 00:06:22 crc kubenswrapper[4771]: I0220 00:06:22.721725 4771 scope.go:117] "RemoveContainer" containerID="9d096888b8e7f88f380d781460cbf0babe23144e1e82d7b88e6b5215c6c8cadb" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.085688 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-5sjc4"] Feb 20 00:06:23 crc kubenswrapper[4771]: E0220 00:06:23.086244 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a5ffb50-60d8-4451-a79f-1d42bdaa6304" containerName="container-00" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.086260 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a5ffb50-60d8-4451-a79f-1d42bdaa6304" containerName="container-00" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.086512 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a5ffb50-60d8-4451-a79f-1d42bdaa6304" containerName="container-00" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.087995 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.285407 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-host\") pod \"crc-debug-5sjc4\" (UID: \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\") " pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.285763 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnhrw\" (UniqueName: \"kubernetes.io/projected/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-kube-api-access-tnhrw\") pod \"crc-debug-5sjc4\" (UID: \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\") " pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.388511 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-host\") pod \"crc-debug-5sjc4\" (UID: \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\") " pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.388588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnhrw\" (UniqueName: \"kubernetes.io/projected/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-kube-api-access-tnhrw\") pod \"crc-debug-5sjc4\" (UID: \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\") " pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.388940 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-host\") pod \"crc-debug-5sjc4\" (UID: \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\") " pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:23 crc 
kubenswrapper[4771]: I0220 00:06:23.407604 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnhrw\" (UniqueName: \"kubernetes.io/projected/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-kube-api-access-tnhrw\") pod \"crc-debug-5sjc4\" (UID: \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\") " pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:23 crc kubenswrapper[4771]: I0220 00:06:23.702077 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:24 crc kubenswrapper[4771]: I0220 00:06:24.775220 4771 generic.go:334] "Generic (PLEG): container finished" podID="9dd62027-6063-45e1-9b17-ce4b38ccd6bf" containerID="abd878965b928074cce4fb29d14e34003bda688cab86d0320b5bf1810180d863" exitCode=0 Feb 20 00:06:24 crc kubenswrapper[4771]: I0220 00:06:24.775264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" event={"ID":"9dd62027-6063-45e1-9b17-ce4b38ccd6bf","Type":"ContainerDied","Data":"abd878965b928074cce4fb29d14e34003bda688cab86d0320b5bf1810180d863"} Feb 20 00:06:24 crc kubenswrapper[4771]: I0220 00:06:24.776165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" event={"ID":"9dd62027-6063-45e1-9b17-ce4b38ccd6bf","Type":"ContainerStarted","Data":"94da56ed29302b6f5e32e34c95996b2b8877d897c0e45b8b6340ac1170544312"} Feb 20 00:06:25 crc kubenswrapper[4771]: I0220 00:06:25.412013 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-5sjc4"] Feb 20 00:06:25 crc kubenswrapper[4771]: I0220 00:06:25.421561 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-5sjc4"] Feb 20 00:06:25 crc kubenswrapper[4771]: I0220 00:06:25.945607 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.046745 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnhrw\" (UniqueName: \"kubernetes.io/projected/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-kube-api-access-tnhrw\") pod \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\" (UID: \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\") " Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.046834 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-host\") pod \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\" (UID: \"9dd62027-6063-45e1-9b17-ce4b38ccd6bf\") " Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.047307 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-host" (OuterVolumeSpecName: "host") pod "9dd62027-6063-45e1-9b17-ce4b38ccd6bf" (UID: "9dd62027-6063-45e1-9b17-ce4b38ccd6bf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.047824 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-host\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.058755 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-kube-api-access-tnhrw" (OuterVolumeSpecName: "kube-api-access-tnhrw") pod "9dd62027-6063-45e1-9b17-ce4b38ccd6bf" (UID: "9dd62027-6063-45e1-9b17-ce4b38ccd6bf"). InnerVolumeSpecName "kube-api-access-tnhrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.150710 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnhrw\" (UniqueName: \"kubernetes.io/projected/9dd62027-6063-45e1-9b17-ce4b38ccd6bf-kube-api-access-tnhrw\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.456733 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd62027-6063-45e1-9b17-ce4b38ccd6bf" path="/var/lib/kubelet/pods/9dd62027-6063-45e1-9b17-ce4b38ccd6bf/volumes" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.619788 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-bljcf"] Feb 20 00:06:26 crc kubenswrapper[4771]: E0220 00:06:26.620333 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd62027-6063-45e1-9b17-ce4b38ccd6bf" containerName="container-00" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.620357 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd62027-6063-45e1-9b17-ce4b38ccd6bf" containerName="container-00" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.620630 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd62027-6063-45e1-9b17-ce4b38ccd6bf" containerName="container-00" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.621551 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.764780 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-host\") pod \"crc-debug-bljcf\" (UID: \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\") " pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.764897 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdvr6\" (UniqueName: \"kubernetes.io/projected/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-kube-api-access-zdvr6\") pod \"crc-debug-bljcf\" (UID: \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\") " pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.798522 4771 scope.go:117] "RemoveContainer" containerID="abd878965b928074cce4fb29d14e34003bda688cab86d0320b5bf1810180d863" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.798555 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-5sjc4" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.872291 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-host\") pod \"crc-debug-bljcf\" (UID: \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\") " pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.872489 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdvr6\" (UniqueName: \"kubernetes.io/projected/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-kube-api-access-zdvr6\") pod \"crc-debug-bljcf\" (UID: \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\") " pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.872792 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-host\") pod \"crc-debug-bljcf\" (UID: \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\") " pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.897697 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdvr6\" (UniqueName: \"kubernetes.io/projected/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-kube-api-access-zdvr6\") pod \"crc-debug-bljcf\" (UID: \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\") " pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:26 crc kubenswrapper[4771]: I0220 00:06:26.941257 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:27 crc kubenswrapper[4771]: I0220 00:06:27.817140 4771 generic.go:334] "Generic (PLEG): container finished" podID="e1fc4ba1-0682-47d8-b416-f881f80cd7f3" containerID="92590f3b22eff906b680f0b71b5bb63061d820991d62754102458c0974341be3" exitCode=0 Feb 20 00:06:27 crc kubenswrapper[4771]: I0220 00:06:27.817273 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/crc-debug-bljcf" event={"ID":"e1fc4ba1-0682-47d8-b416-f881f80cd7f3","Type":"ContainerDied","Data":"92590f3b22eff906b680f0b71b5bb63061d820991d62754102458c0974341be3"} Feb 20 00:06:27 crc kubenswrapper[4771]: I0220 00:06:27.817656 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/crc-debug-bljcf" event={"ID":"e1fc4ba1-0682-47d8-b416-f881f80cd7f3","Type":"ContainerStarted","Data":"11e6379ed36e645eb6ceeab12ee043c4a171040c66b865bdb27374630ea27943"} Feb 20 00:06:27 crc kubenswrapper[4771]: I0220 00:06:27.876455 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-bljcf"] Feb 20 00:06:27 crc kubenswrapper[4771]: I0220 00:06:27.893081 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fxgqk/crc-debug-bljcf"] Feb 20 00:06:28 crc kubenswrapper[4771]: I0220 00:06:28.973251 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:29 crc kubenswrapper[4771]: I0220 00:06:29.126223 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdvr6\" (UniqueName: \"kubernetes.io/projected/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-kube-api-access-zdvr6\") pod \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\" (UID: \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\") " Feb 20 00:06:29 crc kubenswrapper[4771]: I0220 00:06:29.126299 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-host\") pod \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\" (UID: \"e1fc4ba1-0682-47d8-b416-f881f80cd7f3\") " Feb 20 00:06:29 crc kubenswrapper[4771]: I0220 00:06:29.126466 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-host" (OuterVolumeSpecName: "host") pod "e1fc4ba1-0682-47d8-b416-f881f80cd7f3" (UID: "e1fc4ba1-0682-47d8-b416-f881f80cd7f3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 00:06:29 crc kubenswrapper[4771]: I0220 00:06:29.127057 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-host\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:29 crc kubenswrapper[4771]: I0220 00:06:29.134347 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-kube-api-access-zdvr6" (OuterVolumeSpecName: "kube-api-access-zdvr6") pod "e1fc4ba1-0682-47d8-b416-f881f80cd7f3" (UID: "e1fc4ba1-0682-47d8-b416-f881f80cd7f3"). InnerVolumeSpecName "kube-api-access-zdvr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:29 crc kubenswrapper[4771]: I0220 00:06:29.229999 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdvr6\" (UniqueName: \"kubernetes.io/projected/e1fc4ba1-0682-47d8-b416-f881f80cd7f3-kube-api-access-zdvr6\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:29 crc kubenswrapper[4771]: I0220 00:06:29.867354 4771 scope.go:117] "RemoveContainer" containerID="92590f3b22eff906b680f0b71b5bb63061d820991d62754102458c0974341be3" Feb 20 00:06:29 crc kubenswrapper[4771]: I0220 00:06:29.867722 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxgqk/crc-debug-bljcf" Feb 20 00:06:30 crc kubenswrapper[4771]: I0220 00:06:30.460566 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1fc4ba1-0682-47d8-b416-f881f80cd7f3" path="/var/lib/kubelet/pods/e1fc4ba1-0682-47d8-b416-f881f80cd7f3/volumes" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.656330 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6kp6q"] Feb 20 00:06:38 crc kubenswrapper[4771]: E0220 00:06:38.657615 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fc4ba1-0682-47d8-b416-f881f80cd7f3" containerName="container-00" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.657636 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fc4ba1-0682-47d8-b416-f881f80cd7f3" containerName="container-00" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.657899 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fc4ba1-0682-47d8-b416-f881f80cd7f3" containerName="container-00" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.659891 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.671601 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kp6q"] Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.797527 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-catalog-content\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.797924 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-utilities\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.798367 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k648\" (UniqueName: \"kubernetes.io/projected/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-kube-api-access-9k648\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.899742 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-catalog-content\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.899915 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-utilities\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.900006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k648\" (UniqueName: \"kubernetes.io/projected/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-kube-api-access-9k648\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.900705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-catalog-content\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.900929 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-utilities\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.921103 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k648\" (UniqueName: \"kubernetes.io/projected/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-kube-api-access-9k648\") pod \"redhat-marketplace-6kp6q\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:38 crc kubenswrapper[4771]: I0220 00:06:38.992424 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:39 crc kubenswrapper[4771]: I0220 00:06:39.521663 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kp6q"] Feb 20 00:06:39 crc kubenswrapper[4771]: I0220 00:06:39.989753 4771 generic.go:334] "Generic (PLEG): container finished" podID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerID="24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7" exitCode=0 Feb 20 00:06:39 crc kubenswrapper[4771]: I0220 00:06:39.989802 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kp6q" event={"ID":"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662","Type":"ContainerDied","Data":"24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7"} Feb 20 00:06:39 crc kubenswrapper[4771]: I0220 00:06:39.989831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kp6q" event={"ID":"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662","Type":"ContainerStarted","Data":"0ebb49e37a0a7d4917fd74324a6503875638b326772728a1641b62c04c22f405"} Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.026777 4771 generic.go:334] "Generic (PLEG): container finished" podID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerID="d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672" exitCode=0 Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.026890 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kp6q" event={"ID":"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662","Type":"ContainerDied","Data":"d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672"} Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.449318 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4lfnl"] Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.453526 4771 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.468816 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4lfnl"] Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.617112 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5mf\" (UniqueName: \"kubernetes.io/projected/60148ad1-92a8-44cd-80f9-99d873fe8f47-kube-api-access-rk5mf\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.617177 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-catalog-content\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.617209 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-utilities\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.719800 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5mf\" (UniqueName: \"kubernetes.io/projected/60148ad1-92a8-44cd-80f9-99d873fe8f47-kube-api-access-rk5mf\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.719864 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-catalog-content\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.719884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-utilities\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.720481 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-catalog-content\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.720571 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-utilities\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.740982 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5mf\" (UniqueName: \"kubernetes.io/projected/60148ad1-92a8-44cd-80f9-99d873fe8f47-kube-api-access-rk5mf\") pod \"certified-operators-4lfnl\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:43 crc kubenswrapper[4771]: I0220 00:06:43.781849 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:44 crc kubenswrapper[4771]: I0220 00:06:44.053681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kp6q" event={"ID":"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662","Type":"ContainerStarted","Data":"cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5"} Feb 20 00:06:44 crc kubenswrapper[4771]: I0220 00:06:44.084623 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6kp6q" podStartSLOduration=2.6175730379999997 podStartE2EDuration="6.084604769s" podCreationTimestamp="2026-02-20 00:06:38 +0000 UTC" firstStartedPulling="2026-02-20 00:06:39.991370336 +0000 UTC m=+9500.262812806" lastFinishedPulling="2026-02-20 00:06:43.458402057 +0000 UTC m=+9503.729844537" observedRunningTime="2026-02-20 00:06:44.080674855 +0000 UTC m=+9504.352117335" watchObservedRunningTime="2026-02-20 00:06:44.084604769 +0000 UTC m=+9504.356047239" Feb 20 00:06:44 crc kubenswrapper[4771]: I0220 00:06:44.433007 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4lfnl"] Feb 20 00:06:44 crc kubenswrapper[4771]: W0220 00:06:44.481772 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60148ad1_92a8_44cd_80f9_99d873fe8f47.slice/crio-0394196a57843ede9352f8078cbe556ccfa53f9e10bfbc59195e64cd6d9826e3 WatchSource:0}: Error finding container 0394196a57843ede9352f8078cbe556ccfa53f9e10bfbc59195e64cd6d9826e3: Status 404 returned error can't find the container with id 0394196a57843ede9352f8078cbe556ccfa53f9e10bfbc59195e64cd6d9826e3 Feb 20 00:06:45 crc kubenswrapper[4771]: I0220 00:06:45.063889 4771 generic.go:334] "Generic (PLEG): container finished" podID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerID="a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1" 
exitCode=0 Feb 20 00:06:45 crc kubenswrapper[4771]: I0220 00:06:45.063999 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lfnl" event={"ID":"60148ad1-92a8-44cd-80f9-99d873fe8f47","Type":"ContainerDied","Data":"a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1"} Feb 20 00:06:45 crc kubenswrapper[4771]: I0220 00:06:45.064427 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lfnl" event={"ID":"60148ad1-92a8-44cd-80f9-99d873fe8f47","Type":"ContainerStarted","Data":"0394196a57843ede9352f8078cbe556ccfa53f9e10bfbc59195e64cd6d9826e3"} Feb 20 00:06:46 crc kubenswrapper[4771]: I0220 00:06:46.083835 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lfnl" event={"ID":"60148ad1-92a8-44cd-80f9-99d873fe8f47","Type":"ContainerStarted","Data":"a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e"} Feb 20 00:06:48 crc kubenswrapper[4771]: I0220 00:06:48.109970 4771 generic.go:334] "Generic (PLEG): container finished" podID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerID="a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e" exitCode=0 Feb 20 00:06:48 crc kubenswrapper[4771]: I0220 00:06:48.110238 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lfnl" event={"ID":"60148ad1-92a8-44cd-80f9-99d873fe8f47","Type":"ContainerDied","Data":"a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e"} Feb 20 00:06:48 crc kubenswrapper[4771]: I0220 00:06:48.994214 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:48 crc kubenswrapper[4771]: I0220 00:06:48.994529 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:49 crc kubenswrapper[4771]: I0220 00:06:49.079623 
4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:49 crc kubenswrapper[4771]: I0220 00:06:49.124970 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lfnl" event={"ID":"60148ad1-92a8-44cd-80f9-99d873fe8f47","Type":"ContainerStarted","Data":"06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a"} Feb 20 00:06:49 crc kubenswrapper[4771]: I0220 00:06:49.159162 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4lfnl" podStartSLOduration=2.683630112 podStartE2EDuration="6.159137236s" podCreationTimestamp="2026-02-20 00:06:43 +0000 UTC" firstStartedPulling="2026-02-20 00:06:45.066349335 +0000 UTC m=+9505.337791815" lastFinishedPulling="2026-02-20 00:06:48.541856439 +0000 UTC m=+9508.813298939" observedRunningTime="2026-02-20 00:06:49.149408278 +0000 UTC m=+9509.420850818" watchObservedRunningTime="2026-02-20 00:06:49.159137236 +0000 UTC m=+9509.430579726" Feb 20 00:06:49 crc kubenswrapper[4771]: I0220 00:06:49.198515 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:51 crc kubenswrapper[4771]: I0220 00:06:51.440192 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kp6q"] Feb 20 00:06:51 crc kubenswrapper[4771]: I0220 00:06:51.441232 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6kp6q" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerName="registry-server" containerID="cri-o://cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5" gracePeriod=2 Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.021603 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.147354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-utilities\") pod \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.147731 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k648\" (UniqueName: \"kubernetes.io/projected/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-kube-api-access-9k648\") pod \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.147961 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-utilities" (OuterVolumeSpecName: "utilities") pod "42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" (UID: "42ce7cb0-f5d3-4ed3-9d7d-4961875d1662"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.148107 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-catalog-content\") pod \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\" (UID: \"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662\") " Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.149087 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.154296 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-kube-api-access-9k648" (OuterVolumeSpecName: "kube-api-access-9k648") pod "42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" (UID: "42ce7cb0-f5d3-4ed3-9d7d-4961875d1662"). InnerVolumeSpecName "kube-api-access-9k648". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.162432 4771 generic.go:334] "Generic (PLEG): container finished" podID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerID="cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5" exitCode=0 Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.162497 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kp6q" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.162488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kp6q" event={"ID":"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662","Type":"ContainerDied","Data":"cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5"} Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.162556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kp6q" event={"ID":"42ce7cb0-f5d3-4ed3-9d7d-4961875d1662","Type":"ContainerDied","Data":"0ebb49e37a0a7d4917fd74324a6503875638b326772728a1641b62c04c22f405"} Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.162579 4771 scope.go:117] "RemoveContainer" containerID="cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.178236 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" (UID: "42ce7cb0-f5d3-4ed3-9d7d-4961875d1662"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.220053 4771 scope.go:117] "RemoveContainer" containerID="d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.251699 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.251747 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k648\" (UniqueName: \"kubernetes.io/projected/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662-kube-api-access-9k648\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.256318 4771 scope.go:117] "RemoveContainer" containerID="24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.301773 4771 scope.go:117] "RemoveContainer" containerID="cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5" Feb 20 00:06:52 crc kubenswrapper[4771]: E0220 00:06:52.302397 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5\": container with ID starting with cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5 not found: ID does not exist" containerID="cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.302450 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5"} err="failed to get container status \"cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5\": rpc error: code = NotFound desc = could not find 
container \"cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5\": container with ID starting with cede093979836bddd7662d608254570b00a4480ab399552a72cf1ed92979fff5 not found: ID does not exist" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.302492 4771 scope.go:117] "RemoveContainer" containerID="d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672" Feb 20 00:06:52 crc kubenswrapper[4771]: E0220 00:06:52.303063 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672\": container with ID starting with d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672 not found: ID does not exist" containerID="d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.303096 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672"} err="failed to get container status \"d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672\": rpc error: code = NotFound desc = could not find container \"d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672\": container with ID starting with d1a41718832f43a0b602b80d92bc5cc8dd360ff6764777a9971138b5db0dc672 not found: ID does not exist" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.303121 4771 scope.go:117] "RemoveContainer" containerID="24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7" Feb 20 00:06:52 crc kubenswrapper[4771]: E0220 00:06:52.303606 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7\": container with ID starting with 24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7 not found: ID does 
not exist" containerID="24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.303663 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7"} err="failed to get container status \"24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7\": rpc error: code = NotFound desc = could not find container \"24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7\": container with ID starting with 24e75d2f618fc4752f7d0e8f5fe7459c096b598c6d2e4b72d6969db4ecb259d7 not found: ID does not exist" Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.490245 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kp6q"] Feb 20 00:06:52 crc kubenswrapper[4771]: I0220 00:06:52.501236 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kp6q"] Feb 20 00:06:53 crc kubenswrapper[4771]: I0220 00:06:53.783559 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:53 crc kubenswrapper[4771]: I0220 00:06:53.785635 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:53 crc kubenswrapper[4771]: I0220 00:06:53.870575 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:54 crc kubenswrapper[4771]: I0220 00:06:54.272621 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:54 crc kubenswrapper[4771]: I0220 00:06:54.458597 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" 
path="/var/lib/kubelet/pods/42ce7cb0-f5d3-4ed3-9d7d-4961875d1662/volumes" Feb 20 00:06:54 crc kubenswrapper[4771]: I0220 00:06:54.837422 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4lfnl"] Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.210522 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4lfnl" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerName="registry-server" containerID="cri-o://06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a" gracePeriod=2 Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.705914 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.867279 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-utilities\") pod \"60148ad1-92a8-44cd-80f9-99d873fe8f47\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.867486 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk5mf\" (UniqueName: \"kubernetes.io/projected/60148ad1-92a8-44cd-80f9-99d873fe8f47-kube-api-access-rk5mf\") pod \"60148ad1-92a8-44cd-80f9-99d873fe8f47\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.868209 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-catalog-content\") pod \"60148ad1-92a8-44cd-80f9-99d873fe8f47\" (UID: \"60148ad1-92a8-44cd-80f9-99d873fe8f47\") " Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.868217 4771 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-utilities" (OuterVolumeSpecName: "utilities") pod "60148ad1-92a8-44cd-80f9-99d873fe8f47" (UID: "60148ad1-92a8-44cd-80f9-99d873fe8f47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.869515 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.886430 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60148ad1-92a8-44cd-80f9-99d873fe8f47-kube-api-access-rk5mf" (OuterVolumeSpecName: "kube-api-access-rk5mf") pod "60148ad1-92a8-44cd-80f9-99d873fe8f47" (UID: "60148ad1-92a8-44cd-80f9-99d873fe8f47"). InnerVolumeSpecName "kube-api-access-rk5mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.930155 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60148ad1-92a8-44cd-80f9-99d873fe8f47" (UID: "60148ad1-92a8-44cd-80f9-99d873fe8f47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.972267 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk5mf\" (UniqueName: \"kubernetes.io/projected/60148ad1-92a8-44cd-80f9-99d873fe8f47-kube-api-access-rk5mf\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:56 crc kubenswrapper[4771]: I0220 00:06:56.972320 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60148ad1-92a8-44cd-80f9-99d873fe8f47-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.230294 4771 generic.go:334] "Generic (PLEG): container finished" podID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerID="06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a" exitCode=0 Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.230351 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lfnl" event={"ID":"60148ad1-92a8-44cd-80f9-99d873fe8f47","Type":"ContainerDied","Data":"06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a"} Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.231603 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lfnl" event={"ID":"60148ad1-92a8-44cd-80f9-99d873fe8f47","Type":"ContainerDied","Data":"0394196a57843ede9352f8078cbe556ccfa53f9e10bfbc59195e64cd6d9826e3"} Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.230395 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4lfnl" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.231647 4771 scope.go:117] "RemoveContainer" containerID="06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.276099 4771 scope.go:117] "RemoveContainer" containerID="a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.282647 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4lfnl"] Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.296582 4771 scope.go:117] "RemoveContainer" containerID="a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.302406 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4lfnl"] Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.346244 4771 scope.go:117] "RemoveContainer" containerID="06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a" Feb 20 00:06:57 crc kubenswrapper[4771]: E0220 00:06:57.346818 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a\": container with ID starting with 06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a not found: ID does not exist" containerID="06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.346875 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a"} err="failed to get container status \"06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a\": rpc error: code = NotFound desc = could not find 
container \"06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a\": container with ID starting with 06fec84b509fe93326177859239e19d88deb8d46e413df4a79eba8bf4654298a not found: ID does not exist" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.346903 4771 scope.go:117] "RemoveContainer" containerID="a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e" Feb 20 00:06:57 crc kubenswrapper[4771]: E0220 00:06:57.347355 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e\": container with ID starting with a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e not found: ID does not exist" containerID="a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.347390 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e"} err="failed to get container status \"a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e\": rpc error: code = NotFound desc = could not find container \"a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e\": container with ID starting with a801e5a3e2ae1c2887bd0221536d6a5c1763fe2ae91de46bbbe9f3dead5cbe2e not found: ID does not exist" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.347415 4771 scope.go:117] "RemoveContainer" containerID="a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1" Feb 20 00:06:57 crc kubenswrapper[4771]: E0220 00:06:57.347818 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1\": container with ID starting with a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1 not found: ID does 
not exist" containerID="a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1" Feb 20 00:06:57 crc kubenswrapper[4771]: I0220 00:06:57.347838 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1"} err="failed to get container status \"a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1\": rpc error: code = NotFound desc = could not find container \"a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1\": container with ID starting with a71417e9aded0546d888b97324e9f0ea370ca3b94d1834d050650bfdc832f8b1 not found: ID does not exist" Feb 20 00:06:58 crc kubenswrapper[4771]: I0220 00:06:58.453273 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" path="/var/lib/kubelet/pods/60148ad1-92a8-44cd-80f9-99d873fe8f47/volumes" Feb 20 00:07:42 crc kubenswrapper[4771]: I0220 00:07:42.956611 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:07:42 crc kubenswrapper[4771]: I0220 00:07:42.957410 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:08:12 crc kubenswrapper[4771]: I0220 00:08:12.957106 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 20 00:08:12 crc kubenswrapper[4771]: I0220 00:08:12.957979 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:08:42 crc kubenswrapper[4771]: I0220 00:08:42.956879 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:08:42 crc kubenswrapper[4771]: I0220 00:08:42.957597 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:08:42 crc kubenswrapper[4771]: I0220 00:08:42.957665 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 20 00:08:42 crc kubenswrapper[4771]: I0220 00:08:42.958895 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 00:08:42 crc kubenswrapper[4771]: I0220 00:08:42.958993 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" gracePeriod=600 Feb 20 00:08:43 crc kubenswrapper[4771]: E0220 00:08:43.134382 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:08:44 crc kubenswrapper[4771]: I0220 00:08:44.029863 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" exitCode=0 Feb 20 00:08:44 crc kubenswrapper[4771]: I0220 00:08:44.029962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862"} Feb 20 00:08:44 crc kubenswrapper[4771]: I0220 00:08:44.030520 4771 scope.go:117] "RemoveContainer" containerID="4c90920aab93ae317c9d5b4e3c1351eb635c27259dd0f33f09bda1bf6a83023f" Feb 20 00:08:44 crc kubenswrapper[4771]: I0220 00:08:44.031896 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:08:44 crc kubenswrapper[4771]: E0220 00:08:44.032722 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.543045 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xgrcc"] Feb 20 00:08:51 crc kubenswrapper[4771]: E0220 00:08:51.544616 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerName="registry-server" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.544645 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerName="registry-server" Feb 20 00:08:51 crc kubenswrapper[4771]: E0220 00:08:51.544685 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerName="extract-utilities" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.544702 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerName="extract-utilities" Feb 20 00:08:51 crc kubenswrapper[4771]: E0220 00:08:51.544728 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerName="extract-utilities" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.544744 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerName="extract-utilities" Feb 20 00:08:51 crc kubenswrapper[4771]: E0220 00:08:51.544831 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerName="registry-server" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.544849 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerName="registry-server" Feb 20 00:08:51 
crc kubenswrapper[4771]: E0220 00:08:51.544881 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerName="extract-content" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.544897 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerName="extract-content" Feb 20 00:08:51 crc kubenswrapper[4771]: E0220 00:08:51.544938 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerName="extract-content" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.544953 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerName="extract-content" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.545534 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="60148ad1-92a8-44cd-80f9-99d873fe8f47" containerName="registry-server" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.545607 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ce7cb0-f5d3-4ed3-9d7d-4961875d1662" containerName="registry-server" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.549213 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.563940 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xgrcc"] Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.676280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-utilities\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.676522 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96p9c\" (UniqueName: \"kubernetes.io/projected/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-kube-api-access-96p9c\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.676864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-catalog-content\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.779492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-utilities\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.779631 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-96p9c\" (UniqueName: \"kubernetes.io/projected/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-kube-api-access-96p9c\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.779723 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-catalog-content\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.780039 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-utilities\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.780500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-catalog-content\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.798255 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96p9c\" (UniqueName: \"kubernetes.io/projected/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-kube-api-access-96p9c\") pod \"redhat-operators-xgrcc\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:51 crc kubenswrapper[4771]: I0220 00:08:51.880720 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:08:52 crc kubenswrapper[4771]: I0220 00:08:52.355539 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xgrcc"] Feb 20 00:08:53 crc kubenswrapper[4771]: I0220 00:08:53.147213 4771 generic.go:334] "Generic (PLEG): container finished" podID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerID="08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735" exitCode=0 Feb 20 00:08:53 crc kubenswrapper[4771]: I0220 00:08:53.147422 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgrcc" event={"ID":"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10","Type":"ContainerDied","Data":"08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735"} Feb 20 00:08:53 crc kubenswrapper[4771]: I0220 00:08:53.147473 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgrcc" event={"ID":"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10","Type":"ContainerStarted","Data":"681bf14015b65e92318fac7ca2aa177135da505375dc015aa5f0a0968ec3e659"} Feb 20 00:08:53 crc kubenswrapper[4771]: I0220 00:08:53.150073 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 00:08:54 crc kubenswrapper[4771]: I0220 00:08:54.163282 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgrcc" event={"ID":"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10","Type":"ContainerStarted","Data":"ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458"} Feb 20 00:08:57 crc kubenswrapper[4771]: I0220 00:08:57.437213 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:08:57 crc kubenswrapper[4771]: E0220 00:08:57.438157 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:09:00 crc kubenswrapper[4771]: I0220 00:09:00.246693 4771 generic.go:334] "Generic (PLEG): container finished" podID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerID="ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458" exitCode=0 Feb 20 00:09:00 crc kubenswrapper[4771]: I0220 00:09:00.246758 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgrcc" event={"ID":"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10","Type":"ContainerDied","Data":"ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458"} Feb 20 00:09:01 crc kubenswrapper[4771]: I0220 00:09:01.263789 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgrcc" event={"ID":"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10","Type":"ContainerStarted","Data":"b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5"} Feb 20 00:09:01 crc kubenswrapper[4771]: I0220 00:09:01.302641 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xgrcc" podStartSLOduration=2.776118377 podStartE2EDuration="10.302623286s" podCreationTimestamp="2026-02-20 00:08:51 +0000 UTC" firstStartedPulling="2026-02-20 00:08:53.14978001 +0000 UTC m=+9633.421222490" lastFinishedPulling="2026-02-20 00:09:00.676284879 +0000 UTC m=+9640.947727399" observedRunningTime="2026-02-20 00:09:01.291658186 +0000 UTC m=+9641.563100686" watchObservedRunningTime="2026-02-20 00:09:01.302623286 +0000 UTC m=+9641.574065766" Feb 20 00:09:01 crc kubenswrapper[4771]: I0220 00:09:01.888165 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:09:01 crc kubenswrapper[4771]: I0220 00:09:01.889055 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:09:02 crc kubenswrapper[4771]: I0220 00:09:02.956706 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xgrcc" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="registry-server" probeResult="failure" output=< Feb 20 00:09:02 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 20 00:09:02 crc kubenswrapper[4771]: > Feb 20 00:09:11 crc kubenswrapper[4771]: I0220 00:09:11.437281 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:09:11 crc kubenswrapper[4771]: E0220 00:09:11.438175 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:09:11 crc kubenswrapper[4771]: I0220 00:09:11.955579 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:09:12 crc kubenswrapper[4771]: I0220 00:09:12.044808 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:09:12 crc kubenswrapper[4771]: I0220 00:09:12.206381 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xgrcc"] Feb 20 00:09:13 crc kubenswrapper[4771]: I0220 00:09:13.400581 4771 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-xgrcc" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="registry-server" containerID="cri-o://b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5" gracePeriod=2 Feb 20 00:09:13 crc kubenswrapper[4771]: I0220 00:09:13.961656 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.048262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-catalog-content\") pod \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.048325 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-utilities\") pod \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.048366 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96p9c\" (UniqueName: \"kubernetes.io/projected/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-kube-api-access-96p9c\") pod \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\" (UID: \"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10\") " Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.049669 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-utilities" (OuterVolumeSpecName: "utilities") pod "f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" (UID: "f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.055129 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-kube-api-access-96p9c" (OuterVolumeSpecName: "kube-api-access-96p9c") pod "f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" (UID: "f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10"). InnerVolumeSpecName "kube-api-access-96p9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.152926 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.152984 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96p9c\" (UniqueName: \"kubernetes.io/projected/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-kube-api-access-96p9c\") on node \"crc\" DevicePath \"\"" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.221374 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" (UID: "f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.255455 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.428824 4771 generic.go:334] "Generic (PLEG): container finished" podID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerID="b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5" exitCode=0 Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.428876 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgrcc" event={"ID":"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10","Type":"ContainerDied","Data":"b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5"} Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.428928 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgrcc" event={"ID":"f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10","Type":"ContainerDied","Data":"681bf14015b65e92318fac7ca2aa177135da505375dc015aa5f0a0968ec3e659"} Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.428930 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xgrcc" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.428948 4771 scope.go:117] "RemoveContainer" containerID="b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.461444 4771 scope.go:117] "RemoveContainer" containerID="ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.488582 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xgrcc"] Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.497978 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xgrcc"] Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.500520 4771 scope.go:117] "RemoveContainer" containerID="08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.586264 4771 scope.go:117] "RemoveContainer" containerID="b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5" Feb 20 00:09:14 crc kubenswrapper[4771]: E0220 00:09:14.586749 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5\": container with ID starting with b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5 not found: ID does not exist" containerID="b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.586812 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5"} err="failed to get container status \"b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5\": rpc error: code = NotFound desc = could not find container 
\"b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5\": container with ID starting with b8226b68de955eaab77935ae22416733ceb51e82d628fc0ae78b902ca7fab9b5 not found: ID does not exist" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.586850 4771 scope.go:117] "RemoveContainer" containerID="ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458" Feb 20 00:09:14 crc kubenswrapper[4771]: E0220 00:09:14.587316 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458\": container with ID starting with ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458 not found: ID does not exist" containerID="ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.587356 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458"} err="failed to get container status \"ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458\": rpc error: code = NotFound desc = could not find container \"ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458\": container with ID starting with ee4ab0868e6042b201589eae47b5eb9a914dda8215ecaf1bf3514623aa45d458 not found: ID does not exist" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.587385 4771 scope.go:117] "RemoveContainer" containerID="08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735" Feb 20 00:09:14 crc kubenswrapper[4771]: E0220 00:09:14.587653 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735\": container with ID starting with 08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735 not found: ID does not exist" 
containerID="08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735" Feb 20 00:09:14 crc kubenswrapper[4771]: I0220 00:09:14.587703 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735"} err="failed to get container status \"08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735\": rpc error: code = NotFound desc = could not find container \"08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735\": container with ID starting with 08710f4ebe46c2a05eb9baf7e76de294b67d996dd26f197aabf912ee61462735 not found: ID does not exist" Feb 20 00:09:16 crc kubenswrapper[4771]: I0220 00:09:16.447895 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" path="/var/lib/kubelet/pods/f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10/volumes" Feb 20 00:09:26 crc kubenswrapper[4771]: I0220 00:09:26.438120 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:09:26 crc kubenswrapper[4771]: E0220 00:09:26.439116 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:09:38 crc kubenswrapper[4771]: I0220 00:09:38.438663 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:09:38 crc kubenswrapper[4771]: E0220 00:09:38.440142 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:09:49 crc kubenswrapper[4771]: I0220 00:09:49.438248 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:09:49 crc kubenswrapper[4771]: E0220 00:09:49.439343 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:10:01 crc kubenswrapper[4771]: I0220 00:10:01.437103 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:10:01 crc kubenswrapper[4771]: E0220 00:10:01.437942 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:10:12 crc kubenswrapper[4771]: I0220 00:10:12.442497 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:10:12 crc kubenswrapper[4771]: E0220 00:10:12.445107 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:10:25 crc kubenswrapper[4771]: I0220 00:10:25.438889 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:10:25 crc kubenswrapper[4771]: E0220 00:10:25.439962 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:10:38 crc kubenswrapper[4771]: I0220 00:10:38.437663 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:10:38 crc kubenswrapper[4771]: E0220 00:10:38.438612 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:10:49 crc kubenswrapper[4771]: I0220 00:10:49.438197 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:10:49 crc kubenswrapper[4771]: E0220 00:10:49.439812 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:11:03 crc kubenswrapper[4771]: I0220 00:11:03.438037 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:11:03 crc kubenswrapper[4771]: E0220 00:11:03.438961 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:11:17 crc kubenswrapper[4771]: I0220 00:11:17.437593 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:11:17 crc kubenswrapper[4771]: E0220 00:11:17.438287 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:11:32 crc kubenswrapper[4771]: I0220 00:11:32.437329 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:11:32 crc kubenswrapper[4771]: E0220 00:11:32.438182 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:11:44 crc kubenswrapper[4771]: I0220 00:11:44.437850 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:11:44 crc kubenswrapper[4771]: E0220 00:11:44.438732 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:11:55 crc kubenswrapper[4771]: I0220 00:11:55.437905 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:11:55 crc kubenswrapper[4771]: E0220 00:11:55.438544 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:12:09 crc kubenswrapper[4771]: I0220 00:12:09.437845 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:12:09 crc kubenswrapper[4771]: E0220 00:12:09.438626 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:12:21 crc kubenswrapper[4771]: I0220 00:12:21.438271 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:12:21 crc kubenswrapper[4771]: E0220 00:12:21.439483 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:12:35 crc kubenswrapper[4771]: I0220 00:12:35.438513 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:12:35 crc kubenswrapper[4771]: E0220 00:12:35.439632 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:12:48 crc kubenswrapper[4771]: I0220 00:12:48.438366 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:12:48 crc kubenswrapper[4771]: E0220 00:12:48.440010 4771 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:13:02 crc kubenswrapper[4771]: I0220 00:13:02.438342 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:13:02 crc kubenswrapper[4771]: E0220 00:13:02.439226 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:13:16 crc kubenswrapper[4771]: I0220 00:13:16.437832 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:13:16 crc kubenswrapper[4771]: E0220 00:13:16.438494 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:13:31 crc kubenswrapper[4771]: I0220 00:13:31.437579 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:13:31 crc kubenswrapper[4771]: E0220 00:13:31.438786 4771 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:13:46 crc kubenswrapper[4771]: I0220 00:13:46.438943 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:13:47 crc kubenswrapper[4771]: I0220 00:13:47.207262 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"2411b3e172f053d34174b248400f1a0b44c39c7fcfb46f6cb68fc651f7234056"} Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.551318 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ssczv"] Feb 20 00:13:56 crc kubenswrapper[4771]: E0220 00:13:56.552556 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="extract-utilities" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.552574 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="extract-utilities" Feb 20 00:13:56 crc kubenswrapper[4771]: E0220 00:13:56.552611 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="registry-server" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.552620 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="registry-server" Feb 20 00:13:56 crc kubenswrapper[4771]: E0220 00:13:56.552641 4771 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="extract-content" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.552649 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="extract-content" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.552922 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d3255c-ee9e-48f0-a2c0-1ecaa2375a10" containerName="registry-server" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.554396 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.574209 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssczv"] Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.736276 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-catalog-content\") pod \"community-operators-ssczv\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.736328 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b28pg\" (UniqueName: \"kubernetes.io/projected/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-kube-api-access-b28pg\") pod \"community-operators-ssczv\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.736537 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-utilities\") pod 
\"community-operators-ssczv\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.838306 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-catalog-content\") pod \"community-operators-ssczv\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.838392 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b28pg\" (UniqueName: \"kubernetes.io/projected/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-kube-api-access-b28pg\") pod \"community-operators-ssczv\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.838486 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-utilities\") pod \"community-operators-ssczv\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.839042 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-utilities\") pod \"community-operators-ssczv\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.839311 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-catalog-content\") pod \"community-operators-ssczv\" (UID: 
\"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.858007 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b28pg\" (UniqueName: \"kubernetes.io/projected/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-kube-api-access-b28pg\") pod \"community-operators-ssczv\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:56 crc kubenswrapper[4771]: I0220 00:13:56.906937 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:13:57 crc kubenswrapper[4771]: I0220 00:13:57.550328 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssczv"] Feb 20 00:13:58 crc kubenswrapper[4771]: I0220 00:13:58.336370 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerID="2c47303cb48f290638d5a9861f30c08f23431a7a041ba53b3858a6506bc9f0bc" exitCode=0 Feb 20 00:13:58 crc kubenswrapper[4771]: I0220 00:13:58.336837 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssczv" event={"ID":"6b450216-da0a-4b1b-b3f8-57eb3bf9a655","Type":"ContainerDied","Data":"2c47303cb48f290638d5a9861f30c08f23431a7a041ba53b3858a6506bc9f0bc"} Feb 20 00:13:58 crc kubenswrapper[4771]: I0220 00:13:58.336899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssczv" event={"ID":"6b450216-da0a-4b1b-b3f8-57eb3bf9a655","Type":"ContainerStarted","Data":"ba29b7c005de6b40897aa244eb424950f66597926295233ddd17510ef8a07cdd"} Feb 20 00:13:58 crc kubenswrapper[4771]: I0220 00:13:58.338706 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 00:14:00 crc kubenswrapper[4771]: I0220 00:14:00.365617 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssczv" event={"ID":"6b450216-da0a-4b1b-b3f8-57eb3bf9a655","Type":"ContainerStarted","Data":"c626745622687bd52467a2c0d72ba695e9dc6a96067db0e02625940b8a260d15"} Feb 20 00:14:01 crc kubenswrapper[4771]: I0220 00:14:01.375598 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerID="c626745622687bd52467a2c0d72ba695e9dc6a96067db0e02625940b8a260d15" exitCode=0 Feb 20 00:14:01 crc kubenswrapper[4771]: I0220 00:14:01.375698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssczv" event={"ID":"6b450216-da0a-4b1b-b3f8-57eb3bf9a655","Type":"ContainerDied","Data":"c626745622687bd52467a2c0d72ba695e9dc6a96067db0e02625940b8a260d15"} Feb 20 00:14:02 crc kubenswrapper[4771]: I0220 00:14:02.388812 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssczv" event={"ID":"6b450216-da0a-4b1b-b3f8-57eb3bf9a655","Type":"ContainerStarted","Data":"76bb3c16c67f16a93792e85ca157937aa289ca3565fecfa9c8a0a35da0e0555a"} Feb 20 00:14:02 crc kubenswrapper[4771]: I0220 00:14:02.418109 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ssczv" podStartSLOduration=2.908877249 podStartE2EDuration="6.418089852s" podCreationTimestamp="2026-02-20 00:13:56 +0000 UTC" firstStartedPulling="2026-02-20 00:13:58.338507415 +0000 UTC m=+9938.609949885" lastFinishedPulling="2026-02-20 00:14:01.847719978 +0000 UTC m=+9942.119162488" observedRunningTime="2026-02-20 00:14:02.41535548 +0000 UTC m=+9942.686797960" watchObservedRunningTime="2026-02-20 00:14:02.418089852 +0000 UTC m=+9942.689532312" Feb 20 00:14:06 crc kubenswrapper[4771]: I0220 00:14:06.907516 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:14:06 
crc kubenswrapper[4771]: I0220 00:14:06.908085 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:14:06 crc kubenswrapper[4771]: I0220 00:14:06.997222 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:14:07 crc kubenswrapper[4771]: I0220 00:14:07.510659 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:14:07 crc kubenswrapper[4771]: I0220 00:14:07.840063 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssczv"] Feb 20 00:14:09 crc kubenswrapper[4771]: I0220 00:14:09.475007 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ssczv" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerName="registry-server" containerID="cri-o://76bb3c16c67f16a93792e85ca157937aa289ca3565fecfa9c8a0a35da0e0555a" gracePeriod=2 Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.488063 4771 generic.go:334] "Generic (PLEG): container finished" podID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerID="76bb3c16c67f16a93792e85ca157937aa289ca3565fecfa9c8a0a35da0e0555a" exitCode=0 Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.488149 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssczv" event={"ID":"6b450216-da0a-4b1b-b3f8-57eb3bf9a655","Type":"ContainerDied","Data":"76bb3c16c67f16a93792e85ca157937aa289ca3565fecfa9c8a0a35da0e0555a"} Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.488505 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssczv" event={"ID":"6b450216-da0a-4b1b-b3f8-57eb3bf9a655","Type":"ContainerDied","Data":"ba29b7c005de6b40897aa244eb424950f66597926295233ddd17510ef8a07cdd"} Feb 
20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.488522 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba29b7c005de6b40897aa244eb424950f66597926295233ddd17510ef8a07cdd" Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.718678 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.879255 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-catalog-content\") pod \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.879631 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-utilities\") pod \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.879686 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b28pg\" (UniqueName: \"kubernetes.io/projected/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-kube-api-access-b28pg\") pod \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\" (UID: \"6b450216-da0a-4b1b-b3f8-57eb3bf9a655\") " Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.880537 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-utilities" (OuterVolumeSpecName: "utilities") pod "6b450216-da0a-4b1b-b3f8-57eb3bf9a655" (UID: "6b450216-da0a-4b1b-b3f8-57eb3bf9a655"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.893040 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-kube-api-access-b28pg" (OuterVolumeSpecName: "kube-api-access-b28pg") pod "6b450216-da0a-4b1b-b3f8-57eb3bf9a655" (UID: "6b450216-da0a-4b1b-b3f8-57eb3bf9a655"). InnerVolumeSpecName "kube-api-access-b28pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.935000 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b450216-da0a-4b1b-b3f8-57eb3bf9a655" (UID: "6b450216-da0a-4b1b-b3f8-57eb3bf9a655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.982215 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.982245 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b28pg\" (UniqueName: \"kubernetes.io/projected/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-kube-api-access-b28pg\") on node \"crc\" DevicePath \"\"" Feb 20 00:14:10 crc kubenswrapper[4771]: I0220 00:14:10.982255 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b450216-da0a-4b1b-b3f8-57eb3bf9a655-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:14:11 crc kubenswrapper[4771]: I0220 00:14:11.496170 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ssczv" Feb 20 00:14:11 crc kubenswrapper[4771]: I0220 00:14:11.536801 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssczv"] Feb 20 00:14:11 crc kubenswrapper[4771]: I0220 00:14:11.548203 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ssczv"] Feb 20 00:14:12 crc kubenswrapper[4771]: I0220 00:14:12.451494 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" path="/var/lib/kubelet/pods/6b450216-da0a-4b1b-b3f8-57eb3bf9a655/volumes" Feb 20 00:14:17 crc kubenswrapper[4771]: I0220 00:14:17.372366 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_bfa02b55-b912-42a2-948f-69d38c0a6532/init-config-reloader/0.log" Feb 20 00:14:17 crc kubenswrapper[4771]: I0220 00:14:17.616687 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_bfa02b55-b912-42a2-948f-69d38c0a6532/init-config-reloader/0.log" Feb 20 00:14:17 crc kubenswrapper[4771]: I0220 00:14:17.626473 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_bfa02b55-b912-42a2-948f-69d38c0a6532/alertmanager/0.log" Feb 20 00:14:17 crc kubenswrapper[4771]: I0220 00:14:17.711506 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_bfa02b55-b912-42a2-948f-69d38c0a6532/config-reloader/0.log" Feb 20 00:14:17 crc kubenswrapper[4771]: I0220 00:14:17.848587 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f65694fc-cb68-4dfd-9984-7fa66df7ba37/aodh-api/0.log" Feb 20 00:14:17 crc kubenswrapper[4771]: I0220 00:14:17.944704 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f65694fc-cb68-4dfd-9984-7fa66df7ba37/aodh-evaluator/0.log" Feb 20 
00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.048804 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f65694fc-cb68-4dfd-9984-7fa66df7ba37/aodh-listener/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.155419 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f65694fc-cb68-4dfd-9984-7fa66df7ba37/aodh-notifier/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.176941 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55c9c68bd-k2tjn_ef1a2dcb-a395-4516-a64b-0e25c1802d4e/barbican-api/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.267319 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55c9c68bd-k2tjn_ef1a2dcb-a395-4516-a64b-0e25c1802d4e/barbican-api-log/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.407973 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8c545c846-c84s7_09286b82-6a52-4993-99ee-868ad3d84cb7/barbican-keystone-listener/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.600094 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8c545c846-c84s7_09286b82-6a52-4993-99ee-868ad3d84cb7/barbican-keystone-listener-log/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.674038 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b4448d5cc-99fd6_59511b4b-718b-4f8b-8781-1a1c952c0e16/barbican-worker/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.686880 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b4448d5cc-99fd6_59511b4b-718b-4f8b-8781-1a1c952c0e16/barbican-worker-log/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.830667 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-5hlk6_e0006193-39c1-45bb-924e-26c8cb22638b/bootstrap-openstack-openstack-cell1/0.log" Feb 20 00:14:18 crc kubenswrapper[4771]: I0220 00:14:18.913407 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_88c6e800-2233-40b4-aab1-b4a7ac0fbb13/ceilometer-central-agent/1.log" Feb 20 00:14:19 crc kubenswrapper[4771]: I0220 00:14:19.936663 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_88c6e800-2233-40b4-aab1-b4a7ac0fbb13/sg-core/0.log" Feb 20 00:14:19 crc kubenswrapper[4771]: I0220 00:14:19.937872 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_88c6e800-2233-40b4-aab1-b4a7ac0fbb13/ceilometer-central-agent/0.log" Feb 20 00:14:19 crc kubenswrapper[4771]: I0220 00:14:19.957865 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_88c6e800-2233-40b4-aab1-b4a7ac0fbb13/ceilometer-notification-agent/0.log" Feb 20 00:14:19 crc kubenswrapper[4771]: I0220 00:14:19.985581 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_88c6e800-2233-40b4-aab1-b4a7ac0fbb13/proxy-httpd/0.log" Feb 20 00:14:20 crc kubenswrapper[4771]: I0220 00:14:20.169920 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0e3d47d2-991d-42bc-9789-4a65bfee348f/cinder-api-log/0.log" Feb 20 00:14:20 crc kubenswrapper[4771]: I0220 00:14:20.191384 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0e3d47d2-991d-42bc-9789-4a65bfee348f/cinder-api/0.log" Feb 20 00:14:20 crc kubenswrapper[4771]: I0220 00:14:20.344626 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-purge-29525761-n7dz6_7cf7e327-cc23-4ebf-a435-b22c21a1d4ec/cinder-db-purge/0.log" Feb 20 00:14:20 crc kubenswrapper[4771]: I0220 00:14:20.519867 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_00ef58df-5158-4cd3-9280-7e5bb9a054cf/cinder-scheduler/0.log" Feb 20 00:14:20 crc kubenswrapper[4771]: I0220 00:14:20.573407 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_00ef58df-5158-4cd3-9280-7e5bb9a054cf/probe/0.log" Feb 20 00:14:20 crc kubenswrapper[4771]: I0220 00:14:20.798532 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-472r9_0b14a02a-66c7-4b9b-a65f-7cc2fc09ffcd/configure-os-openstack-openstack-cell1/0.log" Feb 20 00:14:20 crc kubenswrapper[4771]: I0220 00:14:20.826491 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-92xj9_0742feef-b258-4e8f-a369-07d7932f47aa/configure-network-openstack-openstack-cell1/0.log" Feb 20 00:14:20 crc kubenswrapper[4771]: I0220 00:14:20.975781 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-564f74d4f7-gtwrv_6182ac80-a485-411b-883b-56b2b74eb9c9/init/0.log" Feb 20 00:14:21 crc kubenswrapper[4771]: I0220 00:14:21.317897 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-564f74d4f7-gtwrv_6182ac80-a485-411b-883b-56b2b74eb9c9/init/0.log" Feb 20 00:14:21 crc kubenswrapper[4771]: I0220 00:14:21.434713 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-564f74d4f7-gtwrv_6182ac80-a485-411b-883b-56b2b74eb9c9/dnsmasq-dns/0.log" Feb 20 00:14:21 crc kubenswrapper[4771]: I0220 00:14:21.460126 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-95ghw_60280b37-1d56-417c-90db-5d8985050b08/download-cache-openstack-openstack-cell1/0.log" Feb 20 00:14:21 crc kubenswrapper[4771]: I0220 00:14:21.652225 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-db-purge-29525761-tmxpz_869aec74-18fd-4c1d-a474-8408353d1208/glance-dbpurge/0.log" Feb 20 00:14:21 crc kubenswrapper[4771]: I0220 00:14:21.708792 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4f7523e0-9170-4d3a-951b-2aae16fb47d3/glance-httpd/0.log" Feb 20 00:14:21 crc kubenswrapper[4771]: I0220 00:14:21.788600 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4f7523e0-9170-4d3a-951b-2aae16fb47d3/glance-log/0.log" Feb 20 00:14:21 crc kubenswrapper[4771]: I0220 00:14:21.890837 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ef8da599-c761-483b-8e1f-0c39e63b7476/glance-log/0.log" Feb 20 00:14:21 crc kubenswrapper[4771]: I0220 00:14:21.897467 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ef8da599-c761-483b-8e1f-0c39e63b7476/glance-httpd/0.log" Feb 20 00:14:22 crc kubenswrapper[4771]: I0220 00:14:22.260562 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-purge-29525761-t9hpm_7e3d4f2d-0556-4a9d-8f84-e62136a71eac/heat-dbpurge/0.log" Feb 20 00:14:22 crc kubenswrapper[4771]: I0220 00:14:22.428142 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-f88dbf5b-scj6q_1eaa80a7-4011-400a-b29b-d7f76e69ecaa/heat-engine/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.010322 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6d5f47dcf5-c4wds_26834139-7112-45a0-bd80-ab038140ff2e/heat-api/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.190130 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-659b48b6d7-rrwt7_67720512-861d-4b44-b21f-78a3ed2ec49b/heat-cfnapi/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.234658 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-tnl7l_ca3ac0ad-d049-4272-b7e0-3eb490e7cef0/install-certs-openstack-openstack-cell1/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.267482 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-66b7bf74db-l2vb5_46a4c64b-a2fd-4fac-814a-055008a0d27a/horizon/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.464123 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-dnk4m_7ca871d9-7d59-49d1-9794-512c7d52bc8d/install-os-openstack-openstack-cell1/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.721370 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-66b7bf74db-l2vb5_46a4c64b-a2fd-4fac-814a-055008a0d27a/horizon-log/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.755782 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7767cf9677-cqzdw_88a37d62-ce17-4ba1-b2fe-5cede6afb25c/keystone-api/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.770975 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525701-jn7pz_1a65030d-bcc9-42d3-80d9-66a100f1882d/keystone-cron/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.932974 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525761-d2rl4_635e94aa-984b-4340-b733-898071f7a59c/keystone-cron/0.log" Feb 20 00:14:23 crc kubenswrapper[4771]: I0220 00:14:23.983309 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5ed8b8ea-3ead-422c-88dd-f7e8421f52b6/kube-state-metrics/0.log" Feb 20 00:14:24 crc kubenswrapper[4771]: I0220 00:14:24.137739 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-4qbx5_7f7c2323-0701-4459-b78c-1e92739da106/libvirt-openstack-openstack-cell1/0.log" 
Feb 20 00:14:24 crc kubenswrapper[4771]: I0220 00:14:24.442007 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-686cc5555f-67pr9_061e4898-95bd-4626-9869-811dd8055f42/neutron-api/0.log" Feb 20 00:14:24 crc kubenswrapper[4771]: I0220 00:14:24.637592 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-686cc5555f-67pr9_061e4898-95bd-4626-9869-811dd8055f42/neutron-httpd/0.log" Feb 20 00:14:24 crc kubenswrapper[4771]: I0220 00:14:24.890309 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-tszv4_8d83e7d4-689a-4333-9566-348cae7948f2/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 20 00:14:25 crc kubenswrapper[4771]: I0220 00:14:25.134495 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-jm7b8_d888a078-5730-47a9-a40c-1cf3bdb948b0/neutron-metadata-openstack-openstack-cell1/0.log" Feb 20 00:14:25 crc kubenswrapper[4771]: I0220 00:14:25.198989 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-t97tm_39a05c5e-7535-49be-888f-60ca7c4ec532/neutron-sriov-openstack-openstack-cell1/0.log" Feb 20 00:14:25 crc kubenswrapper[4771]: I0220 00:14:25.609823 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_af59e577-5041-4237-9405-d2bc65fa79f2/nova-api-api/0.log" Feb 20 00:14:25 crc kubenswrapper[4771]: I0220 00:14:25.655423 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_6d41e848-b21f-4f38-96a2-dc7ae49ffb0b/nova-cell0-conductor-conductor/0.log" Feb 20 00:14:25 crc kubenswrapper[4771]: I0220 00:14:25.657319 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_af59e577-5041-4237-9405-d2bc65fa79f2/nova-api-log/0.log" Feb 20 00:14:25 crc kubenswrapper[4771]: I0220 00:14:25.811636 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-db-purge-29525760-xhvxn_92423829-83ed-4681-8053-35aa047de5dd/nova-manage/0.log" Feb 20 00:14:26 crc kubenswrapper[4771]: I0220 00:14:26.102863 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3056f313-9503-4088-afaa-ea7e28203a49/nova-cell1-conductor-conductor/0.log" Feb 20 00:14:26 crc kubenswrapper[4771]: I0220 00:14:26.189158 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29525760-wg5bq_b7ea1045-2fa9-4fde-b5c8-73704c77d222/nova-manage/0.log" Feb 20 00:14:26 crc kubenswrapper[4771]: I0220 00:14:26.444836 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5be949dd-1063-4695-8a38-9b2fb21f9cb4/nova-cell1-novncproxy-novncproxy/0.log" Feb 20 00:14:26 crc kubenswrapper[4771]: I0220 00:14:26.469519 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellh48jl_078da17d-fe59-470f-9ab1-e265491c5997/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 20 00:14:26 crc kubenswrapper[4771]: I0220 00:14:26.668514 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-zvngw_3272a901-75aa-4fba-bcff-911ee4166918/nova-cell1-openstack-openstack-cell1/0.log" Feb 20 00:14:26 crc kubenswrapper[4771]: I0220 00:14:26.771469 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_71b1081b-1c98-4691-980a-7035c4996dc3/nova-metadata-log/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.100788 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e25fc39c-df45-4537-b973-ec200e5b7a00/nova-scheduler-scheduler/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.203545 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-c98b99c64-kr7ql_30f2a934-f61e-4c3e-93eb-ed824e644eff/init/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.305492 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_71b1081b-1c98-4691-980a-7035c4996dc3/nova-metadata-metadata/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.371450 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-c98b99c64-kr7ql_30f2a934-f61e-4c3e-93eb-ed824e644eff/init/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.486630 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-c98b99c64-kr7ql_30f2a934-f61e-4c3e-93eb-ed824e644eff/octavia-api-provider-agent/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.605329 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jrzlw_8cb35f59-020c-40c3-b6a8-b03124ee0b08/init/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.651178 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-c98b99c64-kr7ql_30f2a934-f61e-4c3e-93eb-ed824e644eff/octavia-api/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.825602 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jrzlw_8cb35f59-020c-40c3-b6a8-b03124ee0b08/init/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.973491 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jrzlw_8cb35f59-020c-40c3-b6a8-b03124ee0b08/octavia-healthmanager/0.log" Feb 20 00:14:27 crc kubenswrapper[4771]: I0220 00:14:27.975401 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-gxv88_589851d7-1ce4-40a4-ad6a-96fb46673693/init/0.log" Feb 20 00:14:28 crc kubenswrapper[4771]: I0220 00:14:28.138101 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-housekeeping-gxv88_589851d7-1ce4-40a4-ad6a-96fb46673693/init/0.log" Feb 20 00:14:28 crc kubenswrapper[4771]: I0220 00:14:28.214815 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-9w77c_e7c1d8f3-838f-4194-8fa9-ca248b474ca7/init/0.log" Feb 20 00:14:28 crc kubenswrapper[4771]: I0220 00:14:28.226897 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-gxv88_589851d7-1ce4-40a4-ad6a-96fb46673693/octavia-housekeeping/0.log" Feb 20 00:14:28 crc kubenswrapper[4771]: I0220 00:14:28.553356 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-9w77c_e7c1d8f3-838f-4194-8fa9-ca248b474ca7/init/0.log" Feb 20 00:14:28 crc kubenswrapper[4771]: I0220 00:14:28.573758 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-pfjcs_0d7deae2-cb15-495d-9146-604c2305fd28/init/0.log" Feb 20 00:14:28 crc kubenswrapper[4771]: I0220 00:14:28.747966 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-9w77c_e7c1d8f3-838f-4194-8fa9-ca248b474ca7/octavia-amphora-httpd/0.log" Feb 20 00:14:28 crc kubenswrapper[4771]: I0220 00:14:28.927382 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-pfjcs_0d7deae2-cb15-495d-9146-604c2305fd28/init/0.log" Feb 20 00:14:28 crc kubenswrapper[4771]: I0220 00:14:28.974501 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-zvjfp_d1f51bd5-34dd-4d08-8427-71f91be4b299/init/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.010008 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-pfjcs_0d7deae2-cb15-495d-9146-604c2305fd28/octavia-rsyslog/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.192601 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-zvjfp_d1f51bd5-34dd-4d08-8427-71f91be4b299/init/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.339929 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-zvjfp_d1f51bd5-34dd-4d08-8427-71f91be4b299/octavia-worker/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.362662 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_36dec848-d1c8-4567-be43-2e6f0e10db93/mysql-bootstrap/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.598997 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_01b94015-d7c8-4ce0-ae61-aa68065a4c85/mysql-bootstrap/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.605504 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_36dec848-d1c8-4567-be43-2e6f0e10db93/galera/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.630494 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_36dec848-d1c8-4567-be43-2e6f0e10db93/mysql-bootstrap/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.794449 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_01b94015-d7c8-4ce0-ae61-aa68065a4c85/mysql-bootstrap/0.log" Feb 20 00:14:29 crc kubenswrapper[4771]: I0220 00:14:29.829886 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_01b94015-d7c8-4ce0-ae61-aa68065a4c85/galera/0.log" Feb 20 00:14:30 crc kubenswrapper[4771]: I0220 00:14:30.580972 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_be5018a6-07d3-4cc0-9215-dd2551ef18b1/openstackclient/0.log" Feb 20 00:14:30 crc kubenswrapper[4771]: I0220 00:14:30.597562 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-q4kgb_d1485977-7e7a-41e4-aedc-c208c6763dd2/openstack-network-exporter/0.log" Feb 20 00:14:30 crc kubenswrapper[4771]: I0220 00:14:30.923749 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-npgwb_763b0b3a-ac9e-4528-8cfc-039575701bb2/ovn-controller/0.log" Feb 20 00:14:30 crc kubenswrapper[4771]: I0220 00:14:30.944857 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xtdt7_7a899f18-4f99-4fd0-95c8-5198aa786938/ovsdb-server-init/0.log" Feb 20 00:14:31 crc kubenswrapper[4771]: I0220 00:14:31.180120 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xtdt7_7a899f18-4f99-4fd0-95c8-5198aa786938/ovsdb-server-init/0.log" Feb 20 00:14:31 crc kubenswrapper[4771]: I0220 00:14:31.187938 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xtdt7_7a899f18-4f99-4fd0-95c8-5198aa786938/ovsdb-server/0.log" Feb 20 00:14:31 crc kubenswrapper[4771]: I0220 00:14:31.224778 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xtdt7_7a899f18-4f99-4fd0-95c8-5198aa786938/ovs-vswitchd/0.log" Feb 20 00:14:31 crc kubenswrapper[4771]: I0220 00:14:31.389690 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c41e2e76-3b47-431c-a559-aaff440321be/openstack-network-exporter/0.log" Feb 20 00:14:31 crc kubenswrapper[4771]: I0220 00:14:31.400813 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c41e2e76-3b47-431c-a559-aaff440321be/ovn-northd/0.log" Feb 20 00:14:31 crc kubenswrapper[4771]: I0220 00:14:31.544090 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-gfcl7_ece28209-ff16-4d53-9e40-7c58ae95bd0e/ovn-openstack-openstack-cell1/0.log" Feb 20 00:14:31 crc kubenswrapper[4771]: I0220 00:14:31.614151 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7ebf8641-9285-4ae4-afc9-0cf3b1bc585e/openstack-network-exporter/0.log" Feb 20 00:14:31 crc kubenswrapper[4771]: I0220 00:14:31.775998 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7ebf8641-9285-4ae4-afc9-0cf3b1bc585e/ovsdbserver-nb/0.log" Feb 20 00:14:32 crc kubenswrapper[4771]: I0220 00:14:32.454895 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_c27e9f00-5617-42ce-ab8b-16002c5ac59b/ovsdbserver-nb/0.log" Feb 20 00:14:32 crc kubenswrapper[4771]: I0220 00:14:32.458857 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_c27e9f00-5617-42ce-ab8b-16002c5ac59b/openstack-network-exporter/0.log" Feb 20 00:14:32 crc kubenswrapper[4771]: I0220 00:14:32.830422 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_e51891da-e934-4e4e-afb0-dd9f7bc4a6ef/openstack-network-exporter/0.log" Feb 20 00:14:32 crc kubenswrapper[4771]: I0220 00:14:32.898432 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_e51891da-e934-4e4e-afb0-dd9f7bc4a6ef/ovsdbserver-nb/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.055320 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_be5a39cb-6ede-4b4e-9b99-239e7bbbf830/openstack-network-exporter/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.079947 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_be5a39cb-6ede-4b4e-9b99-239e7bbbf830/ovsdbserver-sb/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.208530 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a/openstack-network-exporter/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.280239 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_624c8ea5-a94c-4a6d-a0fd-9a05ba8a400a/ovsdbserver-sb/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.401300 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_8a6bd6eb-9513-46d0-be30-a3ca00254bc1/ovsdbserver-sb/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.422408 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_8a6bd6eb-9513-46d0-be30-a3ca00254bc1/openstack-network-exporter/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.664314 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5ff55dfd66-b9dhc_ce1cb23c-e114-4893-8a75-893ea165fd71/placement-log/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.701237 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5ff55dfd66-b9dhc_ce1cb23c-e114-4893-8a75-893ea165fd71/placement-api/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.869512 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-crn2sp_93edb2e0-5438-4b84-80f0-835768c89a64/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 20 00:14:33 crc kubenswrapper[4771]: I0220 00:14:33.928922 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_70d28726-108a-4624-9dfb-8300d74cf9e0/init-config-reloader/0.log" Feb 20 00:14:34 crc kubenswrapper[4771]: I0220 00:14:34.215931 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_70d28726-108a-4624-9dfb-8300d74cf9e0/config-reloader/0.log" Feb 20 00:14:34 crc kubenswrapper[4771]: I0220 00:14:34.246437 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_70d28726-108a-4624-9dfb-8300d74cf9e0/thanos-sidecar/0.log" Feb 20 00:14:34 crc 
kubenswrapper[4771]: I0220 00:14:34.260516 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_70d28726-108a-4624-9dfb-8300d74cf9e0/init-config-reloader/0.log" Feb 20 00:14:34 crc kubenswrapper[4771]: I0220 00:14:34.283876 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_70d28726-108a-4624-9dfb-8300d74cf9e0/prometheus/0.log" Feb 20 00:14:34 crc kubenswrapper[4771]: I0220 00:14:34.453474 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c005771d-7e6d-44f8-93d0-6bf7c1db97d5/setup-container/0.log" Feb 20 00:14:34 crc kubenswrapper[4771]: I0220 00:14:34.658009 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c005771d-7e6d-44f8-93d0-6bf7c1db97d5/setup-container/0.log" Feb 20 00:14:34 crc kubenswrapper[4771]: I0220 00:14:34.689485 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c005771d-7e6d-44f8-93d0-6bf7c1db97d5/rabbitmq/0.log" Feb 20 00:14:34 crc kubenswrapper[4771]: I0220 00:14:34.790519 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3/setup-container/0.log" Feb 20 00:14:35 crc kubenswrapper[4771]: I0220 00:14:35.076161 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3/setup-container/0.log" Feb 20 00:14:35 crc kubenswrapper[4771]: I0220 00:14:35.129254 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-ccdp6_10bed0e5-6103-4b5a-91b3-cf36d3c6b837/reboot-os-openstack-openstack-cell1/0.log" Feb 20 00:14:35 crc kubenswrapper[4771]: I0220 00:14:35.153499 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_58bdc8c9-d5b9-4b8f-bd73-44afe58bf6c3/rabbitmq/0.log" Feb 20 00:14:35 
crc kubenswrapper[4771]: I0220 00:14:35.364055 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-7ppv7_506ed946-fa87-4e12-bfd9-0beb5708a8e8/run-os-openstack-openstack-cell1/0.log" Feb 20 00:14:35 crc kubenswrapper[4771]: I0220 00:14:35.451400 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-glrg2_933abe26-7272-485a-a57b-f52e8c3c81d7/ssh-known-hosts-openstack/0.log" Feb 20 00:14:35 crc kubenswrapper[4771]: I0220 00:14:35.643406 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-df6b66f66-cg8dd_2e92976d-8555-4275-bf37-3d1e2f56aea1/proxy-server/0.log" Feb 20 00:14:35 crc kubenswrapper[4771]: I0220 00:14:35.845144 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-x6j2q_bf923325-9073-4b69-91b2-573bcdfbfdb0/swift-ring-rebalance/0.log" Feb 20 00:14:35 crc kubenswrapper[4771]: I0220 00:14:35.900956 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-df6b66f66-cg8dd_2e92976d-8555-4275-bf37-3d1e2f56aea1/proxy-httpd/0.log" Feb 20 00:14:36 crc kubenswrapper[4771]: I0220 00:14:36.066097 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-6648c_24464846-7ee5-4dd2-8362-39c28ecaef08/telemetry-openstack-openstack-cell1/0.log" Feb 20 00:14:36 crc kubenswrapper[4771]: I0220 00:14:36.190153 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-fbng8_a1aa1c24-5f3f-4177-a132-a34da7c18d33/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 20 00:14:36 crc kubenswrapper[4771]: I0220 00:14:36.271169 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-d9lvg_ae484840-51ce-4b35-a888-b3e1653e847b/validate-network-openstack-openstack-cell1/0.log" Feb 20 00:14:38 crc 
kubenswrapper[4771]: I0220 00:14:38.201698 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ddeaff4d-ab09-4f42-ad29-beb21dd735e2/memcached/0.log" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.149812 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl"] Feb 20 00:15:00 crc kubenswrapper[4771]: E0220 00:15:00.150943 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerName="extract-utilities" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.150960 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerName="extract-utilities" Feb 20 00:15:00 crc kubenswrapper[4771]: E0220 00:15:00.150985 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerName="extract-content" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.150993 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerName="extract-content" Feb 20 00:15:00 crc kubenswrapper[4771]: E0220 00:15:00.151032 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerName="registry-server" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.151042 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerName="registry-server" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.151282 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b450216-da0a-4b1b-b3f8-57eb3bf9a655" containerName="registry-server" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.152214 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.154344 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.155470 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.162284 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl"] Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.264974 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e92a2f5d-0989-4d9c-be37-9639b065ccfa-secret-volume\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.265071 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwfcs\" (UniqueName: \"kubernetes.io/projected/e92a2f5d-0989-4d9c-be37-9639b065ccfa-kube-api-access-rwfcs\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.265121 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e92a2f5d-0989-4d9c-be37-9639b065ccfa-config-volume\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.367844 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwfcs\" (UniqueName: \"kubernetes.io/projected/e92a2f5d-0989-4d9c-be37-9639b065ccfa-kube-api-access-rwfcs\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.367948 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e92a2f5d-0989-4d9c-be37-9639b065ccfa-config-volume\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.368291 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e92a2f5d-0989-4d9c-be37-9639b065ccfa-secret-volume\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.369204 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e92a2f5d-0989-4d9c-be37-9639b065ccfa-config-volume\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.374634 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e92a2f5d-0989-4d9c-be37-9639b065ccfa-secret-volume\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.384140 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwfcs\" (UniqueName: \"kubernetes.io/projected/e92a2f5d-0989-4d9c-be37-9639b065ccfa-kube-api-access-rwfcs\") pod \"collect-profiles-29525775-lc2sl\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.475204 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:00 crc kubenswrapper[4771]: I0220 00:15:00.959368 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl"] Feb 20 00:15:02 crc kubenswrapper[4771]: I0220 00:15:02.058058 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" event={"ID":"e92a2f5d-0989-4d9c-be37-9639b065ccfa","Type":"ContainerStarted","Data":"2e9454c4a28b67d82188f05c5094fb9c273650bfda78652c7b1e715d061dd43b"} Feb 20 00:15:03 crc kubenswrapper[4771]: I0220 00:15:03.068093 4771 generic.go:334] "Generic (PLEG): container finished" podID="e92a2f5d-0989-4d9c-be37-9639b065ccfa" containerID="dfc5b0ebb803798e48decbaa9e0f6b1fbb35032925d47c5a4771ff967021548e" exitCode=0 Feb 20 00:15:03 crc kubenswrapper[4771]: I0220 00:15:03.068171 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" 
event={"ID":"e92a2f5d-0989-4d9c-be37-9639b065ccfa","Type":"ContainerDied","Data":"dfc5b0ebb803798e48decbaa9e0f6b1fbb35032925d47c5a4771ff967021548e"} Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.510435 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.582571 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e92a2f5d-0989-4d9c-be37-9639b065ccfa-config-volume\") pod \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.582714 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwfcs\" (UniqueName: \"kubernetes.io/projected/e92a2f5d-0989-4d9c-be37-9639b065ccfa-kube-api-access-rwfcs\") pod \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.582878 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e92a2f5d-0989-4d9c-be37-9639b065ccfa-secret-volume\") pod \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\" (UID: \"e92a2f5d-0989-4d9c-be37-9639b065ccfa\") " Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.583330 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e92a2f5d-0989-4d9c-be37-9639b065ccfa-config-volume" (OuterVolumeSpecName: "config-volume") pod "e92a2f5d-0989-4d9c-be37-9639b065ccfa" (UID: "e92a2f5d-0989-4d9c-be37-9639b065ccfa"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.584497 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e92a2f5d-0989-4d9c-be37-9639b065ccfa-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.589966 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92a2f5d-0989-4d9c-be37-9639b065ccfa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e92a2f5d-0989-4d9c-be37-9639b065ccfa" (UID: "e92a2f5d-0989-4d9c-be37-9639b065ccfa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.600865 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92a2f5d-0989-4d9c-be37-9639b065ccfa-kube-api-access-rwfcs" (OuterVolumeSpecName: "kube-api-access-rwfcs") pod "e92a2f5d-0989-4d9c-be37-9639b065ccfa" (UID: "e92a2f5d-0989-4d9c-be37-9639b065ccfa"). InnerVolumeSpecName "kube-api-access-rwfcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.686779 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e92a2f5d-0989-4d9c-be37-9639b065ccfa-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 00:15:04 crc kubenswrapper[4771]: I0220 00:15:04.686811 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwfcs\" (UniqueName: \"kubernetes.io/projected/e92a2f5d-0989-4d9c-be37-9639b065ccfa-kube-api-access-rwfcs\") on node \"crc\" DevicePath \"\"" Feb 20 00:15:05 crc kubenswrapper[4771]: I0220 00:15:05.088597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" event={"ID":"e92a2f5d-0989-4d9c-be37-9639b065ccfa","Type":"ContainerDied","Data":"2e9454c4a28b67d82188f05c5094fb9c273650bfda78652c7b1e715d061dd43b"} Feb 20 00:15:05 crc kubenswrapper[4771]: I0220 00:15:05.088902 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e9454c4a28b67d82188f05c5094fb9c273650bfda78652c7b1e715d061dd43b" Feb 20 00:15:05 crc kubenswrapper[4771]: I0220 00:15:05.088673 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525775-lc2sl" Feb 20 00:15:05 crc kubenswrapper[4771]: I0220 00:15:05.592415 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"] Feb 20 00:15:05 crc kubenswrapper[4771]: I0220 00:15:05.604097 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-wcxlz"] Feb 20 00:15:06 crc kubenswrapper[4771]: I0220 00:15:06.451716 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7607e6-697e-42e5-999f-c3d8dd3e0171" path="/var/lib/kubelet/pods/dc7607e6-697e-42e5-999f-c3d8dd3e0171/volumes" Feb 20 00:15:08 crc kubenswrapper[4771]: I0220 00:15:08.913832 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt_f9a3474a-871f-4fac-9776-cf660e3f2824/util/0.log" Feb 20 00:15:09 crc kubenswrapper[4771]: I0220 00:15:09.152473 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt_f9a3474a-871f-4fac-9776-cf660e3f2824/util/0.log" Feb 20 00:15:09 crc kubenswrapper[4771]: I0220 00:15:09.167373 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt_f9a3474a-871f-4fac-9776-cf660e3f2824/pull/0.log" Feb 20 00:15:09 crc kubenswrapper[4771]: I0220 00:15:09.185950 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt_f9a3474a-871f-4fac-9776-cf660e3f2824/pull/0.log" Feb 20 00:15:09 crc kubenswrapper[4771]: I0220 00:15:09.370049 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt_f9a3474a-871f-4fac-9776-cf660e3f2824/util/0.log" Feb 20 00:15:09 crc kubenswrapper[4771]: I0220 00:15:09.375129 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt_f9a3474a-871f-4fac-9776-cf660e3f2824/pull/0.log" Feb 20 00:15:09 crc kubenswrapper[4771]: I0220 00:15:09.413624 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967kkpqt_f9a3474a-871f-4fac-9776-cf660e3f2824/extract/0.log" Feb 20 00:15:09 crc kubenswrapper[4771]: I0220 00:15:09.780616 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-hsg94_93b55752-3270-48d6-a5b2-6fbf12729651/manager/0.log" Feb 20 00:15:10 crc kubenswrapper[4771]: I0220 00:15:10.207518 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-2qr26_7bcef2a5-3cd6-4add-bf80-92e6eb274058/manager/0.log" Feb 20 00:15:11 crc kubenswrapper[4771]: I0220 00:15:11.099399 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-vdnzp_a7392724-153e-4b25-a984-bff0f841aac6/manager/0.log" Feb 20 00:15:11 crc kubenswrapper[4771]: I0220 00:15:11.148444 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-gvxfm_1a3c4ae6-54be-46a2-93e6-db74ebf892e3/manager/0.log" Feb 20 00:15:11 crc kubenswrapper[4771]: I0220 00:15:11.346963 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-jvbwn_1f5bbacb-8d9b-4289-938a-05e191d519f9/manager/0.log" Feb 20 00:15:11 crc kubenswrapper[4771]: I0220 00:15:11.913854 
4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-g92n5_fc51f462-c6cd-4dfd-a410-a3d7ec7c88dd/manager/0.log" Feb 20 00:15:12 crc kubenswrapper[4771]: I0220 00:15:12.325614 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-mnjh4_2aac2409-afd9-44d5-a205-36d3f85d6ee1/manager/0.log" Feb 20 00:15:12 crc kubenswrapper[4771]: I0220 00:15:12.617478 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-l6wgq_3bfa784f-72fd-4797-b5ec-ea4eaeaefffa/manager/0.log" Feb 20 00:15:12 crc kubenswrapper[4771]: I0220 00:15:12.935346 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-l4w4n_3242e27e-7bae-47f9-a6e4-48d21eead119/manager/0.log" Feb 20 00:15:12 crc kubenswrapper[4771]: I0220 00:15:12.997299 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-krkgh_ab86456c-c478-4751-946d-064eff52fe66/manager/0.log" Feb 20 00:15:13 crc kubenswrapper[4771]: I0220 00:15:13.200874 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-6mdqb_47768578-399d-48b5-bffa-a7942d4c2dcf/manager/0.log" Feb 20 00:15:13 crc kubenswrapper[4771]: I0220 00:15:13.234517 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-6jvq7_747c00a6-dc75-476f-bdcd-b24d58b2fbe8/manager/0.log" Feb 20 00:15:13 crc kubenswrapper[4771]: I0220 00:15:13.364645 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-qfc7w_4fe3eefb-4705-42c0-8107-cf25db83f1d3/manager/0.log" Feb 20 00:15:13 crc kubenswrapper[4771]: I0220 
00:15:13.467060 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-mplqb_bfcbf7c3-8784-4275-9838-7f6b9666e49a/manager/0.log" Feb 20 00:15:13 crc kubenswrapper[4771]: I0220 00:15:13.774334 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-hjfl2_ab9851f8-bb6b-4d51-aed3-b67fd5a3044c/operator/0.log" Feb 20 00:15:14 crc kubenswrapper[4771]: I0220 00:15:14.162161 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qw4qh_d4cb01c1-d9ee-4a5a-b125-bf428e3ced10/registry-server/0.log" Feb 20 00:15:14 crc kubenswrapper[4771]: I0220 00:15:14.177503 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-26qgm_d3777627-ddbf-47da-a0f4-33d28a04d0b7/manager/0.log" Feb 20 00:15:14 crc kubenswrapper[4771]: I0220 00:15:14.446980 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-9rjlg_e9edc363-0a5c-46f0-9473-ba2bddd46ad5/manager/0.log" Feb 20 00:15:14 crc kubenswrapper[4771]: I0220 00:15:14.551180 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-tksbc_f1f1ab0f-321a-40a4-bf22-68aa1246b00b/operator/0.log" Feb 20 00:15:14 crc kubenswrapper[4771]: I0220 00:15:14.683777 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-4s8tm_9ec5ee30-d7b1-472c-91e6-6dfa6c2fd4a9/manager/0.log" Feb 20 00:15:15 crc kubenswrapper[4771]: I0220 00:15:15.014881 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-62hbk_1f38fe26-3d54-4e26-bd41-5bf84e7e98fb/manager/1.log" Feb 20 00:15:15 crc kubenswrapper[4771]: 
I0220 00:15:15.102543 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-62hbk_1f38fe26-3d54-4e26-bd41-5bf84e7e98fb/manager/0.log" Feb 20 00:15:15 crc kubenswrapper[4771]: I0220 00:15:15.216894 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-dlg9h_956aea9e-2072-4556-a787-8da2a7286af1/manager/0.log" Feb 20 00:15:15 crc kubenswrapper[4771]: I0220 00:15:15.290523 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-vvdzm_1b9a2130-fab8-409d-a908-338f07fcd307/manager/0.log" Feb 20 00:15:17 crc kubenswrapper[4771]: I0220 00:15:17.275836 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-t66d7_4425ef75-1f25-4996-872b-1d4e877c0e10/manager/0.log" Feb 20 00:15:17 crc kubenswrapper[4771]: I0220 00:15:17.633678 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-2fvpw_f9700277-9c15-42b6-9ab6-820fb9c6fc91/manager/0.log" Feb 20 00:15:38 crc kubenswrapper[4771]: I0220 00:15:38.969828 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gmjpk_69705410-9808-46bd-8e5c-64a46eedf641/control-plane-machine-set-operator/0.log" Feb 20 00:15:38 crc kubenswrapper[4771]: I0220 00:15:38.992686 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qpfhv_7c9f264f-27e9-419d-8163-3d16ce6c5eda/kube-rbac-proxy/0.log" Feb 20 00:15:39 crc kubenswrapper[4771]: I0220 00:15:39.145839 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qpfhv_7c9f264f-27e9-419d-8163-3d16ce6c5eda/machine-api-operator/0.log" 
Feb 20 00:15:47 crc kubenswrapper[4771]: I0220 00:15:47.948179 4771 scope.go:117] "RemoveContainer" containerID="19f7de9e816c0656b12fd08e3b9409252ba9f7195934450c2213f0dd8ac72420" Feb 20 00:15:54 crc kubenswrapper[4771]: I0220 00:15:54.584211 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-w5psz_5a33d99d-a153-489c-925b-1fd6a1d0cdf7/cert-manager-controller/0.log" Feb 20 00:15:54 crc kubenswrapper[4771]: I0220 00:15:54.726114 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-n4z5q_483a30ad-c9d6-4198-8cda-f4afb3459f0f/cert-manager-cainjector/0.log" Feb 20 00:15:54 crc kubenswrapper[4771]: I0220 00:15:54.781119 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-s6929_2de006aa-ec1d-4278-b1c6-d0cac9eebe17/cert-manager-webhook/0.log" Feb 20 00:16:10 crc kubenswrapper[4771]: I0220 00:16:10.526290 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-6hqlc_63c374a8-819b-4a41-8836-1a23409d0a12/nmstate-console-plugin/0.log" Feb 20 00:16:10 crc kubenswrapper[4771]: I0220 00:16:10.730905 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hcxck_e61d563d-a115-4346-b98a-ffa4d0ec4393/nmstate-handler/0.log" Feb 20 00:16:10 crc kubenswrapper[4771]: I0220 00:16:10.836148 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-f5vsb_7fe6cc7b-ec96-4fcd-8685-0583ce9d6538/nmstate-metrics/0.log" Feb 20 00:16:10 crc kubenswrapper[4771]: I0220 00:16:10.837269 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-f5vsb_7fe6cc7b-ec96-4fcd-8685-0583ce9d6538/kube-rbac-proxy/0.log" Feb 20 00:16:10 crc kubenswrapper[4771]: I0220 00:16:10.935291 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-4hbnh_0e3f0120-0714-42a2-b7ff-d4c25acd93c4/nmstate-operator/0.log" Feb 20 00:16:11 crc kubenswrapper[4771]: I0220 00:16:11.033817 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-zgzhx_676b32b6-a0a8-4f1a-9c8f-242842320451/nmstate-webhook/0.log" Feb 20 00:16:12 crc kubenswrapper[4771]: I0220 00:16:12.956880 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:16:12 crc kubenswrapper[4771]: I0220 00:16:12.957187 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:16:27 crc kubenswrapper[4771]: I0220 00:16:27.026868 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-jfg88_bc803aa4-495b-490a-9ef3-90862fc268eb/prometheus-operator/0.log" Feb 20 00:16:27 crc kubenswrapper[4771]: I0220 00:16:27.218221 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-666489549b-55zbx_7d2f571a-67fe-4339-9648-ed9924d18c22/prometheus-operator-admission-webhook/0.log" Feb 20 00:16:27 crc kubenswrapper[4771]: I0220 00:16:27.261367 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-666489549b-jr98w_1d026131-d93f-47a1-b7c2-b99751fd63c8/prometheus-operator-admission-webhook/0.log" Feb 20 00:16:27 crc kubenswrapper[4771]: I0220 
00:16:27.390476 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-k884s_08c1eeab-24f7-499e-a217-df70c76dbd50/operator/0.log" Feb 20 00:16:27 crc kubenswrapper[4771]: I0220 00:16:27.458861 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-bb52q_629c66a3-218a-43ff-884b-f6490aba1936/perses-operator/0.log" Feb 20 00:16:41 crc kubenswrapper[4771]: I0220 00:16:41.535702 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-sfghb_e7b28acc-9520-4a82-a044-a165756be9e8/kube-rbac-proxy/0.log" Feb 20 00:16:41 crc kubenswrapper[4771]: I0220 00:16:41.800922 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-frr-files/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.026341 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-frr-files/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.030560 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-reloader/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.067745 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-metrics/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.143547 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-sfghb_e7b28acc-9520-4a82-a044-a165756be9e8/controller/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.191925 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-reloader/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 
00:16:42.433684 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-frr-files/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.450519 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-metrics/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.474736 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-reloader/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.476887 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-metrics/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.706916 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-frr-files/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.707533 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-reloader/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.721459 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/cp-metrics/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.728597 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/controller/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.887579 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/kube-rbac-proxy/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.894238 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/frr-metrics/0.log" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.956688 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.956749 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:16:42 crc kubenswrapper[4771]: I0220 00:16:42.979975 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/kube-rbac-proxy-frr/0.log" Feb 20 00:16:43 crc kubenswrapper[4771]: I0220 00:16:43.137956 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/reloader/0.log" Feb 20 00:16:43 crc kubenswrapper[4771]: I0220 00:16:43.235494 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-lhn49_c2966750-1a68-42a1-9da5-30c5d5955550/frr-k8s-webhook-server/0.log" Feb 20 00:16:43 crc kubenswrapper[4771]: I0220 00:16:43.451853 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7595995fb8-mwpt6_0305f851-5f0b-4dd2-910b-e331dd45e4b7/manager/0.log" Feb 20 00:16:43 crc kubenswrapper[4771]: I0220 00:16:43.573926 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64f99c9dc8-x7lhj_e16c23ec-4799-4e15-a12e-995a8076b5f7/webhook-server/0.log" Feb 20 00:16:44 crc kubenswrapper[4771]: I0220 00:16:44.010157 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b55hf_083f4636-9855-4b5c-ad7c-685d534b508c/kube-rbac-proxy/0.log" Feb 20 00:16:45 crc kubenswrapper[4771]: I0220 00:16:45.056309 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b55hf_083f4636-9855-4b5c-ad7c-685d534b508c/speaker/0.log" Feb 20 00:16:46 crc kubenswrapper[4771]: I0220 00:16:46.463547 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7wp2l_ff7ca988-1ed4-468a-a531-52c2b45e144c/frr/0.log" Feb 20 00:17:00 crc kubenswrapper[4771]: I0220 00:17:00.092237 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs_7990fa1c-49c7-43c8-b638-ba2f44a5d0ea/util/0.log" Feb 20 00:17:00 crc kubenswrapper[4771]: I0220 00:17:00.299439 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs_7990fa1c-49c7-43c8-b638-ba2f44a5d0ea/util/0.log" Feb 20 00:17:00 crc kubenswrapper[4771]: I0220 00:17:00.377674 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs_7990fa1c-49c7-43c8-b638-ba2f44a5d0ea/pull/0.log" Feb 20 00:17:00 crc kubenswrapper[4771]: I0220 00:17:00.492019 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs_7990fa1c-49c7-43c8-b638-ba2f44a5d0ea/pull/0.log" Feb 20 00:17:00 crc kubenswrapper[4771]: I0220 00:17:00.697546 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs_7990fa1c-49c7-43c8-b638-ba2f44a5d0ea/util/0.log" Feb 20 00:17:00 crc kubenswrapper[4771]: I0220 00:17:00.711341 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs_7990fa1c-49c7-43c8-b638-ba2f44a5d0ea/pull/0.log" Feb 20 00:17:00 crc kubenswrapper[4771]: I0220 00:17:00.738374 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nh5vs_7990fa1c-49c7-43c8-b638-ba2f44a5d0ea/extract/0.log" Feb 20 00:17:00 crc kubenswrapper[4771]: I0220 00:17:00.932513 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp_9fed3ba9-7458-44d3-a87a-9d9a5957c680/util/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.092658 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp_9fed3ba9-7458-44d3-a87a-9d9a5957c680/util/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.112232 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp_9fed3ba9-7458-44d3-a87a-9d9a5957c680/pull/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.151529 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp_9fed3ba9-7458-44d3-a87a-9d9a5957c680/pull/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.407139 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp_9fed3ba9-7458-44d3-a87a-9d9a5957c680/extract/0.log" Feb 
20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.423719 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp_9fed3ba9-7458-44d3-a87a-9d9a5957c680/util/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.469093 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08f8gcp_9fed3ba9-7458-44d3-a87a-9d9a5957c680/pull/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.573637 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r_a609387a-0d3f-4374-90c3-23b37506661c/util/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.759242 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r_a609387a-0d3f-4374-90c3-23b37506661c/pull/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.807217 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r_a609387a-0d3f-4374-90c3-23b37506661c/util/0.log" Feb 20 00:17:01 crc kubenswrapper[4771]: I0220 00:17:01.839125 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r_a609387a-0d3f-4374-90c3-23b37506661c/pull/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.186254 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r_a609387a-0d3f-4374-90c3-23b37506661c/pull/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.232466 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r_a609387a-0d3f-4374-90c3-23b37506661c/extract/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.315879 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rz22r_a609387a-0d3f-4374-90c3-23b37506661c/util/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.516905 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vfcqq_5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd/extract-utilities/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.679630 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vfcqq_5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd/extract-utilities/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.687388 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vfcqq_5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd/extract-content/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.699145 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vfcqq_5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd/extract-content/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.848003 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vfcqq_5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd/extract-content/0.log" Feb 20 00:17:02 crc kubenswrapper[4771]: I0220 00:17:02.855488 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vfcqq_5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd/extract-utilities/0.log" Feb 20 00:17:03 crc kubenswrapper[4771]: I0220 00:17:03.075607 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rvrdf_628cca0e-2b72-4434-8fb2-03474a586f19/extract-utilities/0.log" Feb 20 00:17:03 crc kubenswrapper[4771]: I0220 00:17:03.262986 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvrdf_628cca0e-2b72-4434-8fb2-03474a586f19/extract-content/0.log" Feb 20 00:17:03 crc kubenswrapper[4771]: I0220 00:17:03.397178 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvrdf_628cca0e-2b72-4434-8fb2-03474a586f19/extract-utilities/0.log" Feb 20 00:17:03 crc kubenswrapper[4771]: I0220 00:17:03.402672 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvrdf_628cca0e-2b72-4434-8fb2-03474a586f19/extract-content/0.log" Feb 20 00:17:03 crc kubenswrapper[4771]: I0220 00:17:03.542889 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvrdf_628cca0e-2b72-4434-8fb2-03474a586f19/extract-utilities/0.log" Feb 20 00:17:03 crc kubenswrapper[4771]: I0220 00:17:03.685955 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvrdf_628cca0e-2b72-4434-8fb2-03474a586f19/extract-content/0.log" Feb 20 00:17:03 crc kubenswrapper[4771]: I0220 00:17:03.900829 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td_113dbb1c-c7c0-45ce-99d8-86156cbce45c/util/0.log" Feb 20 00:17:04 crc kubenswrapper[4771]: I0220 00:17:04.369882 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vfcqq_5158244b-cdfd-48fa-bbb6-ba1f5dbaf6bd/registry-server/0.log" Feb 20 00:17:04 crc kubenswrapper[4771]: I0220 00:17:04.649298 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td_113dbb1c-c7c0-45ce-99d8-86156cbce45c/util/0.log" Feb 20 00:17:04 crc kubenswrapper[4771]: I0220 00:17:04.675432 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td_113dbb1c-c7c0-45ce-99d8-86156cbce45c/pull/0.log" Feb 20 00:17:04 crc kubenswrapper[4771]: I0220 00:17:04.745648 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td_113dbb1c-c7c0-45ce-99d8-86156cbce45c/pull/0.log" Feb 20 00:17:04 crc kubenswrapper[4771]: I0220 00:17:04.845032 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td_113dbb1c-c7c0-45ce-99d8-86156cbce45c/util/0.log" Feb 20 00:17:04 crc kubenswrapper[4771]: I0220 00:17:04.892960 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td_113dbb1c-c7c0-45ce-99d8-86156cbce45c/pull/0.log" Feb 20 00:17:04 crc kubenswrapper[4771]: I0220 00:17:04.896464 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca666td_113dbb1c-c7c0-45ce-99d8-86156cbce45c/extract/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.157582 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z56lm_c45470de-22dd-4455-8fd6-dbaf60fddece/extract-utilities/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.170137 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r4pg9_426c7c73-e3a3-4430-9610-eb935a493fa8/marketplace-operator/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: 
I0220 00:17:05.265878 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvrdf_628cca0e-2b72-4434-8fb2-03474a586f19/registry-server/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.362194 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z56lm_c45470de-22dd-4455-8fd6-dbaf60fddece/extract-content/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.364738 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z56lm_c45470de-22dd-4455-8fd6-dbaf60fddece/extract-content/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.382642 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z56lm_c45470de-22dd-4455-8fd6-dbaf60fddece/extract-utilities/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.589810 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z56lm_c45470de-22dd-4455-8fd6-dbaf60fddece/extract-content/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.625349 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z56lm_c45470de-22dd-4455-8fd6-dbaf60fddece/extract-utilities/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.670884 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vltjv_807daddd-f92f-47b4-8d12-fdbeab38b6f1/extract-utilities/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.863393 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vltjv_807daddd-f92f-47b4-8d12-fdbeab38b6f1/extract-utilities/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.865357 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-vltjv_807daddd-f92f-47b4-8d12-fdbeab38b6f1/extract-content/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.909842 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vltjv_807daddd-f92f-47b4-8d12-fdbeab38b6f1/extract-content/0.log" Feb 20 00:17:05 crc kubenswrapper[4771]: I0220 00:17:05.961364 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z56lm_c45470de-22dd-4455-8fd6-dbaf60fddece/registry-server/0.log" Feb 20 00:17:06 crc kubenswrapper[4771]: I0220 00:17:06.847392 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vltjv_807daddd-f92f-47b4-8d12-fdbeab38b6f1/extract-content/0.log" Feb 20 00:17:06 crc kubenswrapper[4771]: I0220 00:17:06.888975 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vltjv_807daddd-f92f-47b4-8d12-fdbeab38b6f1/extract-utilities/0.log" Feb 20 00:17:07 crc kubenswrapper[4771]: I0220 00:17:07.936996 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vltjv_807daddd-f92f-47b4-8d12-fdbeab38b6f1/registry-server/0.log" Feb 20 00:17:12 crc kubenswrapper[4771]: I0220 00:17:12.958593 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:17:12 crc kubenswrapper[4771]: I0220 00:17:12.959126 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 00:17:12 crc kubenswrapper[4771]: I0220 00:17:12.959211 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 20 00:17:12 crc kubenswrapper[4771]: I0220 00:17:12.960495 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2411b3e172f053d34174b248400f1a0b44c39c7fcfb46f6cb68fc651f7234056"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 00:17:12 crc kubenswrapper[4771]: I0220 00:17:12.960589 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://2411b3e172f053d34174b248400f1a0b44c39c7fcfb46f6cb68fc651f7234056" gracePeriod=600 Feb 20 00:17:13 crc kubenswrapper[4771]: I0220 00:17:13.573788 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="2411b3e172f053d34174b248400f1a0b44c39c7fcfb46f6cb68fc651f7234056" exitCode=0 Feb 20 00:17:13 crc kubenswrapper[4771]: I0220 00:17:13.573859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"2411b3e172f053d34174b248400f1a0b44c39c7fcfb46f6cb68fc651f7234056"} Feb 20 00:17:13 crc kubenswrapper[4771]: I0220 00:17:13.574143 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" 
event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerStarted","Data":"350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"} Feb 20 00:17:13 crc kubenswrapper[4771]: I0220 00:17:13.574173 4771 scope.go:117] "RemoveContainer" containerID="0635a39b16226f5fba180dc0615baadda664ba24be9ee066b1695d525d509862" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.481912 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rv4qr"] Feb 20 00:17:19 crc kubenswrapper[4771]: E0220 00:17:19.483142 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e92a2f5d-0989-4d9c-be37-9639b065ccfa" containerName="collect-profiles" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.483159 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92a2f5d-0989-4d9c-be37-9639b065ccfa" containerName="collect-profiles" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.483448 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e92a2f5d-0989-4d9c-be37-9639b065ccfa" containerName="collect-profiles" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.485405 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.505674 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rv4qr"] Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.519271 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-catalog-content\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.519337 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6md\" (UniqueName: \"kubernetes.io/projected/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-kube-api-access-mt6md\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.519878 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-utilities\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.622517 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-catalog-content\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.622576 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mt6md\" (UniqueName: \"kubernetes.io/projected/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-kube-api-access-mt6md\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.622801 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-utilities\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.623459 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-catalog-content\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.623751 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-utilities\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.649097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6md\" (UniqueName: \"kubernetes.io/projected/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-kube-api-access-mt6md\") pod \"redhat-marketplace-rv4qr\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:19 crc kubenswrapper[4771]: I0220 00:17:19.817982 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:20 crc kubenswrapper[4771]: I0220 00:17:20.375341 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rv4qr"] Feb 20 00:17:20 crc kubenswrapper[4771]: I0220 00:17:20.657807 4771 generic.go:334] "Generic (PLEG): container finished" podID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerID="9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b" exitCode=0 Feb 20 00:17:20 crc kubenswrapper[4771]: I0220 00:17:20.657848 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rv4qr" event={"ID":"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d","Type":"ContainerDied","Data":"9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b"} Feb 20 00:17:20 crc kubenswrapper[4771]: I0220 00:17:20.657872 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rv4qr" event={"ID":"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d","Type":"ContainerStarted","Data":"e5a4a80454b1f8a7eb593674090eaf6cb933b2ccbcdfd897f8337029065dd5ec"} Feb 20 00:17:21 crc kubenswrapper[4771]: I0220 00:17:21.672999 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rv4qr" event={"ID":"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d","Type":"ContainerStarted","Data":"b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0"} Feb 20 00:17:22 crc kubenswrapper[4771]: I0220 00:17:22.693939 4771 generic.go:334] "Generic (PLEG): container finished" podID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerID="b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0" exitCode=0 Feb 20 00:17:22 crc kubenswrapper[4771]: I0220 00:17:22.694144 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rv4qr" 
event={"ID":"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d","Type":"ContainerDied","Data":"b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0"} Feb 20 00:17:23 crc kubenswrapper[4771]: I0220 00:17:23.018088 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-666489549b-55zbx_7d2f571a-67fe-4339-9648-ed9924d18c22/prometheus-operator-admission-webhook/0.log" Feb 20 00:17:23 crc kubenswrapper[4771]: I0220 00:17:23.029656 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-666489549b-jr98w_1d026131-d93f-47a1-b7c2-b99751fd63c8/prometheus-operator-admission-webhook/0.log" Feb 20 00:17:23 crc kubenswrapper[4771]: I0220 00:17:23.032346 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-jfg88_bc803aa4-495b-490a-9ef3-90862fc268eb/prometheus-operator/0.log" Feb 20 00:17:23 crc kubenswrapper[4771]: I0220 00:17:23.212257 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-bb52q_629c66a3-218a-43ff-884b-f6490aba1936/perses-operator/0.log" Feb 20 00:17:23 crc kubenswrapper[4771]: I0220 00:17:23.227313 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-k884s_08c1eeab-24f7-499e-a217-df70c76dbd50/operator/0.log" Feb 20 00:17:23 crc kubenswrapper[4771]: I0220 00:17:23.713508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rv4qr" event={"ID":"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d","Type":"ContainerStarted","Data":"5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc"} Feb 20 00:17:23 crc kubenswrapper[4771]: I0220 00:17:23.748392 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rv4qr" podStartSLOduration=2.310859076 
podStartE2EDuration="4.748280241s" podCreationTimestamp="2026-02-20 00:17:19 +0000 UTC" firstStartedPulling="2026-02-20 00:17:20.659775341 +0000 UTC m=+10140.931217811" lastFinishedPulling="2026-02-20 00:17:23.097196496 +0000 UTC m=+10143.368638976" observedRunningTime="2026-02-20 00:17:23.733181264 +0000 UTC m=+10144.004623764" watchObservedRunningTime="2026-02-20 00:17:23.748280241 +0000 UTC m=+10144.019722731" Feb 20 00:17:29 crc kubenswrapper[4771]: I0220 00:17:29.818636 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:29 crc kubenswrapper[4771]: I0220 00:17:29.819035 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:29 crc kubenswrapper[4771]: I0220 00:17:29.873867 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:30 crc kubenswrapper[4771]: I0220 00:17:30.853754 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:30 crc kubenswrapper[4771]: I0220 00:17:30.916186 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rv4qr"] Feb 20 00:17:32 crc kubenswrapper[4771]: I0220 00:17:32.814185 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rv4qr" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerName="registry-server" containerID="cri-o://5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc" gracePeriod=2 Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.357720 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.441869 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-utilities\") pod \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.442298 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-catalog-content\") pod \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.442534 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-utilities" (OuterVolumeSpecName: "utilities") pod "d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" (UID: "d192a5e6-ee92-4699-a00b-2ebaba0d2c9d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.444428 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt6md\" (UniqueName: \"kubernetes.io/projected/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-kube-api-access-mt6md\") pod \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\" (UID: \"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d\") " Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.445612 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.449352 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-kube-api-access-mt6md" (OuterVolumeSpecName: "kube-api-access-mt6md") pod "d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" (UID: "d192a5e6-ee92-4699-a00b-2ebaba0d2c9d"). InnerVolumeSpecName "kube-api-access-mt6md". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.496746 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" (UID: "d192a5e6-ee92-4699-a00b-2ebaba0d2c9d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.547478 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.547526 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt6md\" (UniqueName: \"kubernetes.io/projected/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d-kube-api-access-mt6md\") on node \"crc\" DevicePath \"\"" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.827126 4771 generic.go:334] "Generic (PLEG): container finished" podID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerID="5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc" exitCode=0 Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.827164 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rv4qr" event={"ID":"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d","Type":"ContainerDied","Data":"5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc"} Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.827189 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rv4qr" event={"ID":"d192a5e6-ee92-4699-a00b-2ebaba0d2c9d","Type":"ContainerDied","Data":"e5a4a80454b1f8a7eb593674090eaf6cb933b2ccbcdfd897f8337029065dd5ec"} Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.827207 4771 scope.go:117] "RemoveContainer" containerID="5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.827545 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rv4qr" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.860165 4771 scope.go:117] "RemoveContainer" containerID="b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.885285 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rv4qr"] Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.900204 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rv4qr"] Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.906339 4771 scope.go:117] "RemoveContainer" containerID="9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.987660 4771 scope.go:117] "RemoveContainer" containerID="5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc" Feb 20 00:17:33 crc kubenswrapper[4771]: E0220 00:17:33.988108 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc\": container with ID starting with 5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc not found: ID does not exist" containerID="5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.988157 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc"} err="failed to get container status \"5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc\": rpc error: code = NotFound desc = could not find container \"5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc\": container with ID starting with 5a8e7965ca82eedbaef276eaa47f1995bfabbfdbf9aaae39b77ea296e7fc9ddc not found: 
ID does not exist" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.988189 4771 scope.go:117] "RemoveContainer" containerID="b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0" Feb 20 00:17:33 crc kubenswrapper[4771]: E0220 00:17:33.988487 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0\": container with ID starting with b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0 not found: ID does not exist" containerID="b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.988512 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0"} err="failed to get container status \"b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0\": rpc error: code = NotFound desc = could not find container \"b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0\": container with ID starting with b2506652883e751fb764816274565260e05b1dfdca28a793aac7dd715159b6d0 not found: ID does not exist" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.988537 4771 scope.go:117] "RemoveContainer" containerID="9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b" Feb 20 00:17:33 crc kubenswrapper[4771]: E0220 00:17:33.988760 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b\": container with ID starting with 9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b not found: ID does not exist" containerID="9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b" Feb 20 00:17:33 crc kubenswrapper[4771]: I0220 00:17:33.988788 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b"} err="failed to get container status \"9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b\": rpc error: code = NotFound desc = could not find container \"9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b\": container with ID starting with 9db69dad6c3e94bd03fb26f7e2b86b26a838be516dd2c2e12b758e4b88e0c32b not found: ID does not exist" Feb 20 00:17:34 crc kubenswrapper[4771]: I0220 00:17:34.468009 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" path="/var/lib/kubelet/pods/d192a5e6-ee92-4699-a00b-2ebaba0d2c9d/volumes" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.290102 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lcrlq"] Feb 20 00:18:51 crc kubenswrapper[4771]: E0220 00:18:51.291475 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerName="registry-server" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.291498 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerName="registry-server" Feb 20 00:18:51 crc kubenswrapper[4771]: E0220 00:18:51.291548 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerName="extract-utilities" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.291561 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerName="extract-utilities" Feb 20 00:18:51 crc kubenswrapper[4771]: E0220 00:18:51.291585 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerName="extract-content" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.291598 4771 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerName="extract-content" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.292209 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d192a5e6-ee92-4699-a00b-2ebaba0d2c9d" containerName="registry-server" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.297006 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.307310 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcrlq"] Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.324936 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-catalog-content\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.325054 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-utilities\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.325171 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtdhk\" (UniqueName: \"kubernetes.io/projected/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-kube-api-access-rtdhk\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.427752 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-utilities\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.428109 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtdhk\" (UniqueName: \"kubernetes.io/projected/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-kube-api-access-rtdhk\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.428317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-catalog-content\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.428434 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-utilities\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.428695 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-catalog-content\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.456789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rtdhk\" (UniqueName: \"kubernetes.io/projected/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-kube-api-access-rtdhk\") pod \"redhat-operators-lcrlq\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:51 crc kubenswrapper[4771]: I0220 00:18:51.636123 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:18:52 crc kubenswrapper[4771]: I0220 00:18:52.097213 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcrlq"] Feb 20 00:18:52 crc kubenswrapper[4771]: I0220 00:18:52.769123 4771 generic.go:334] "Generic (PLEG): container finished" podID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerID="ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044" exitCode=0 Feb 20 00:18:52 crc kubenswrapper[4771]: I0220 00:18:52.769352 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcrlq" event={"ID":"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7","Type":"ContainerDied","Data":"ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044"} Feb 20 00:18:52 crc kubenswrapper[4771]: I0220 00:18:52.769419 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcrlq" event={"ID":"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7","Type":"ContainerStarted","Data":"2415937ce35ccd6177ca3d94c48eb06d930ef34c10a2abbb302186c3a132fa2a"} Feb 20 00:18:54 crc kubenswrapper[4771]: I0220 00:18:54.791538 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcrlq" event={"ID":"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7","Type":"ContainerStarted","Data":"147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e"} Feb 20 00:18:57 crc kubenswrapper[4771]: I0220 00:18:57.842180 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerID="147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e" exitCode=0 Feb 20 00:18:57 crc kubenswrapper[4771]: I0220 00:18:57.842293 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcrlq" event={"ID":"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7","Type":"ContainerDied","Data":"147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e"} Feb 20 00:18:59 crc kubenswrapper[4771]: I0220 00:18:59.870452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcrlq" event={"ID":"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7","Type":"ContainerStarted","Data":"1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5"} Feb 20 00:18:59 crc kubenswrapper[4771]: I0220 00:18:59.919405 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lcrlq" podStartSLOduration=3.080183238 podStartE2EDuration="8.919377753s" podCreationTimestamp="2026-02-20 00:18:51 +0000 UTC" firstStartedPulling="2026-02-20 00:18:52.770939114 +0000 UTC m=+10233.042381584" lastFinishedPulling="2026-02-20 00:18:58.610133629 +0000 UTC m=+10238.881576099" observedRunningTime="2026-02-20 00:18:59.902894712 +0000 UTC m=+10240.174337232" watchObservedRunningTime="2026-02-20 00:18:59.919377753 +0000 UTC m=+10240.190820223" Feb 20 00:19:01 crc kubenswrapper[4771]: I0220 00:19:01.636638 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:19:01 crc kubenswrapper[4771]: I0220 00:19:01.637006 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:19:02 crc kubenswrapper[4771]: I0220 00:19:02.713710 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcrlq" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" 
containerName="registry-server" probeResult="failure" output=< Feb 20 00:19:02 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 20 00:19:02 crc kubenswrapper[4771]: > Feb 20 00:19:11 crc kubenswrapper[4771]: I0220 00:19:11.711152 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:19:11 crc kubenswrapper[4771]: I0220 00:19:11.803088 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:19:11 crc kubenswrapper[4771]: I0220 00:19:11.961845 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcrlq"] Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.041510 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lcrlq" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerName="registry-server" containerID="cri-o://1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5" gracePeriod=2 Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.686505 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.692348 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-utilities\") pod \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.692483 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-catalog-content\") pod \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.692615 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtdhk\" (UniqueName: \"kubernetes.io/projected/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-kube-api-access-rtdhk\") pod \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\" (UID: \"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7\") " Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.694177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-utilities" (OuterVolumeSpecName: "utilities") pod "bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" (UID: "bd21b476-92ce-4bc5-a305-d7bc76d6c2b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.706042 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-kube-api-access-rtdhk" (OuterVolumeSpecName: "kube-api-access-rtdhk") pod "bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" (UID: "bd21b476-92ce-4bc5-a305-d7bc76d6c2b7"). InnerVolumeSpecName "kube-api-access-rtdhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.795074 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.795137 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtdhk\" (UniqueName: \"kubernetes.io/projected/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-kube-api-access-rtdhk\") on node \"crc\" DevicePath \"\"" Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.861183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" (UID: "bd21b476-92ce-4bc5-a305-d7bc76d6c2b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:19:13 crc kubenswrapper[4771]: I0220 00:19:13.895798 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.059722 4771 generic.go:334] "Generic (PLEG): container finished" podID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerID="1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5" exitCode=0 Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.059788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcrlq" event={"ID":"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7","Type":"ContainerDied","Data":"1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5"} Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.060743 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-lcrlq" event={"ID":"bd21b476-92ce-4bc5-a305-d7bc76d6c2b7","Type":"ContainerDied","Data":"2415937ce35ccd6177ca3d94c48eb06d930ef34c10a2abbb302186c3a132fa2a"} Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.059855 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcrlq" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.060775 4771 scope.go:117] "RemoveContainer" containerID="1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.096554 4771 scope.go:117] "RemoveContainer" containerID="147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.106406 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcrlq"] Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.121012 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lcrlq"] Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.142666 4771 scope.go:117] "RemoveContainer" containerID="ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.208928 4771 scope.go:117] "RemoveContainer" containerID="1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5" Feb 20 00:19:14 crc kubenswrapper[4771]: E0220 00:19:14.209506 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5\": container with ID starting with 1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5 not found: ID does not exist" containerID="1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.209553 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5"} err="failed to get container status \"1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5\": rpc error: code = NotFound desc = could not find container \"1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5\": container with ID starting with 1bfe89667873cc6366e8589c4324cf00035217137182947f16c61c0a9eb3fae5 not found: ID does not exist" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.209587 4771 scope.go:117] "RemoveContainer" containerID="147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e" Feb 20 00:19:14 crc kubenswrapper[4771]: E0220 00:19:14.212542 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e\": container with ID starting with 147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e not found: ID does not exist" containerID="147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.212592 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e"} err="failed to get container status \"147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e\": rpc error: code = NotFound desc = could not find container \"147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e\": container with ID starting with 147d36fea69d662e7ac7b0c7ecd6fcc41865aca3b0f3e92edcba90db14974c5e not found: ID does not exist" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.212626 4771 scope.go:117] "RemoveContainer" containerID="ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044" Feb 20 00:19:14 crc kubenswrapper[4771]: E0220 
00:19:14.213188 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044\": container with ID starting with ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044 not found: ID does not exist" containerID="ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.213243 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044"} err="failed to get container status \"ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044\": rpc error: code = NotFound desc = could not find container \"ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044\": container with ID starting with ed9ae6c6faf21e4c473562559799dbcefe3231163fd104ece1eb6ad62ad91044 not found: ID does not exist" Feb 20 00:19:14 crc kubenswrapper[4771]: I0220 00:19:14.792173 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" path="/var/lib/kubelet/pods/bd21b476-92ce-4bc5-a305-d7bc76d6c2b7/volumes" Feb 20 00:19:34 crc kubenswrapper[4771]: I0220 00:19:34.336100 4771 generic.go:334] "Generic (PLEG): container finished" podID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerID="02cc2dc54b11c2872490f129d3b04cc2a35078329679a10105106dfe55256f40" exitCode=0 Feb 20 00:19:34 crc kubenswrapper[4771]: I0220 00:19:34.336232 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fxgqk/must-gather-58sl7" event={"ID":"ace391bd-2fb6-497e-bb5c-ea6135b23e6c","Type":"ContainerDied","Data":"02cc2dc54b11c2872490f129d3b04cc2a35078329679a10105106dfe55256f40"} Feb 20 00:19:34 crc kubenswrapper[4771]: I0220 00:19:34.337613 4771 scope.go:117] "RemoveContainer" 
containerID="02cc2dc54b11c2872490f129d3b04cc2a35078329679a10105106dfe55256f40" Feb 20 00:19:35 crc kubenswrapper[4771]: I0220 00:19:35.180533 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fxgqk_must-gather-58sl7_ace391bd-2fb6-497e-bb5c-ea6135b23e6c/gather/0.log" Feb 20 00:19:42 crc kubenswrapper[4771]: I0220 00:19:42.956730 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:19:42 crc kubenswrapper[4771]: I0220 00:19:42.957574 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.169476 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fxgqk/must-gather-58sl7"] Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.170105 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fxgqk/must-gather-58sl7" podUID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerName="copy" containerID="cri-o://8bfb0e0e715388ea61f0794c279367c9a6d205f3de29872a0b2177fc91486b8d" gracePeriod=2 Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.183604 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fxgqk/must-gather-58sl7"] Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.444523 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fxgqk_must-gather-58sl7_ace391bd-2fb6-497e-bb5c-ea6135b23e6c/copy/0.log" Feb 20 00:19:43 crc 
kubenswrapper[4771]: I0220 00:19:43.444899 4771 generic.go:334] "Generic (PLEG): container finished" podID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerID="8bfb0e0e715388ea61f0794c279367c9a6d205f3de29872a0b2177fc91486b8d" exitCode=143 Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.621640 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fxgqk_must-gather-58sl7_ace391bd-2fb6-497e-bb5c-ea6135b23e6c/copy/0.log" Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.622365 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.822819 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvbjk\" (UniqueName: \"kubernetes.io/projected/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-kube-api-access-dvbjk\") pod \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\" (UID: \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\") " Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.823017 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-must-gather-output\") pod \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\" (UID: \"ace391bd-2fb6-497e-bb5c-ea6135b23e6c\") " Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.830882 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-kube-api-access-dvbjk" (OuterVolumeSpecName: "kube-api-access-dvbjk") pod "ace391bd-2fb6-497e-bb5c-ea6135b23e6c" (UID: "ace391bd-2fb6-497e-bb5c-ea6135b23e6c"). InnerVolumeSpecName "kube-api-access-dvbjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:19:43 crc kubenswrapper[4771]: I0220 00:19:43.926148 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvbjk\" (UniqueName: \"kubernetes.io/projected/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-kube-api-access-dvbjk\") on node \"crc\" DevicePath \"\"" Feb 20 00:19:44 crc kubenswrapper[4771]: I0220 00:19:44.037592 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ace391bd-2fb6-497e-bb5c-ea6135b23e6c" (UID: "ace391bd-2fb6-497e-bb5c-ea6135b23e6c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:19:44 crc kubenswrapper[4771]: I0220 00:19:44.130704 4771 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ace391bd-2fb6-497e-bb5c-ea6135b23e6c-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 20 00:19:44 crc kubenswrapper[4771]: I0220 00:19:44.474586 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" path="/var/lib/kubelet/pods/ace391bd-2fb6-497e-bb5c-ea6135b23e6c/volumes" Feb 20 00:19:44 crc kubenswrapper[4771]: I0220 00:19:44.481659 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fxgqk_must-gather-58sl7_ace391bd-2fb6-497e-bb5c-ea6135b23e6c/copy/0.log" Feb 20 00:19:44 crc kubenswrapper[4771]: I0220 00:19:44.482216 4771 scope.go:117] "RemoveContainer" containerID="8bfb0e0e715388ea61f0794c279367c9a6d205f3de29872a0b2177fc91486b8d" Feb 20 00:19:44 crc kubenswrapper[4771]: I0220 00:19:44.482410 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fxgqk/must-gather-58sl7" Feb 20 00:19:44 crc kubenswrapper[4771]: I0220 00:19:44.509903 4771 scope.go:117] "RemoveContainer" containerID="02cc2dc54b11c2872490f129d3b04cc2a35078329679a10105106dfe55256f40" Feb 20 00:20:12 crc kubenswrapper[4771]: I0220 00:20:12.957071 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:20:12 crc kubenswrapper[4771]: I0220 00:20:12.957834 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:20:13 crc kubenswrapper[4771]: I0220 00:20:13.271647 4771 trace.go:236] Trace[2069423565]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (20-Feb-2026 00:20:12.251) (total time: 1018ms): Feb 20 00:20:13 crc kubenswrapper[4771]: Trace[2069423565]: [1.018770779s] [1.018770779s] END Feb 20 00:20:42 crc kubenswrapper[4771]: I0220 00:20:42.956861 4771 patch_prober.go:28] interesting pod/machine-config-daemon-q64kb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:20:42 crc kubenswrapper[4771]: I0220 00:20:42.957548 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:20:42 crc kubenswrapper[4771]: I0220 00:20:42.957610 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" Feb 20 00:20:42 crc kubenswrapper[4771]: I0220 00:20:42.958584 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"} pod="openshift-machine-config-operator/machine-config-daemon-q64kb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 00:20:42 crc kubenswrapper[4771]: I0220 00:20:42.958714 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerName="machine-config-daemon" containerID="cri-o://350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" gracePeriod=600 Feb 20 00:20:43 crc kubenswrapper[4771]: E0220 00:20:43.090173 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:20:43 crc kubenswrapper[4771]: I0220 00:20:43.226462 4771 generic.go:334] "Generic (PLEG): container finished" podID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" exitCode=0 Feb 20 00:20:43 crc kubenswrapper[4771]: I0220 00:20:43.226508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-q64kb" event={"ID":"db19bfc5-46dd-47ae-9608-aafec9e35f9e","Type":"ContainerDied","Data":"350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"} Feb 20 00:20:43 crc kubenswrapper[4771]: I0220 00:20:43.226550 4771 scope.go:117] "RemoveContainer" containerID="2411b3e172f053d34174b248400f1a0b44c39c7fcfb46f6cb68fc651f7234056" Feb 20 00:20:43 crc kubenswrapper[4771]: I0220 00:20:43.227279 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" Feb 20 00:20:43 crc kubenswrapper[4771]: E0220 00:20:43.227748 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:20:48 crc kubenswrapper[4771]: I0220 00:20:48.177338 4771 scope.go:117] "RemoveContainer" containerID="c626745622687bd52467a2c0d72ba695e9dc6a96067db0e02625940b8a260d15" Feb 20 00:20:48 crc kubenswrapper[4771]: I0220 00:20:48.221220 4771 scope.go:117] "RemoveContainer" containerID="76bb3c16c67f16a93792e85ca157937aa289ca3565fecfa9c8a0a35da0e0555a" Feb 20 00:20:48 crc kubenswrapper[4771]: I0220 00:20:48.296466 4771 scope.go:117] "RemoveContainer" containerID="2c47303cb48f290638d5a9861f30c08f23431a7a041ba53b3858a6506bc9f0bc" Feb 20 00:20:55 crc kubenswrapper[4771]: I0220 00:20:55.437869 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" Feb 20 00:20:55 crc kubenswrapper[4771]: E0220 00:20:55.438724 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:21:06 crc kubenswrapper[4771]: I0220 00:21:06.438897 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" Feb 20 00:21:06 crc kubenswrapper[4771]: E0220 00:21:06.440358 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:21:17 crc kubenswrapper[4771]: I0220 00:21:17.438439 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" Feb 20 00:21:17 crc kubenswrapper[4771]: E0220 00:21:17.439698 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:21:28 crc kubenswrapper[4771]: I0220 00:21:28.438371 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" Feb 20 00:21:28 crc kubenswrapper[4771]: E0220 00:21:28.439364 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:21:41 crc kubenswrapper[4771]: I0220 00:21:41.438496 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" Feb 20 00:21:41 crc kubenswrapper[4771]: E0220 00:21:41.439795 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:21:55 crc kubenswrapper[4771]: I0220 00:21:55.436968 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2" Feb 20 00:21:55 crc kubenswrapper[4771]: E0220 00:21:55.437718 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e" Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.939333 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dd7k4"] Feb 20 00:22:05 crc kubenswrapper[4771]: E0220 00:22:05.940884 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" 
containerName="extract-utilities"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.940922 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerName="extract-utilities"
Feb 20 00:22:05 crc kubenswrapper[4771]: E0220 00:22:05.940966 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerName="copy"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.940983 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerName="copy"
Feb 20 00:22:05 crc kubenswrapper[4771]: E0220 00:22:05.941015 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerName="registry-server"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.941072 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerName="registry-server"
Feb 20 00:22:05 crc kubenswrapper[4771]: E0220 00:22:05.941119 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerName="gather"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.941136 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerName="gather"
Feb 20 00:22:05 crc kubenswrapper[4771]: E0220 00:22:05.941181 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerName="extract-content"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.941198 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerName="extract-content"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.941638 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerName="copy"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.941716 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace391bd-2fb6-497e-bb5c-ea6135b23e6c" containerName="gather"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.941738 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd21b476-92ce-4bc5-a305-d7bc76d6c2b7" containerName="registry-server"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.944542 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:05 crc kubenswrapper[4771]: I0220 00:22:05.995961 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dd7k4"]
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.119842 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-utilities\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.120128 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg82r\" (UniqueName: \"kubernetes.io/projected/51e22239-f1f7-4464-aa3f-861589d68102-kube-api-access-kg82r\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.120545 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-catalog-content\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.222448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-catalog-content\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.222589 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-utilities\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.222635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg82r\" (UniqueName: \"kubernetes.io/projected/51e22239-f1f7-4464-aa3f-861589d68102-kube-api-access-kg82r\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.223212 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-catalog-content\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.223379 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-utilities\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.240963 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg82r\" (UniqueName: \"kubernetes.io/projected/51e22239-f1f7-4464-aa3f-861589d68102-kube-api-access-kg82r\") pod \"certified-operators-dd7k4\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") " pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.277823 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:06 crc kubenswrapper[4771]: I0220 00:22:06.824234 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dd7k4"]
Feb 20 00:22:07 crc kubenswrapper[4771]: I0220 00:22:07.350319 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd7k4" event={"ID":"51e22239-f1f7-4464-aa3f-861589d68102","Type":"ContainerStarted","Data":"3851a82943bc5849e62507b17b7d1d57a45a97f700fbe257f5db6042f6d67909"}
Feb 20 00:22:07 crc kubenswrapper[4771]: I0220 00:22:07.350597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd7k4" event={"ID":"51e22239-f1f7-4464-aa3f-861589d68102","Type":"ContainerStarted","Data":"fc562bea9467d393e2dbe8837fa247c61d281f3f8bb1c336d2f54f3237147014"}
Feb 20 00:22:07 crc kubenswrapper[4771]: I0220 00:22:07.353723 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 00:22:08 crc kubenswrapper[4771]: I0220 00:22:08.364552 4771 generic.go:334] "Generic (PLEG): container finished" podID="51e22239-f1f7-4464-aa3f-861589d68102" containerID="3851a82943bc5849e62507b17b7d1d57a45a97f700fbe257f5db6042f6d67909" exitCode=0
Feb 20 00:22:08 crc kubenswrapper[4771]: I0220 00:22:08.364598 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd7k4" event={"ID":"51e22239-f1f7-4464-aa3f-861589d68102","Type":"ContainerDied","Data":"3851a82943bc5849e62507b17b7d1d57a45a97f700fbe257f5db6042f6d67909"}
Feb 20 00:22:08 crc kubenswrapper[4771]: I0220 00:22:08.365260 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd7k4" event={"ID":"51e22239-f1f7-4464-aa3f-861589d68102","Type":"ContainerStarted","Data":"261fee8e7da2e6a6f7b9201d46fcb074ea9ff104dad4fe9f62d8dcc54ced9df5"}
Feb 20 00:22:10 crc kubenswrapper[4771]: I0220 00:22:10.389036 4771 generic.go:334] "Generic (PLEG): container finished" podID="51e22239-f1f7-4464-aa3f-861589d68102" containerID="261fee8e7da2e6a6f7b9201d46fcb074ea9ff104dad4fe9f62d8dcc54ced9df5" exitCode=0
Feb 20 00:22:10 crc kubenswrapper[4771]: I0220 00:22:10.389370 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd7k4" event={"ID":"51e22239-f1f7-4464-aa3f-861589d68102","Type":"ContainerDied","Data":"261fee8e7da2e6a6f7b9201d46fcb074ea9ff104dad4fe9f62d8dcc54ced9df5"}
Feb 20 00:22:10 crc kubenswrapper[4771]: I0220 00:22:10.475139 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:22:10 crc kubenswrapper[4771]: E0220 00:22:10.475692 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:22:11 crc kubenswrapper[4771]: I0220 00:22:11.403285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd7k4" event={"ID":"51e22239-f1f7-4464-aa3f-861589d68102","Type":"ContainerStarted","Data":"03a637c07dc8ed703a97797a47f57de94270db86780ddd56f60ea21a5ca31894"}
Feb 20 00:22:11 crc kubenswrapper[4771]: I0220 00:22:11.438875 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dd7k4" podStartSLOduration=2.96663397 podStartE2EDuration="6.438856411s" podCreationTimestamp="2026-02-20 00:22:05 +0000 UTC" firstStartedPulling="2026-02-20 00:22:07.353430559 +0000 UTC m=+10427.624873029" lastFinishedPulling="2026-02-20 00:22:10.82565296 +0000 UTC m=+10431.097095470" observedRunningTime="2026-02-20 00:22:11.430039465 +0000 UTC m=+10431.701481955" watchObservedRunningTime="2026-02-20 00:22:11.438856411 +0000 UTC m=+10431.710298881"
Feb 20 00:22:16 crc kubenswrapper[4771]: I0220 00:22:16.278485 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:16 crc kubenswrapper[4771]: I0220 00:22:16.278798 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:16 crc kubenswrapper[4771]: I0220 00:22:16.329651 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:16 crc kubenswrapper[4771]: I0220 00:22:16.528343 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:16 crc kubenswrapper[4771]: I0220 00:22:16.914620 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dd7k4"]
Feb 20 00:22:18 crc kubenswrapper[4771]: I0220 00:22:18.483042 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dd7k4" podUID="51e22239-f1f7-4464-aa3f-861589d68102" containerName="registry-server" containerID="cri-o://03a637c07dc8ed703a97797a47f57de94270db86780ddd56f60ea21a5ca31894" gracePeriod=2
Feb 20 00:22:19 crc kubenswrapper[4771]: I0220 00:22:19.496084 4771 generic.go:334] "Generic (PLEG): container finished" podID="51e22239-f1f7-4464-aa3f-861589d68102" containerID="03a637c07dc8ed703a97797a47f57de94270db86780ddd56f60ea21a5ca31894" exitCode=0
Feb 20 00:22:19 crc kubenswrapper[4771]: I0220 00:22:19.496149 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd7k4" event={"ID":"51e22239-f1f7-4464-aa3f-861589d68102","Type":"ContainerDied","Data":"03a637c07dc8ed703a97797a47f57de94270db86780ddd56f60ea21a5ca31894"}
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.282277 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.448276 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-catalog-content\") pod \"51e22239-f1f7-4464-aa3f-861589d68102\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") "
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.448415 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-utilities\") pod \"51e22239-f1f7-4464-aa3f-861589d68102\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") "
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.448441 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg82r\" (UniqueName: \"kubernetes.io/projected/51e22239-f1f7-4464-aa3f-861589d68102-kube-api-access-kg82r\") pod \"51e22239-f1f7-4464-aa3f-861589d68102\" (UID: \"51e22239-f1f7-4464-aa3f-861589d68102\") "
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.450084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-utilities" (OuterVolumeSpecName: "utilities") pod "51e22239-f1f7-4464-aa3f-861589d68102" (UID: "51e22239-f1f7-4464-aa3f-861589d68102"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.461384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e22239-f1f7-4464-aa3f-861589d68102-kube-api-access-kg82r" (OuterVolumeSpecName: "kube-api-access-kg82r") pod "51e22239-f1f7-4464-aa3f-861589d68102" (UID: "51e22239-f1f7-4464-aa3f-861589d68102"). InnerVolumeSpecName "kube-api-access-kg82r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.507712 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd7k4" event={"ID":"51e22239-f1f7-4464-aa3f-861589d68102","Type":"ContainerDied","Data":"fc562bea9467d393e2dbe8837fa247c61d281f3f8bb1c336d2f54f3237147014"}
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.507767 4771 scope.go:117] "RemoveContainer" containerID="03a637c07dc8ed703a97797a47f57de94270db86780ddd56f60ea21a5ca31894"
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.507902 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd7k4"
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.510653 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51e22239-f1f7-4464-aa3f-861589d68102" (UID: "51e22239-f1f7-4464-aa3f-861589d68102"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.537155 4771 scope.go:117] "RemoveContainer" containerID="261fee8e7da2e6a6f7b9201d46fcb074ea9ff104dad4fe9f62d8dcc54ced9df5"
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.550912 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.551174 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg82r\" (UniqueName: \"kubernetes.io/projected/51e22239-f1f7-4464-aa3f-861589d68102-kube-api-access-kg82r\") on node \"crc\" DevicePath \"\""
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.551183 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51e22239-f1f7-4464-aa3f-861589d68102-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.560333 4771 scope.go:117] "RemoveContainer" containerID="3851a82943bc5849e62507b17b7d1d57a45a97f700fbe257f5db6042f6d67909"
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.851072 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dd7k4"]
Feb 20 00:22:20 crc kubenswrapper[4771]: I0220 00:22:20.864396 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dd7k4"]
Feb 20 00:22:21 crc kubenswrapper[4771]: I0220 00:22:21.437527 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:22:21 crc kubenswrapper[4771]: E0220 00:22:21.438114 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:22:22 crc kubenswrapper[4771]: I0220 00:22:22.458779 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e22239-f1f7-4464-aa3f-861589d68102" path="/var/lib/kubelet/pods/51e22239-f1f7-4464-aa3f-861589d68102/volumes"
Feb 20 00:22:32 crc kubenswrapper[4771]: I0220 00:22:32.437653 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:22:32 crc kubenswrapper[4771]: E0220 00:22:32.438543 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:22:45 crc kubenswrapper[4771]: I0220 00:22:45.437849 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:22:45 crc kubenswrapper[4771]: E0220 00:22:45.438959 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:22:56 crc kubenswrapper[4771]: I0220 00:22:56.437386 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:22:56 crc kubenswrapper[4771]: E0220 00:22:56.438386 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:23:09 crc kubenswrapper[4771]: I0220 00:23:09.439334 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:23:09 crc kubenswrapper[4771]: E0220 00:23:09.440778 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:23:21 crc kubenswrapper[4771]: I0220 00:23:21.438554 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:23:21 crc kubenswrapper[4771]: E0220 00:23:21.439407 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:23:33 crc kubenswrapper[4771]: I0220 00:23:33.437577 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:23:33 crc kubenswrapper[4771]: E0220 00:23:33.439362 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:23:44 crc kubenswrapper[4771]: I0220 00:23:44.437303 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:23:44 crc kubenswrapper[4771]: E0220 00:23:44.438265 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"
Feb 20 00:23:57 crc kubenswrapper[4771]: I0220 00:23:57.437680 4771 scope.go:117] "RemoveContainer" containerID="350871e11e34529eae13130981b192052225749cc8d20519cd9109da2cb84ae2"
Feb 20 00:23:57 crc kubenswrapper[4771]: E0220 00:23:57.438347 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-q64kb_openshift-machine-config-operator(db19bfc5-46dd-47ae-9608-aafec9e35f9e)\"" pod="openshift-machine-config-operator/machine-config-daemon-q64kb" podUID="db19bfc5-46dd-47ae-9608-aafec9e35f9e"